Further Reading, Other Developments, and Coming Events (26 October)

Further Reading

  •  “Google is giving data to police based on search keywords, court docs show” By Alfred Ng — c|net. Google is responding to keyword warrants, in which prosecutors ask the company to provide the IP addresses of everyone who ran a certain search within a geographical area during a certain time. In the case discussed in the piece (bizarrely, the witness intimidation of someone testifying against R. Kelly), a keyword warrant allowed investigators to locate a person who may have burned down a witness’s house. This warrant will likely be challenged on Fourth Amendment grounds.
  • “Google AI Tech Will Be Used for Virtual Border Wall, CBP Contract Shows” By Lee Fang and Sam Biddle — The Intercept. Google may again be wading into territory its workforce finds objectionable. United States (U.S.) Customs and Border Protection (CBP) will use Google Cloud in its artificial intelligence-driven virtual fence on the U.S.-Mexico border. This may result in employee pushback, as happened in 2018 when internal pressure caused Google to walk away from a Department of Defense program, Project Maven. A whistleblower group ferreted out the fact that Google is contracting with CBP, which took some effort considering Google appears to be a subcontractor to a prime contractor.
  • “Facebook Manipulated the News You See to Appease Republicans, Insiders Say” By Monika Bauerlein and Clara Jeffery — Mother Jones. In January 2018, Facebook changed its algorithm to try to address the growing toxicity during and after the 2016 election. The supposed solution was to deprioritize untrustworthy information. However, the original test of the new algorithm ended up demoting many conservative sources that traffic in misinformation and slanted stories. This was deemed unacceptable from a political point of view, and the algorithm was adjusted in the opposite direction; as a result, a number of liberal media organizations saw their traffic drop off a cliff.
  • “Why A Gamer Started A Web Of Disinformation Sites Aimed At Latino Americans” By Kaleigh Rogers and Jaime Longoria — FiveThirtyEight. The reason a gamer and YouTuber started fake news sites aimed at Latinos was profit, nothing else.
  • “Twitter and White House deny claims that researcher hacked Trump’s account” By Adi Robertson — The Verge. A Dutch researcher claims the password “maga2020” got him into President Donald Trump’s Twitter account, though the White House and Twitter both deny it. There is a bizarre tweet Trump sent earlier this month that may, in fact, be the work of this researcher; in any event, he is being coy about whether he sent it.

Other Developments

  • The United Kingdom’s Information Commissioner’s Office (ICO) reduced its fine on British Airways (BA) to roughly a ninth of the preliminary total for violations of the General Data Protection Regulation (GDPR). The ICO levied a £20 million fine on the airline “for failing to protect the personal and financial details of more than 400,000 of its customers.” In July 2019, the ICO had issued a notice of its intention to fine British Airways £183.39 million because the “[p]ersonal data of approximately 500,000 customers were compromised.” After BA made its case, the ICO reduced the fine to £30 million before knocking off another £10 million because of mitigating factors and a British government policy to ease up on businesses during the pandemic. Conceivably, the fine could have been much higher, for the GDPR allows fines of up to 4% of worldwide revenue, and for the period in question BA had £12.26 billion in revenue (a rough calculation of that ceiling appears after the ICO’s findings below). The ICO explained:
    • The attacker is believed to have potentially accessed the personal data of approximately 429,612 customers and staff. This included names, addresses, payment card numbers and CVV numbers of 244,000 BA customers.
    • Other details thought to have been accessed include the combined card and CVV numbers of 77,000 customers and card numbers only for 108,000 customers.
    • Usernames and passwords of BA employee and administrator accounts as well as usernames and PINs of up to 612 BA Executive Club accounts were also potentially accessed.
    • The ICO found:
      • There were numerous measures BA could have used to mitigate or prevent the risk of an attacker being able to access the BA network. These include:
        • limiting access to applications, data and tools to only that which are required to fulfil a user’s role
        • undertaking rigorous testing, in the form of simulating a cyber-attack, on the business’ systems;
        • protecting employee and third party accounts with multi-factor authentication.
      • Additional mitigating measures BA could have used are listed in the penalty notice.
      • None of these measures would have entailed excessive cost or technical barriers, with some available through the Microsoft Operating System used by BA.
      • Since the attack, BA has made considerable improvements to its IT security.
      • ICO investigators found that BA did not detect the attack on 22 June 2018 themselves but were alerted by a third party more than two months afterwards, on 5 September. Once it became aware, BA acted promptly and notified the ICO.
      • It is not clear whether or when BA would have identified the attack themselves. This was considered to be a severe failing because of the number of people affected and because any potential financial harm could have been more significant.
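For context, here is a rough, back-of-the-envelope sketch (in Python, using only the figures cited above) of how far the final penalty sits below the GDPR's theoretical ceiling of 4% of worldwide revenue. The numbers come from the ICO's statements; the calculation is illustrative only.

```python
# Illustrative only: a rough calculation of the GDPR's theoretical maximum fine for BA
# using the figures cited above (the GDPR caps the highest tier of fines at the greater
# of EUR 20 million or 4% of total worldwide annual turnover).

ba_worldwide_revenue_gbp = 12.26e9   # BA revenue for the period in question, per the ICO
gdpr_cap_rate = 0.04                 # 4% of worldwide annual turnover

theoretical_max_fine = gdpr_cap_rate * ba_worldwide_revenue_gbp
notice_of_intent_fine = 183.39e6     # July 2019 notice of intention to fine
final_fine = 20e6                    # final penalty

print(f"Theoretical GDPR maximum: £{theoretical_max_fine / 1e6:,.1f}m")   # ≈ £490.4m
print(f"Notice of intention:      £{notice_of_intent_fine / 1e6:,.1f}m")  # £183.4m
print(f"Final penalty:            £{final_fine / 1e6:,.1f}m")             # roughly a ninth of the notice
```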
  • The Congressionally created Cyberspace Solarium Commission (CSC) issued a white paper, “Building a Trusted ICT Supply Chain,” with its assessment of why the United States (U.S.) no longer has a thriving technological industrial base and how it might rebuild one, nothing less than a matter of signal importance considering the growing dominance of the People’s Republic of China (PRC) in those fields. With this white paper, the CSC becomes another player in U.S. government policy circles proposing how the U.S. may protect its information and communications technology (ICT) supply chain against sabotage, malice, or control by an adversarial power.
    • The CSC claimed:
      • The United States lacks key industrial capacities crucial to the production of essential technologies, including fifth-generation (5G) telecommunications equipment. Among other factors, the willingness of countries such as China to subsidize and support their domestic industries has created the uneven playing field that hinders the competitiveness and, ultimately, the viability of U.S. companies in global markets. The resulting lack of industrial capacity has forced critical dependencies on companies that manufacture in adversary countries, such as China, where companies are beholden to Chinese national intelligence, national cybersecurity, and national security laws. While dependency on foreign production and foreign goods is not inherently bad—indeed, the United States relies on manufacturing and companies headquartered in partner countries such as Finland, Sweden, South Korea, and Taiwan—the U.S. government must emphasize the importance of trusted suppliers, and these dependencies pose three concrete risks to the security of the United States.
    • The CSC explained why fostering a supply chain for ICT in the U.S. will not be easy:
      • Three main challenges confront attempts to rebuild U.S. high-tech manufacturing capacity: (1) lack of patient funding capital, (2) high investment barriers to entry, and (3) standards and intellectual property barriers to entry. These challenges arise from the simple fact that the economics of the hardware industry are not as attractive as those of many other technology sectors. One of the major shortcomings of U.S. efforts to date to secure ICT supply chains is their failure to address how the United States got to this point, where ICT equipment manufacturing and production is a critical economic weakness. In order to craft an effective strategy to rebuild high-tech manufacturing and gain greater industrial independence, policymakers must first understand the challenges to reinvigorating the United States’ high-tech manufacturing industry. Only then can they comprehend why market forces have pushed U.S. high-tech industrial capacity to atrophy over the past two decades and recognize the issues that they must tackle in developing an industrial base strategy.
      • None of these barriers are insurmountable, but the reality is that the United States has lost much of its market share for the manufacture of electronics components and nearly all of its market share for the manufacture and assembly of finished electronics products. Nonetheless, a U.S. strategy to secure its ICT supply chain from all threats must include a plan to identify the key technologies and materials, and then attract more patient investment in hardware manufacturing, devise a method to retrain the atrophied muscles of production, and set the conditions to overcome barriers to entry posed by the constraints of standards and intellectual property.
    • The CSC “specifies a strategy to build trusted supply chains for critical ICT by:
      • Identifying key technologies and equipment through government reviews and public-private partnerships to identify risk.
      • Ensuring minimum viable manufacturing capacity through both strategic investment and the creation of economic clusters.
      • Protecting supply chains from compromise through better intelligence, information sharing, and product testing.
      • Stimulating a domestic market through targeted infrastructure investment and ensuring the ability of firms to offer products in the United States similar to those offered in foreign markets.
      • Ensuring global competitiveness of trusted supply chains, including American and partner companies, in the face of Chinese anti-competitive behavior in global markets.
    • The CSC also highlighted “five key and eight supporting recommendations to build trusted supply chains for critical ICT technologies:
      • Supply Chain 1: Congress should direct the executive branch to develop and implement an information and communication technologies industrial base strategy.
      • Supply Chain 2: Congress should direct the Department of Homeland Security, in coordination with the Department of Commerce, Department of Defense, Department of State, and other departments and agencies, to identify key information and communication technologies and materials through industry consultation and government review.
      • Supply Chain 3: Congress should direct the Department of Commerce, in consultation with the Department of Homeland Security, the Department of State, and the Department of Defense, to conduct a viability study of localities fit for economic clustering. It should fund the Department of Commerce, in consultation with the Department of Homeland Security, Department of State, and Department of Defense, to solicit competitive bids and applications from candidate states, municipalities, and localities for the designation of no fewer than three and no more than five critical technology manufacturing clusters.
        • Supply Chain 3.1: The federal government should commit significant and consistent funding toward research and development in emerging technologies.
        • Supply Chain 3.2: The federal government should, in partnership with partner and ally governments, develop programs to incentivize the movement of critical chip and technology manufacturing out of China.
        • Supply Chain 3.3: Congress should direct the President to conduct a study on the viability of a public-private national security investment corporation to attract private capital for investment in strategically important areas.
      • Supply Chain 4: The President should designate a lead agency to integrate and coordinate government ICT supply chain risk management efforts into an ongoing national strategy and to serve as the nexus for public-private partnerships on supply chain risk management.
        • Supply Chain 4.1: Congress should direct the President to construct or designate a National Supply Chain Intelligence Center.
        • Supply Chain 4.2: Congress should fund three Critical Technology Security Centers, selected and designated by DHS, in collaboration with the Department of Commerce, Department of Energy, Office of the Director of National Intelligence (ODNI), and Department of Defense.
      • Supply Chain 5: The Federal Communications Commission (FCC) should tie 5G infrastructure investment to open and interoperable standards and work with the Department of Defense and the National Telecommunications and Information Administration to facilitate the release of more mid-band spectrum in order to ensure a strong domestic market for telecommunications equipment.
        • Supply Chain 5.1: The U.S. Agency for International Development (USAID) should work with international partners to develop a digital risk impact assessment that highlights the risks associated with the use of untrusted technologies in implementing digitization and telecommunications infrastructure projects.
        • Supply Chain 5.2: Congress should ensure that the Export-Import Bank (EXIM), U.S. International Development Finance Corporation (DFC), and United States Trade Development Agency (USTDA) all operate in legal, regulatory, and funding environments conducive to successfully competing with Chinese state-owned and state-backed enterprises, including their ability to support investments from companies headquartered in partner and ally countries.
        • Supply Chain 5.3: USAID, DFC, and USTDA should develop and maintain a list of prohibited contractors and clients, including companies subject to the Chinese national security and national intelligence laws, that may not be used to implement USAID-, DFC-, and USTDA-funded projects.
  • The Federal Trade Commission (FTC) has reportedly met to review its anti-trust case against Facebook, which could be filed as soon as next month. The FTC started looking into Facebook’s dominance in the social messaging market at about the same time it handed down a $5 billion fine for the tech giant’s involvement with Cambridge Analytica, conduct that violated the 2012 consent decree. The anti-trust investigation is reportedly focused on Facebook’s acquisitions of WhatsApp and Instagram, two of the world’s largest messaging platforms, and the effects of buying these two potential competitors; if the FTC succeeds in a suit against Facebook, the company may be forced to spin off those two entities. Moreover, New York Attorney General Letitia James is leading a state investigation of Facebook that “focuses on Facebook’s dominance in the industry and the potential anticompetitive conduct stemming from that dominance.” That inquiry started over a year ago, and the timing of any possible action is not clear. The European Commission is also reportedly looking at Facebook for anti-trust violations, as media accounts indicated in late 2019.
    • The House Judiciary Committee argued in its recent report on competition in digital markets that “the strong network effects associated with Facebook has tipped the market toward monopoly such that Facebook competes more vigorously among its own products—Facebook, Instagram, WhatsApp, and Messenger—than with actual competitors.” In response to the House Judiciary Committee’s view on these deals, a Facebook spokesperson claimed “[a] strongly competitive landscape existed at the time of both acquisitions and exists today…[and] [r]egulators thoroughly reviewed each deal and rightly did not see any reason to stop them at the time.”
    • In February 2019, the German agency with jurisdiction over competition issued a decision that potentially could block Facebook from combining the personal data of Germans from other Facebook-owned entities such as Instagram and WhatsApp or from unrelated third-party sources. According to the Bundeskartellamt’s press release, the agency “has imposed on Facebook far-reaching restrictions in the processing of user data.”
  • A group of nations is proposing a third way to bridge the dual efforts of two United Nations (U.N.) groups to develop cyber norms. In “The future of discussions on ICTs and cyberspace at the UN,” these nations propose to “explore establishment of a Programme of Action for advancing responsible State behaviour in cyberspace with a view to ending the dual track discussions (GGE/OEWG) and establishing a permanent UN forum to consider the use of ICTs by States in the context of international security.” They stressed “the urgent need for the international community to address the use of ICTs in the context of international peace and security.” France, Egypt, Argentina, Colombia, Ecuador, Gabon, Georgia, Japan, Morocco, Norway, El Salvador, Singapore, the Republic of Korea, the Republic of Moldova, the Republic of North Macedonia, the United Kingdom, and the EU and its member States submitted the proposal.
    • These nations argued:
      • Since 2018, two working groups and many initiatives have started under the auspices of the UN. We welcome the willingness of the international community to engage, and recognize that each of those initiatives has its own merits and specificities. Yet, they aim at tackling the same issues: advancing norms of responsible behaviour, understanding how international law concretely applies to cyberspace, developing CBMs and fostering capacity building. We consider that this situation, although evidencing the growing commitment of the international community to dedicating time and resources to the matters at hand, creates redundancies and, at times, can be counter-productive. It is therefore a cause for concern.
      • In the fall of 2019, the U.N. Group of Governmental Experts (GGE) and the U.N. Open-ended Working Group (OEWG) started meeting per U.N. resolutions to further consultative discussions on an international agreement or set of agreements on what is considered acceptable and unacceptable cyber practices. Previous efforts largely stalled over disagreements between a bloc led by the U.S. and its allies and nations like the People’s Republic of China (PRC), Russia, and others with a different view on acceptable practices. Notably, unlike 2010, 2013 and 2015, the 2017 U.N. GGE could not reach agreement on additional voluntary, non-binding norms on how nations should operate in cyberspace. The OEWG was advocated for by countries like Russia, the PRC, and others seen as being in opposition to some of the views propagated by the U.S. and its allies, notably on the issue of what kind of measures a nation may use inside its borders to limit internet usage for its citizens.
      • As explained in a 2018 U.N. press release, competing resolutions were offered to create groups “aimed at shaping norm-setting guidelines for States to ensure responsible conduct in cyberspace:”
        • the draft resolution “Developments in the field of information and telecommunications in the context of international security” (document A/C.1/73/L.27.Rev.1), tabled by the Russian Federation.  By the text, the Assembly would decide to convene in 2019 an open-ended working group acting on a consensus basis to further develop the rules, norms and principles of responsible behaviour of States.
        • the draft resolution “Advancing Responsible State Behaviour in Cyberspace in the Context of International Security” (document A/C.1/73/L.37), tabled by the United States…[that] would request the Secretary-General, with the assistance of a group of governmental experts to be established in 2019, to continue to study possible cooperative measures to address existing and potential threats in the sphere of information security, including norms, rules and principles of responsible behaviour of States.
  • The United Kingdom’s Information Commissioner’s Office (ICO) published its compulsory audit of the Department for Education (DfE), which found:
    • The audit found that data protection was not being prioritised and this had severely impacted the DfE’s ability to comply with the UK’s data protection laws. A total of 139 recommendations for improvement were found, with over 60% classified as urgent or high priority.
    • The ICO explained:
      • The Commissioner’s Enforcement team ran a broad-ranging investigation in 2019 following complaints from DefendDigitalMe and Liberty and their concerns around the National Pupil Database (NPD). The ICO met with key senior level data protection professionals at the DfE’s offices in London in November 2019, where the possibilities of a consensual audit were discussed. However, due to the risks associated with the volume and types of personal data processed within the NPD as well as the ages of the data subjects involved, the Commissioner decided, in line with her own Regulatory Action Policy, to undertake a compulsory audit using her powers under section 146 of the DPA18. The Commissioner determined this approach would provide a comprehensive review of DfE data protection practices, governance and other key control measures supporting the NPD and internally held databases, using the framework of scope areas of audit as listed below. This would allow the Commissioner to identify any risk associated with the data processed and implications to the individual rights of over 21 million data subjects.
  • The European Commission (EC) announced it “made commitments offered by [United States firm] Broadcom legally binding under EU antitrust rules.” The EC started looking into the company in mid-2019 for supposedly abusive behavior that was harming competitors and customers in the TV set-top box and modem chipset markets in the European Union.
    • The EC explained:
      • In June 2019, the Commission initiated proceedings into alleged abuse of dominance by Broadcom and at the same time issued a Statement of Objections seeking the imposition of interim measures. In October 2019, the Commission took a decision concluding that interim measures were necessary to prevent serious and irreparable damage to competition from occurring in the worldwide markets for SoCs for (i) TV set-top boxes, (ii) xDSL modems, (iii) fibre modems, as well as (iv) cable modems.
      • The Commission took issue with certain exclusivity or quasi-exclusivity and leveraging arrangements imposed by Broadcom in relation to SoCs for TV set top boxes, xDSL and fibre modems. The decision ordered Broadcom to stop applying these provisions contained in agreements with six of its main customers and ordered the implementation of interim measures applicable for a period of three years.
    • The EC asserted Broadcom has agreed to the following:
      • At European Economic Area (EEA) level, Broadcom will:
        • a) Not require or induce by means of price or non-price advantages an OEM to obtain any minimum percentage of its EEA requirements for SoCs for TV set-top boxes, xDSL modems and fibre modems from Broadcom; and
        • b) Not condition the supply of, or the granting of advantages for, SoCs for TV set-top boxes, xDSL modems and fibre modems on an OEM obtaining from Broadcom another of these products or any other product within the scope of the commitments (i.e. SoCs for cable modems, Front End Chips for set-top boxes and modems and/or Wi-Fi Chips for set-top boxes and modems).
      • At worldwide level (excluding China), Broadcom will:
        • a) Not require or induce an OEM by means of certain types of advantages to obtain more than 50% of its requirements for SoCs for TV set-top boxes, xDSL modems and fibre modems from Broadcom; and
        • b) Not condition the supply of, or the granting of advantages for, SoCs for TV set-top boxes, xDSL modems and fibre modems on an OEM obtaining from Broadcom more than 50% of its requirements for any other of these products, or for other products within the scope of the commitments.
      • The commitments also include specific provisions regarding incentives to bid equipment based on Broadcom products as well as certain additional clauses with regard to service providers in the EEA.

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released its agenda:
    • Restoring Internet Freedom Order Remand. The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America. The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces. The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications. The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option. The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets. The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements. The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including the nomination of Nathan Simington to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (22 October)

Further Reading

  •  “A deepfake porn Telegram bot is being used to abuse thousands of women” By Matt Burgess — WIRED UK. A bot set loose on Telegram can take pictures of women, and apparently teens too, and “take off” their clothing, rendering fake nude images of people who never took such pictures. This seems to be the next iteration of deepfake porn, a problem that will surely get worse until governments legislate against it and technology companies have incentives to locate and take down such material.
  • “The Facebook-Twitter-Trump Wars Are Actually About Something Else” By Charlie Warzel — The New York Times. This piece makes the case that there are no easy fixes for American democracy or for misinformation on social media platforms.
  • “Facebook says it rejected 2.2m ads for breaking political campaigning rules” — Agence France-Presse. Facebook’s Vice President of Global Affairs and Communications Nick Clegg said the social media giant is employing artificial intelligence and humans to find and remove political advertisements that violate policy, in order to avoid a repeat of 2016, when untrue information and misinformation played roles in both Brexit and the election of Donald Trump as President of the United States.
  • “Huawei Fallout—Game-Changing New China Threat Strikes At Apple And Samsung” By Zak Doffman — Forbes. Smartphone manufacturers from the People’s Republic of China (PRC) appear ready to step into the projected void caused by the United States (U.S.) strangling off Huawei’s access to chips. Xiaomi and Oppo have already seen sales surge worldwide and are poised to pick up where Huawei is being forced to leave off, perhaps demonstrating the limits of U.S. power to blunt the rise of PRC technology companies.
  • “As Local News Dies, a Pay-for-Play Network Rises in Its Place” By Davey Alba and Jack Nicas — The New York Times. With the decline and demise of many local media outlets in the United States, new groups are stepping into the void, and some are politically minded but not transparent about their biases. The organization uncovered in this article is nakedly Republican and is running and planting articles at both legitimate and artificial news sites for pay. Sometimes conservative donors pay, sometimes campaigns do. Democrats are engaged in the same activity but apparently to a lesser extent. These sorts of activities will only further erode faith in the U.S. media.
  • “Forget Antitrust Laws. To Limit Tech, Some Say a New Regulator Is Needed.” By Steve Lohr — The New York Times. This piece argues that anti-trust enforcement actions are plodding, tending to take years to finish, and that this body of law is consequently inadequate to the task of addressing the market dominance of big technology companies. Instead, a new regulatory body more nimble than anti-trust enforcement is needed, along the lines of those regulating the financial services industry. Given the problems in that industry with respect to regulation, this may not be the best model.
  • “‘Do Not Track’ Is Back, and This Time It Might Work” By Gilad Edelman — WIRED. Looking to utilize the requirement in the “California Consumer Privacy Act” (CCPA) (AB 375) that regulated entities respect and effectuate the use of a one-time opt-out mechanism, a group of entities has come together to build and roll out the Global Privacy Control. In theory, users could install this technical specification on their phones and computers, set it once, and then all websites would be on notice regarding their privacy preferences (a minimal sketch of how a site might read such a signal follows this reading list). Such a mechanism would address the problem turned up by Consumer Reports’ recent report on the difficulty of trying to opt out of having one’s personal information sold.
  • “EU countries sound alarm about growing anti-5G movement” By Laurens Cerulus — Politico. Fifteen European Union (EU) nations wrote the European Commission (EC) warning that the nascent anti-5G movement, borne of conspiracy thinking and misinformation, threatens the EU’s position vis-à-vis the United States (U.S.) and the People’s Republic of China (PRC). There have been more than 200 documented arson attacks in the EU, with the most having occurred in the United Kingdom, France, and the Netherlands. These nations called for a more muscular, more forceful debunking of the lies and misinformation being spread about 5G.
  • “Security firms call Microsoft’s effort to disrupt botnet to protect against election interference ineffective” By Jay Greene — The Washington Post. Microsoft seemingly acted alongside United States (U.S.) Cyber Command to take down and impair the operation of Trickbot, but now cybersecurity experts are questioning how effective Microsoft’s efforts really were. Researchers have shown the Russian-operated Trickbot has already stood its operations back up and dispersed across servers around the world, showing how difficult it is to address some cyber threats.
  • “Governments around the globe find ways to abuse Facebook” By Sara Fischer and Ashley Gold — Axios. This piece puts a different spin than a recent BuzzFeed News article on the challenges Facebook faces in countries around the world, especially those whose governments ruthlessly use the platform to spread lies and misinformation. The new article paints Facebook as a well-meaning company being taken advantage of, while the other portrays a company callous about content moderation except in nations where it causes political problems, such as the United States, the European Union, and other western democracies.
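As referenced in the “Do Not Track” item above, here is a minimal sketch of how a website might honor the Global Privacy Control signal. It assumes, per the publicly posted GPC specification, that the preference arrives as a "Sec-GPC: 1" request header; Flask is used only for illustration, and the route and field names are hypothetical.

```python
# A minimal, illustrative sketch of honoring the Global Privacy Control (GPC) signal.
# Assumption: the user's preference arrives as a "Sec-GPC: 1" request header, per the
# public GPC specification. Flask, the route, and the response fields are hypothetical.
from flask import Flask, request, jsonify

app = Flask(__name__)

def gpc_opt_out_requested() -> bool:
    """Treat a 'Sec-GPC: 1' header as a do-not-sell/do-not-share preference."""
    return request.headers.get("Sec-GPC", "").strip() == "1"

@app.route("/article")
def article():
    if gpc_opt_out_requested():
        # Hypothetical handling: suppress third-party ad/tracker tags and record the
        # opt-out so downstream "sale" of personal information is blocked.
        return jsonify(content="...", personalized_ads=False)
    return jsonify(content="...", personalized_ads=True)

if __name__ == "__main__":
    app.run(port=8080)
```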

Other Developments

  • The United States (U.S.) Department of Justice’s (DOJ) Cyber-Digital Task Force (Task Force) issued “Cryptocurrency: An Enforcement Framework,” that “provides a comprehensive overview of the emerging threats and enforcement challenges associated with the increasing prevalence and use of cryptocurrency; details the important relationships that the Department of Justice has built with regulatory and enforcement partners both within the United States government and around the world; and outlines the Department’s response strategies.” The Task Force noted “[t]his document does not contain any new binding legal requirements not otherwise already imposed by statute or regulation.” The Task Force summarized the report:
    • [I]n Part I, the Framework provides a detailed threat overview, cataloging the three categories into which most illicit uses of cryptocurrency typically fall: (1) financial transactions associated with the commission of crimes; (2) money laundering and the shielding of legitimate activity from tax, reporting, or other legal requirements; and (3) crimes, such as theft, directly implicating the cryptocurrency marketplace itself. 
    • Part II explores the various legal and regulatory tools at the government’s disposal to confront the threats posed by cryptocurrency’s illicit uses, and highlights the strong and growing partnership between the Department of Justice and the Securities and Exchange Commission, the Commodity Futures Trading Commission, and agencies within the Department of the Treasury, among others, to enforce federal law in the cryptocurrency space.
    • Finally, the Enforcement Framework concludes in Part III with a discussion of the ongoing challenges the government faces in cryptocurrency enforcement—particularly with respect to business models (employed by certain cryptocurrency exchanges, platforms, kiosks, and casinos), and to activity (like “mixing” and “tumbling,” “chain hopping,” and certain instances of jurisdictional arbitrage) that may facilitate criminal activity.    
  • The White House’s Office of Science and Technology Policy (OSTP) has launched a new website for the United States’ (U.S.) quantum initiative and released a report titled “Quantum Frontiers: Report On Community Input To The Nation’s Strategy For Quantum Information Science.” The Quantum Initiative flows from the “National Quantum Initiative Act” (P.L. 115-368) “to provide for a coordinated Federal program to accelerate quantum research and development for the economic and national security of the United States.” The OSTP explained that the report “outlines eight frontiers that contain core problems with fundamental questions confronting quantum information science (QIS) today:”
    • Expanding Opportunities for Quantum Technologies to Benefit Society
    • Building the Discipline of Quantum Engineering
    • Targeting Materials Science for Quantum Technologies
    • Exploring Quantum Mechanics through Quantum Simulations
    • Harnessing Quantum Information Technology for Precision Measurements
    • Generating and Distributing Quantum Entanglement for New Applications
    • Characterizing and Mitigating Quantum Errors
    • Understanding the Universe through Quantum Information
    • OSTP asserted “[t]hese frontier areas, identified by the QIS research community, are priorities for the government, private sector, and academia to explore in order to drive breakthrough R&D.”
  • The New York Department of Financial Services (NYDFS) published its report on the July 2020 Twitter hack during which a team of hackers took over a number of high-profile accounts (e.g. Barack Obama, Kim Kardashian West, Jeff Bezos, and Elon Musk) in order to perpetrate a cryptocurrency scam. The NYDFS has jurisdiction over cryptocurrencies and the companies dealing in them in New York. The NYDFS found that the hackers used the most basic of means to gain the access needed to take over accounts. The NYDFS explained:
    • Given that Twitter is a publicly traded, $37 billion technology company, it was surprising how easily the Hackers were able to penetrate Twitter’s network and gain access to internal tools allowing them to take over any Twitter user’s account. Indeed, the Hackers used basic techniques more akin to those of a traditional scam artist: phone calls where they pretended to be from Twitter’s Information Technology department. The extraordinary access the Hackers obtained with this simple technique underscores Twitter’s cybersecurity vulnerability and the potential for devastating consequences. Notably, the Twitter Hack did not involve any of the high-tech or sophisticated techniques often used in cyberattacks–no malware, no exploits, and no backdoors.
    • The implications of the Twitter Hack extend far beyond this garden-variety fraud. There are well-documented examples of social media being used to manipulate markets and interfere with elections, often with the simple use of a single compromised account or a group of fake accounts. In the hands of a dangerous adversary, the same access obtained by the Hackers–the ability to take control of any Twitter users’ account–could cause even greater harm.
    • The Twitter Hack demonstrates the need for strong cybersecurity to curb the potential weaponization of major social media companies. But our public institutions have not caught up to the new challenges posed by social media. While policymakers focus on antitrust and content moderation problems with large social media companies, their cybersecurity is also critical. In other industries that are deemed critical infrastructure, such as telecommunications, utilities, and finance, we have established regulators and regulations to ensure that the public interest is protected. With respect to cybersecurity, that is what is needed for large, systemically important social media companies.
    • The NYDFS recommended cybersecurity measures that cryptocurrency companies should implement to avoid similar hacks, pointing to its own cybersecurity regulations that bind its regulated entities in New York. The NYDFS also called for a national regulator to address the lack of a dedicated regulator for Twitter and other massive social media platforms. The NYDFS asserted:
      • Social media companies currently have no dedicated regulator. They are subject to the same general oversight applicable to other companies. For instance, the SEC’s regulations for all public companies apply to public social media companies, and antitrust and related laws and regulations enforced by the Department of Justice and the FTC apply to social media companies as they do to all companies. Social media companies are also subject to generally applicable laws, such as the California Consumer Privacy Act and the New York SHIELD Act. The European Union’s General Data Protection Regulation, which regulates the storage and use of personal data, also applies to social media entities doing business in Europe.
      • But there are no regulators that have the authority to uniformly regulate social media platforms that operate over the internet, and to address the cybersecurity concerns identified in this Report. That regulatory vacuum must be filled.
      • A useful starting point is to create a “systemically important” designation for large social media companies, like the designation for critically important bank and non-bank financial institutions. In the wake of the 2007-08 financial crisis, Congress established a new regulatory framework for financial institutions that posed a systemic threat to the financial system of the United States. An institution could be designated as a Systemically Important Financial Institution (“SIFI”) “where the failure of or a disruption to the functioning of a financial market utility or the conduct of a payment, clearing, or settlement activity could create, or increase, the risk of significant liquidity or credit problems spreading among financial institutions or markets and thereby threaten the stability of the financial system of the United States.”
      • The risks posed by social media to our consumers, economy, and democracy are no less grave than the risks posed by large financial institutions. The scale and reach of these companies, combined with the ability of adversarial actors who can manipulate these systems, require a similarly bold and assertive regulatory approach.
      • The designation of an institution as a SIFI is made by the Financial Stability Oversight Council (“FSOC”), which Congress established to “identify risks to the financial stability of the United States” and to provide enhanced supervision of SIFIs. The FSOC also “monitors regulatory gaps and overlaps to identify emerging sources of systemic risk.” In determining whether a financial institution is systemically important, the FSOC considers numerous factors including: the effect that a failure or disruption to an institution would have on financial markets and the broader financial system; the nature of the institution’s transactions and relationships; the nature, concentration, interconnectedness, and mix of the institution’s activities; and the degree to which the institution is regulated.
      • An analogue to the FSOC should be established to identify systemically important social media companies. This new Oversight Council should evaluate the reach and impact of social media companies, as well as the society-wide consequences of a social media platform’s misuse, to determine which companies they should designate as systemically important. Once designated, those companies should be subject to enhanced regulation, such as through the provision of “stress tests” to evaluate the social media companies’ susceptibility to key threats, including cyberattacks and election interference.
      • Finally, the success of such oversight will depend on the establishment of an expert agency to oversee designated social media companies. Systemically important financial companies designated by the FSOC are overseen by the Federal Reserve Board, which has a long-established and deep expertise in banking and financial market stability. A regulator for systemically important social media would likewise need deep expertise in areas such as technology, cybersecurity, and disinformation. This expert regulator could take various forms; it could be a completely new agency or could reside within an established agency or at an existing regulator.
  • The Government Accountability Office (GAO) evaluated how well the Trump Administration has been implementing the “Open, Public, Electronic and Necessary Government Data Act of 2018” (OPEN Government Data Act) (P.L. 115-435). As the GAO explained, this statute “requires federal agencies to publish their information as open data using standardized, nonproprietary formats, making data available to the public open by default, unless otherwise exempt…[and] codifies and expands on existing federal open data policy including the Office of Management and Budget’s (OMB) memorandum M-13-13 (M-13-13), Open Data Policy—Managing Information as an Asset.” A sketch of what a machine-readable data inventory entry can look like follows this item.
    • The GAO stated:
      • To continue moving forward with open government data, the issuance of OMB implementation guidance should help agencies develop comprehensive inventories of their data assets, prioritize data assets for publication, and decide which data assets should or should not be made available to the public.
      • Implementation of this statutory requirement is critical to agencies’ full implementation and compliance with the act. In the absence of this guidance, agencies, particularly agencies that have not previously been subject to open data policies, could fall behind in meeting their statutory timeline for implementing comprehensive data inventories.
      • It is also important for OMB to meet its statutory responsibility to biennially report on agencies’ performance and compliance with the OPEN Government Data Act and to coordinate with General Services Administration (GSA) to improve the quality and availability of agency performance data that could inform this reporting. Access to this information could inform Congress and the public on agencies’ progress in opening their data and complying with statutory requirements. This information could also help agencies assess their progress and improve compliance with the act.
    • The GAO made three recommendations:
      • The Director of OMB should comply with its statutory requirement to issue implementation guidance to agencies to develop and maintain comprehensive data inventories. (Recommendation 1)
      • The Director of OMB should comply with the statutory requirement to electronically publish a report on agencies’ performance and compliance with the OPEN Government Data Act. (Recommendation 2)
      • The Director of OMB, in collaboration with the Administrator of GSA, should establish policy to ensure the routine identification and correction of errors in electronically published performance information. (Recommendation 3)
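As noted above, one way agencies satisfy open-data and inventory requirements is by publishing machine-readable metadata about their data assets. The sketch below builds a single, hypothetical inventory entry; the field names follow my recollection of the DCAT-US/Project Open Data metadata schema commonly used for agency data.json files and should be treated as assumptions, not as a rendering of OMB's forthcoming implementation guidance.

```python
# A hypothetical, minimal data-inventory entry in the style of the DCAT-US /
# Project Open Data "data.json" metadata schema that agencies commonly publish.
# Field names and values are illustrative assumptions, not official guidance.
import json

entry = {
    "title": "Example Agency Grant Awards, FY2020",
    "description": "Hypothetical dataset listing grant awards made in fiscal year 2020.",
    "keyword": ["grants", "awards", "open data"],
    "modified": "2020-10-01",
    "publisher": {"name": "Example Agency"},
    "contactPoint": {"fn": "Open Data Coordinator", "hasEmail": "mailto:opendata@example.gov"},
    "identifier": "example-agency-grants-fy2020",
    "accessLevel": "public",  # OPEN Government Data Act: open by default unless exempt
    "license": "https://creativecommons.org/publicdomain/zero/1.0/",
    "distribution": [
        {"downloadURL": "https://example.gov/data/grants-fy2020.csv", "mediaType": "text/csv"}
    ],
}

# An agency inventory is a catalog of many such entries.
catalog = {"dataset": [entry]}
print(json.dumps(catalog, indent=2))
```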
  • The United States’ (U.S.) National Security Agency (NSA) issued a cybersecurity advisory titled “Chinese State-Sponsored Actors Exploit Publicly Known Vulnerabilities,” that “provides Common Vulnerabilities and Exposures (CVEs) known to be recently leveraged, or scanned-for, by Chinese state-sponsored cyber actors to enable successful hacking operations against a multitude of victim networks.” The NSA recommended a number of mitigations generally for U.S. entities, including:
    • Keep systems and products updated and patched as soon as possible after patches are released.
    • Expect that data stolen or modified (including credentials, accounts, and software) before the device was patched will not be alleviated by patching, making password changes and reviews of accounts a good practice.
    • Disable external management capabilities and set up an out-of-band management network.
    • Block obsolete or unused protocols at the network edge and disable them in device configurations.
    • Isolate Internet-facing services in a network Demilitarized Zone (DMZ) to reduce the exposure of the internal network.
    • Enable robust logging of Internet-facing services and monitor the logs for signs of compromise.
    • The NSA then proceeded to recommend specific fixes (an illustrative sketch of one of the general mitigations above appears after this item).
    • The NSA provided this policy backdrop:
      • One of the greatest threats to U.S. National Security Systems (NSS), the U.S. Defense Industrial Base (DIB), and Department of Defense (DOD) information networks is Chinese state-sponsored malicious cyber activity. These networks often undergo a full array of tactics and techniques used by Chinese state-sponsored cyber actors to exploit computer networks of interest that hold sensitive intellectual property, economic, political, and military information. Since these techniques include exploitation of publicly known vulnerabilities, it is critical that network defenders prioritize patching and mitigation efforts.
      • The same process for planning the exploitation of a computer network by any sophisticated cyber actor is used by Chinese state-sponsored hackers. They often first identify a target, gather technical information on the target, identify any vulnerabilities associated with the target, develop or re-use an exploit for those vulnerabilities, and then launch their exploitation operation.
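As flagged above, here is one illustrative way to operationalize the “block obsolete or unused protocols at the network edge” recommendation: a small Python sketch that checks whether legacy service ports are reachable on an Internet-facing host. The port list and target address are assumptions for demonstration; this is not NSA tooling, and it should only be run against systems you are authorized to test.

```python
# Illustrative companion to the "block obsolete or unused protocols" mitigation:
# check an Internet-facing host for legacy services that commonly should not be exposed.
# The port list and target address below are assumptions for demonstration only.
import socket

LEGACY_SERVICES = {
    21: "FTP",
    23: "Telnet",
    69: "TFTP",
    139: "NetBIOS",
    445: "SMB",
    3389: "RDP (should sit behind VPN/out-of-band management)",
}

def exposed_legacy_ports(host: str, timeout: float = 1.0) -> list[tuple[int, str]]:
    findings = []
    for port, name in LEGACY_SERVICES.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the TCP connection succeeded
                findings.append((port, name))
    return findings

if __name__ == "__main__":
    host = "192.0.2.10"  # hypothetical address from the TEST-NET-1 documentation range
    for port, name in exposed_legacy_ports(host):
        print(f"{host}:{port} ({name}) is reachable; consider blocking it at the edge")
```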
  • Belgium’s data protection authority (DPA) (Autorité de protection des données in French or Gegevensbeschermingsautoriteit in Dutch) (APD-GBA) has reportedly found that the Transparency & Consent Framework (TCF) developed by the Interactive Advertising Bureau (IAB) violates the General Data Protection Regulation (GDPR). The Real-Time Bidding (RTB) system used for online behavioral advertising allegedly transmits the personal information of European Union residents without their consent even before a popup appears on their screens asking for consent. The APD-GBA is the lead DPA in the EU investigating RTB and will likely now circulate its findings and recommendations to other EU DPAs before any enforcement commences.
  • None Of Your Business (noyb) announced “[t]he Irish High Court has granted leave for a “Judicial Review” against the Irish Data Protection Commission (DPC) today…[and] [t]he legal action by noyb aims to swiftly implement the [Court of Justice of the European Union (CJEU)] Decision prohibiting Facebook’s” transfer of personal data from the European Union to the United States (U.S.). Last month, after the DPC directed Facebook to stop transferring the personal data of EU citizens to the U.S., the company filed suit in the Irish High Court to stop enforcement of the order and succeeded in staying the matter until the court rules on the merits of the challenge.
    • noyb further asserted:
      • Instead of making a decision in the pending procedure, the DPC has started a second, new investigation into the same subject matter (“Parallel Procedure”), as widely reported (see original reporting by the WSJ). No logical reasons for the Parallel Procedure was given, but the DPC has maintained that Mr Schrems will not be heard in this second case, as he is not a party in this Parallel Procedure. This Parallel Procedure was criticised by Facebook publicly (link) and instantly blocked by a Judicial Review by Facebook (see report by Reuters).
      • Today’s Judicial Review by noyb is in many ways the counterpart to Facebook’s Judicial Review: While Facebook wants to block the second procedure by the DPC, noyb wants to move the original complaints procedure towards a decision.
      • Earlier this summer, the CJEU struck down the adequacy decision for the agreement between the EU and U.S. that had provided the easiest means to transfer the personal data of EU citizens to the U.S. for processing under the General Data Protection Regulation (GDPR) (i.e. the EU-U.S. Privacy Shield). In the case known as Schrems II, the CJEU also cast doubt on whether standard contractual clauses (SCC) used to transfer personal data to the U.S. would pass muster given the grounds for finding the Privacy Shield inadequate: the U.S.’s surveillance regime and lack of meaningful redress for EU citizens. Consequently, it has appeared as if data protection authorities throughout the EU would need to revisit SCCs for transfers to the U.S., and it appears the DPC was looking to stop Facebook from using its SCCs. Facebook is apparently arguing in its suit that it will suffer “extremely significant adverse effects” if the DPC’s decision is implemented.
  • Most likely with the aim of helping British chances for an adequacy decision from the European Union (EU), the United Kingdom’s Information Commissioner’s Office (ICO) published guidance that “discusses the right of access [under the General Data Protection Regulation (GDPR)] in detail.” The ICO explained the guidance “is aimed at data protection officers (DPOs) and those with specific data protection responsibilities in larger organisations…[but] does not specifically cover the right of access under Parts 3 and 4 of the Data Protection Act 2018.”
    • The ICO explained:
      • The right of access, commonly referred to as subject access, gives individuals the right to obtain a copy of their personal data from you, as well as other supplementary information.
  • The Government Accountability Office (GAO) released the report that the House Education and Labor Committee’s Ranking Member requested on the data security and data privacy practices of public schools. Representative Virginia Foxx (R-NC) asked the GAO “to review the security of K-12 students’ data. This report examines (1) what is known about recently reported K-12 cybersecurity incidents that compromised student data, and (2) the characteristics of school districts that experienced these incidents.” Strangely, the report did not have GAO’s customary conclusions or recommendations. Nonetheless, the GAO found:
    • Ninety-nine student data breaches reported from July 1, 2016 through May 5, 2020 compromised the data of students in 287 school districts across the country, according to our analysis of K-12 Cybersecurity Resource Center (CRC) data (see fig. 3). Some breaches involved a single school district, while others involved multiple districts. For example, an attack on a vendor system in the 2019-2020 school year affected 135 districts. While information about the number of students affected was not available for every reported breach, examples show that some breaches affected thousands of students, for instance, when a cybercriminal accessed 14,000 current and former students’ personally identifiable information (PII) in one district.
    • The 99 reported student data breaches likely understate the number of breaches that occurred, for different reasons. Reported incidents sometimes do not include sufficient information to discern whether data were breached. We identified 15 additional incidents in our analysis of CRC data in which student data might have been compromised, but the available information was not definitive. In addition, breaches can go undetected for some time. In one example, the personal information of hundreds of thousands of current and former students in one district was publicly posted for 2 years before the breach was discovered.
    • The CRC identified 28 incidents involving videoconferences from April 1, 2020 through May 5, 2020, some of which disrupted learning and exposed students to harm. In one incident, 50 elementary school students were exposed to pornography during a virtual class. In another incident in a different district, high school students were targeted with hate speech during a class, resulting in the cancellation that day of all classes using the videoconferencing software. These incidents also raise concerns about the potential for violating students’ privacy. For example, one district is reported to have instructed teachers to record their class sessions. Teachers said that students’ full names were visible to anyone viewing the recording.
    • The GAO found gaps in the protection and enforcement of student privacy by the United States government:
      • [The Department of] Education is responsible for enforcing Family Educational Rights and Privacy Act (FERPA), which addresses the privacy of PII in student education records and applies to all schools that receive funds under an applicable program administered by Education. If parents or eligible students believe that their rights under FERPA have been violated, they may file a formal complaint with Education. In response, Education is required to take appropriate actions to enforce and deal with violations of FERPA. However, because the department’s authority under FERPA is directly related to the privacy of education records, Education’s security role is limited to incidents involving potential violations under FERPA. Further, FERPA amendments have not directly addressed educational technology use.
      • The “Children’s Online Privacy Protection Act” (COPPA) requires the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s privacy. The COPPA Rule, which took effect in 2000 and was later amended in 2013, requires operators of covered websites or online services that collect personal information from children under age 13 to provide notice and obtain parental consent, among other things. COPPA generally applies to the vendors who provide educational technology, rather than to schools directly. However, according to FTC guidance, schools can consent on behalf of parents to the collection of students’ personal information if such information is used for a school-authorized educational purpose and for no other commercial purpose.
  • Upturn, an advocacy organization that “advances equity and justice in the design, governance, and use of technology,” has released a report showing that United States (U.S.) law enforcement agencies have multiple means of hacking into encrypted or protected smartphones. The means and vendors for breaking into phones have long been available in the U.S. and abroad, despite claims by nations like the Five Eyes (U.S., the United Kingdom, Australia, Canada, and New Zealand) that default end-to-end encryption is a growing problem allowing those preying on children or engaged in terrorism to go undetected. In terms of possible bias, Upturn is “supported by the Ford Foundation, the Open Society Foundations, the John D. and Catherine T. MacArthur Foundation, Luminate, the Patrick J. McGovern Foundation, and Democracy Fund.”
    • Upturn stated:
      • Every day, law enforcement agencies across the country search thousands of cellphones, typically incident to arrest. To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”
      • This report documents the widespread adoption of MDFTs by law enforcement in the United States. Based on 110 public records requests to state and local law enforcement agencies across the country, our research documents more than 2,000 agencies that have purchased these tools, in all 50 states and the District of Columbia. We found that state and local law enforcement agencies have performed hundreds of thousands of cellphone extractions since 2015, often without a warrant. To our knowledge, this is the first time that such records have been widely disclosed.
    • Upturn argued:
      • Law enforcement use these tools to investigate not only cases involving major harm, but also for graffiti, shoplifting, marijuana possession, prostitution, vandalism, car crashes, parole violations, petty theft, public intoxication, and the full gamut of drug-related offenses. Given how routine these searches are today, together with racist policing policies and practices, it’s more than likely that these technologies disparately affect and are used against communities of color.
      • We believe that MDFTs are simply too powerful in the hands of law enforcement and should not be used. But recognizing that MDFTs are already in widespread use across the country, we offer a set of preliminary recommendations that we believe can, in the short-term, help reduce the use of MDFTs. These include:
        • banning the use of consent searches of mobile devices,
        • abolishing the plain view exception for digital searches,
        • requiring easy-to-understand audit logs,
        • enacting robust data deletion and sealing requirements, and
        • requiring clear public logging of law enforcement use.

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including that of Nathan Simington to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Mehmet Turgut Kirkgoz from Pixabay

Further Reading, Other Developments, and Coming Events (6 October)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Operational IoT – 7 October at 15:00 to 16:30 CET
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, but the agenda has not yet been announced.
  • On October 29, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that a “malicious cyber actor” had penetrated an unnamed federal agency and “implanted sophisticated malware—including multi-stage malware that evaded the affected agency’s anti-malware protection—and gained persistent access through two reverse Socket Secure (SOCKS) proxies that exploited weaknesses in the agency’s firewall.” Since CISA said it became aware of the penetration via EINSTEIN, it is likely a civilian agency that was compromised. The actor used “compromised credentials” to get into the agency, but “CISA analysts were not able to determine how the cyber threat actor initially obtained the credentials.” It is not clear whether this is a nation state or sophisticated hackers working independently.
    • It should be noted that last month, the Department of Veterans Affairs (VA) revealed it had been breached and “the personal information of approximately 46,000 Veterans” had been compromised. This announcement came the same day as an advisory issued by CISA that Chinese Ministry of State Security (MSS)-affiliated cyber threat actors have been targeting and possibly penetrating United States (U.S.) agency networks.
  • Senators Ron Wyden (D-OR) and Jeff Merkley (D-OR) and Representatives Earl Blumenauer (D-OR) and Suzanne Bonamici (D-OR) wrote the Department of Homeland Security (DHS) regarding a report in The Nation alleging the DHS and Department of Justice (DOJ) surveilled the phones of protestors in Portland, Oregon in possible violation of United States (U.S.) law. These Members asked DHS to respond to the following questions by October 9:
    • During a July 23, 2020, briefing for Senate intelligence committee staff, Brian Murphy, then the Acting Under Secretary for Intelligence and Analysis (I&A), stated that DHS I&A had neither collected nor exploited or analyzed information obtained from the devices or accounts of protesters or detainees. On July 31, 2020, Senator Wyden and six other Senators on the Senate Select Committee on Intelligence wrote to Mr. Murphy to confirm the statement he had made to committee staff. DHS has yet to respond to that letter. Please confirm whether or not Mr. Murphy’s statement during the July 23, 2020, briefing was accurate at the time, and if it is still accurate.
    • Has DHS, whether directly, or with the assistance of any other government agency, obtained or analyzed data collected through the surveillance of protesters’ phones, including tracking their locations or intercepting communications content or metadata? If yes, for each phone that was surveilled, did the government obtain prior authorization from a judge before conducting this surveillance?
    • Has DHS used commercial data sources, including open source intelligence products, to investigate, identify, or track protesters or conduct network analysis? If yes, please identify each commercial data source used by DHS, describe the information DHS obtained, how DHS used it, whether it was subsequently shared with any other government agency, and whether DHS sought and obtained authorization from a court before querying the data source.
  • The National Cybersecurity Center of Excellence (NCCoE) at the National Institute of Standards and Technology (NIST) has published for comment “Securing Data Integrity Against Ransomware Attacks: Using the NIST Cybersecurity Framework and NIST Cybersecurity Practice Guides,” which “provides an overview of [NCCoE and NIST’s] Data Integrity projects…a high-level explanation of the architecture and capabilities, and how these projects can be brought together into one comprehensive data integrity solution…[that] can then be integrated into a larger security picture to address all of an organization’s data security needs.” Comments are due by 13 November. (A brief, purely illustrative sketch of hash-based integrity checking appears at the end of this Other Developments section.) NCCoE and NIST explained:
    • This guide is designed for organizations that are not currently experiencing a loss of data integrity event (ransomware or otherwise). This document prepares an organization to adequately address future data integrity events. For information on dealing with a current attack, please explore guidance from organizations like the Federal Bureau of Investigation, the United States Secret Service, or other pertinent groups or government bodies.
    • Successful ransomware impacts data’s integrity, yet ransomware is just one of many potential vectors through which an organization could suffer a loss of data integrity. Integrity is part of the CIA security triad which encompasses Confidentiality, Integrity, and Availability. As the CIA triad is applied to data security, data integrity is defined as “the property that data has not been changed, destroyed, or lost in an unauthorized or accidental manner.” An attack against data integrity can cause corruption, modification, and/or destruction of the data which ultimately results in a loss in trust in the data.
  • As referenced in media reports, Graphika released a report on a newly discovered Russian disinformation effort that led to the creation and propagation of propaganda to appeal to the right wing in the United States (U.S.). In “Step into My Parler: Suspected Russian Operation Targeted Far-Right American Users on Platforms Including Gab and Parler, Resembled Recent IRA-Linked Operation that Targeted Progressives,” Graphika explained:
    • Russian operators ran a far-right website and social media accounts that targeted American users with pro-Trump and anti-Biden messaging, according to information from Reuters and Graphika’s investigation. This included the first known Russian activity on the platforms Gab and Parler. The operation appeared connected to a recent Russian website that targeted progressives in America with anti-Biden messaging.
    • The far-right “Newsroom for American and European Based Citizens,” naebc[.]com, pushed the opposite end of the political spectrum from the ostensibly progressive PeaceData site, but the two assets showed such a strong family resemblance that they appear to be two halves of the same operation. Both ran fake editorial personas whose profile pictures were generated by artificial intelligence; both claimed to be young news outlets based in Europe; both made language errors consistent with Russian speakers; both tried to hire freelance writers to provide their content; and, oddly enough, both had names that translate to obscenities in Russian.
    • Reuters first tipped Graphika off to the existence of the NAEBC website and its likely relationship to PeaceData. U.S. law enforcement originally alerted the social media platforms to the existence of PeaceData. On September 1, Facebook attributed PeaceData to “individuals associated with past activity by the Russian Internet Research Agency (IRA).” Twitter attributed it to Russian state actors. Social media platforms (Facebook, Twitter, LinkedIn) have taken similar action to stop activity related to NAEBC on their platforms. To date, Parler and Gab have not taken action on their platforms.
  • The Cybersecurity and Infrastructure Security Agency (CISA) and Multi-State Information Sharing and Analysis Center (MS-ISAC) issued a joint Ransomware Guide “meant to be a one-stop resource for stakeholders on how to be proactive and prevent these attacks from happening and also a detailed approach on how to respond to an attack and best resolve the cyber incident.” The organizations explained:
    • First, the guide focuses on best practices for ransomware prevention, detailing practices that organizations should continuously do to help manage the risk posed by ransomware and other cyber threats. It is intended to enable forward-leaning actions to successfully thwart and confront malicious cyber activity associated with ransomware. Some of the several CISA and MS-ISAC preventive services that are listed are Malicious Domain Blocking and Reporting, regional CISA Cybersecurity Advisors, Phishing Campaign Assessment, and MS-ISAC Security Primers on ransomware variants such as Ryuk.
    • The second part of this guide, response best practices and services, is divided up into three sections: (1) Detection and Analysis, (2) Containment and Eradication, and (3) Recovery and Post-Incident Activity. One of the unique aspects that will significantly help an organization’s leadership as well as IT professional with response is a comprehensive, step-by-step checklist. With many technical details on response actions and lists of CISA and MS-ISAC services available to the incident response team, this part of the guide can enable a methodical, measured and properly managed approach.  
  • The Government Accountability Office (GAO) released a guide on best practices for agile software development for federal agencies and contracting officers. The GAO stated:
    • The federal government spends at least $90 billion annually on information technology (IT) investments. In our January 2019 High Risk List report, GAO reported on 35 high risk areas, including the management of IT acquisitions and operations. While the executive branch has undertaken numerous initiatives to help agencies better manage their IT investments, these programs frequently fail or incur cost overruns and schedule slippages while contributing little to mission-related outcomes.
    • GAO has found that the Office of Management and Budget (OMB) continues to demonstrate its leadership commitment by issuing guidance for covered departments and agencies to implement statutory provisions commonly referred to as the Federal Information Technology Acquisition Reform Act (FITARA). However, application of FITARA at federal agencies has not been fully implemented. For example, as we stated in the 2019 High Risk report, none of the 24 major federal agencies had IT management policies that fully addressed the roles of their Chief Information Officers (CIO) consistent with federal laws and guidance.
    • This Agile Guide is intended to address generally accepted best practices for Agile adoption, execution, and control. In this guide, we use the term best practice to be consistent with the use of the term in GAO’s series of best practices guides.
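
A note on the data integrity concept in the NCCoE/NIST item above: one common building block for detecting unauthorized change is comparing cryptographic hashes of files against a previously recorded baseline. The Python sketch below shows that pattern under simple, assumed conditions (a baseline stored as a JSON file of SHA-256 digests); it is purely illustrative and is not drawn from the NIST practice guides themselves.

    import hashlib
    import json
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def build_baseline(directory: Path, baseline_file: Path) -> None:
        """Record the current digest of every file under a known-good directory."""
        baseline = {str(p): sha256_of(p) for p in directory.rglob("*") if p.is_file()}
        baseline_file.write_text(json.dumps(baseline, indent=2))

    def check_integrity(baseline_file: Path) -> list:
        """Return files whose contents no longer match the recorded baseline."""
        baseline = json.loads(baseline_file.read_text())
        changed = []
        for name, recorded in baseline.items():
            path = Path(name)
            if not path.exists() or sha256_of(path) != recorded:
                changed.append(name)
        return changed

In this sketch, the baseline would be built from a known-good copy of the data, stored where the protected data cannot overwrite it, and checked on a schedule; a sudden spike in changed files is one rough signal of the mass modification ransomware causes.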

Further Reading

  • “GOP lawmaker: Democrats’ tech proposals will include ‘non-starters for conservatives’” By Cristiano Lima — Politico. Representative Ken Buck (R-CO) is quoted extensively in this article about Republican concerns that the House Judiciary Committee’s antitrust recommendations may include policy changes he and other GOP Members of the committee will not be able to go along with. Things like banning mandatory arbitration clauses and changing evidentiary burdens (i.e. rolling back court decisions that have made antitrust actions harder to mount) are not acceptable to Republicans who apparently agree in the main that large technology companies do indeed have too much market power. Interestingly, Buck and others think the solution is more resources for the Department of Justice and the Federal Trade Commission (FTC), which is rapidly becoming a favored policy prescription for federal privacy legislation, too. However, even with a massive infusion of funding, the agencies could not act in all cases, and, in any event, would need to contend with a more conservative federal judiciary unlikely to change the antitrust precedents that have reduced the ability of these agencies to take action in the first place. Nonetheless, Republicans may join the report if the recommendations are changed. Of course, the top Republican on the committee, Representative Jim Jordan (R-OH), is allegedly pressuring Republicans not to join the report.
  • “Why Is Amazon Tracking Opioid Use All Over the United States?” By Lauren Kaori Gurley — Motherboard. The online shopping giant is apparently tracking a range of data related to opioid usage for reasons that are not entirely clear. To be fair, the company tracks all sorts of data.
  • “As QAnon grew, Facebook and Twitter missed years of warning signs about the conspiracy theory’s violent nature” By Craig Timberg and Elizabeth Dwoskin — The Washington Post. This article traces the history of how Facebook and Twitter opted not to act against QAnon while other platforms like Reddit did, quite possibly contributing to the rise and reach of the conspiracy. However, they were afraid of angering some on the right wing given the overlap between some QAnon supporters and some Trump supporters.
  • “Democratic Party leaders are “banging their head against the wall” after private meetings with Facebook on election misinformation” By Shirin Ghaffary — recode. Democratic officials who have been on calls with Facebook officials are saying the platform is not doing enough to combat disinformation and lies about the election. Facebook, of course, disputes this assessment. Democratic officials are especially concerned about the period between election day and when results are announced and think Facebook is not ready to handle the predicted wave of disinformation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Bermix Studio on Unsplash

Further Reading, Other Developments, and Coming Events (18 September)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” The agency has released its agenda and explained:
    • The workshop will also feature four panel discussions that will focus on: case studies on data portability rights in the European Union, India, and California; case studies on financial and health portability regimes; reconciling the benefits and risks of data portability; and the material challenges and solutions to realizing data portability’s potential.
  • The Senate Judiciary Committee’s Intellectual Property Subcommittee will hold a hearing on 23 September titled “Examining Threats to American Intellectual Property: Cyber-attacks and Counterfeits During the COVID-19 Pandemic” with these witnesses:
    • Adam Hickey, Deputy Assistant Attorney General National Security Division, Department of Justice
    • Clyde Wallace, Deputy Assistant Director Cyber Division, Federal Bureau of Investigation
    • Steve Francis, Assistant Director, HSI Global Trade Investigations Division Director, National Intellectual Property Rights Center, U.S. Immigration and Customs Enforcement, Department of Homeland Security
    • Bryan S. Ware, Assistant Director for Cybersecurity Cyber Security and Infrastructure Security Agency, Department of Homeland Security
  • On 23 September, the Commerce, Science, and Transportation Committee will hold a hearing titled “Revisiting the Need for Federal Data Privacy Legislation,” with these witnesses:
    • The Honorable Julie Brill, Former Commissioner, Federal Trade Commission
    • The Honorable William Kovacic, Former Chairman and Commissioner, Federal Trade Commission
    • The Honorable Jon Leibowitz, Former Chairman and Commissioner, Federal Trade Commission
    • The Honorable Maureen Ohlhausen, Former Commissioner and Acting Chairman, Federal Trade Commission
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 30 September titled “Oversight of the Enforcement of the Antitrust Laws” with Federal Trade Commission Chair Joseph Simons and United States Department of Justice Antitrust Division Assistant Attorney General Makan Delrahim.
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September and has made available its agenda with these items:
    • Facilitating Shared Use in the 3.1-3.55 GHz Band. The Commission will consider a Report and Order that would remove the existing non-federal allocations from the 3.3-3.55 GHz band as an important step toward making 100 megahertz of spectrum in the 3.45-3.55 GHz band available for commercial use, including 5G, throughout the contiguous United States. The Commission will also consider a Further Notice of Proposed Rulemaking that would propose to add a co-primary, non-federal fixed and mobile (except aeronautical mobile) allocation to the 3.45-3.55 GHz band as well as service, technical, and competitive bidding rules for flexible-use licenses in the band. (WT Docket No. 19-348)
    • Expanding Access to and Investment in the 4.9 GHz Band. The Commission will consider a Sixth Report and Order that would expand access to and investment in the 4.9 GHz (4940-4990 MHz) band by providing states the opportunity to lease this spectrum to commercial entities, electric utilities, and others for both public safety and non-public safety purposes. The Commission also will consider a Seventh Further Notice of Proposed Rulemaking that would propose a new set of licensing rules and seek comment on ways to further facilitate access to and investment in the band. (WP Docket No. 07-100)
    • Improving Transparency and Timeliness of Foreign Ownership Review Process. The Commission will consider a Report and Order that would improve the timeliness and transparency of the process by which it seeks the views of Executive Branch agencies on any national security, law enforcement, foreign policy, and trade policy concerns related to certain applications filed with the Commission. (IB Docket No. 16-155)
    • Promoting Caller ID Authentication to Combat Spoofed Robocalls. The Commission will consider a Report and Order that would continue its work to implement the TRACED Act and promote the deployment of caller ID authentication technology to combat spoofed robocalls. (WC Docket No. 17-97)
    • Combating 911 Fee Diversion. The Commission will consider a Notice of Inquiry that would seek comment on ways to dissuade states and territories from diverting fees collected for 911 to other purposes. (PS Docket Nos. 20-291, 09-14)
    • Modernizing Cable Service Change Notifications. The Commission will consider a Report and Order that would modernize requirements for notices cable operators must provide subscribers and local franchising authorities. (MB Docket Nos. 19-347, 17-105)
    • Eliminating Records Requirements for Cable Operator Interests in Video Programming. The Commission will consider a Report and Order that would eliminate the requirement that cable operators maintain records in their online public inspection files regarding the nature and extent of their attributable interests in video programming services. (MB Docket No. 20-35, 17-105)
    • Reforming IP Captioned Telephone Service Rates and Service Standards. The Commission will consider a Report and Order, Order on Reconsideration, and Further Notice of Proposed Rulemaking that would set compensation rates for Internet Protocol Captioned Telephone Service (IP CTS), deny reconsideration of previously set IP CTS compensation rates, and propose service quality and performance measurement standards for captioned telephone services. (CG Docket Nos. 13-24, 03-123)
    • Enforcement Item. The Commission will consider an enforcement action.

Other Developments

  • Former Principal Deputy Under Secretary in the Office of Intelligence and Analysis Brian Murphy has filed a whistleblower reprisal complaint against the United States Department of Homeland Security (DHS), alleging he was punished for providing intelligence analysis the Trump White House and DHS did not want, mainly for political reasons, and for refusing to alter that analysis to fit the Administration’s chosen narrative on issues, especially the Russian Federation’s interference in the 2020 Election. Murphy alleges “he was retaliatorily demoted to the role of Assistant to the Deputy Under Secretary for the DHS Management Division” because he refused to comply with orders from acting Secretary of Homeland Security Chad Wolf. Specifically, he claims:
    • In mid-May 2020, Mr. Wolf instructed Mr. Murphy to cease providing intelligence assessments on the threat of Russian interference in the United States, and instead start reporting on interference activities by China and Iran. Mr. Wolf stated that these instructions specifically originated from White House National Security Advisor Robert O’Brien. Mr. Murphy informed Mr. Wolf he would not comply with these instructions, as doing so would put the country in substantial and specific danger.
  • The National Security Agency (NSA) Office of the Inspector General (OIG) issued an unclassified version of its Semiannual Report to Congress consisting of “the audits, evaluations, inspections, and investigations that were completed and ongoing” from 1 October 2019 to 31 March 2020.
    • The OIG found ongoing problems with how the NSA is administering surveillance of United States persons overseas (i.e., Sections 704 and 705 of the Foreign Intelligence Surveillance Act), something that has been a long running problem at the agency. The OIG found:
      • NSA does not have adequate and complete documentation of scenario-based data tagging rules for accurately assigning data labels to restrict access to data in accordance with legal and policy requirements, and consistently assessing data labeling errors;
      • NSA has not designated a standardized field in NSA data tags to efficiently store and identify data needed to verify the accuracy of data label assignments;
      • NSA does not document in its targeting tool a majority of a certain type of targeting request; and
      • NSA controls do not adequately and completely verify the accuracy of data labels assigned to data prior to ingest into NSA repositories.
      • As a result of these findings, the OIG made seven recommendations, six to assist NSA in strengthening its corporate data tagging controls and governance, and a seventh to help ensure that NSA’s FISA §§704 and 705(b) data tagging legal and policy determinations are consistent with NSA representations made to the FISC and other external overseers regarding how NSA handles such data, and that these tagging requirements are fully documented and promulgated to the NSA workforce.
    • The OIG noted the middling progress the NSA has made in securing its information technology, a weakness that could well be used by adversaries to penetrate the agency’s networks:
      • In accordance with U.S. Office of Management and Budget guidance, the OIG is required annually to assess the effectiveness of information security programs on a maturity model spectrum, which ranges from Level 1 (ad hoc) to Level 5 (optimized). Our assessment of eight IT security areas revealed that while progress was made in some areas from FY2018 to FY2019, there continues to be room for improvement in all eight IT security areas.
      • For the second consecutive year, Identity and Access Management was deemed the strongest security area with an overall maturity level of 3, consistently implemented. The Agency’s challenges in Security Training dropped the maturity level from 3, consistently implemented, to 2, defined. For the second consecutive year, Contingency Planning was assessed at an overall maturity level of ad hoc; although the Agency has made some improvements to the program, additional improvements need to be made.
  • The Office of the Director of National Intelligence (ODNI) released a June 2020 Foreign Intelligence Surveillance Court (FISC) opinion that sets the limits on using information gained from electronic surveillance of former Trump campaign adviser Carter Page.
    • FISC noted
      • The government has acknowledged that at least some of its collection under color of those FISC orders was unlawful. It nevertheless now contends that it must temporarily retain, and potentially use and disclose, the information collected, largely in the context of ongoing or anticipated litigation. The Court hereby sets parameters for such use or disclosure.
    • The FISC ordered:
      • (1) With regard to the third-party FOIA litigation, see supra pp. 9-10, and the pending litigation with Page, see supra p. 12, the government may use or disclose Page FISA information insofar as necessary for the good-faith conduct of that litigation;
      • (2) With regard to any future claims brought by Page seeking redress for unlawful electronic surveillance or physical search or for disclosure of the results of such surveillance or search, the government may use or disclose Page FISA information insofar as necessary to the good-faith conduct of the litigation of such claims;
      • (3) Further use or disclosure of Page FISA information is permitted insofar as necessary to effective performance or disciplinary reviews of government personnel, provided that any such use or disclosure of raw information is permitted only insofar as a particular need to use or disclose the specific information at issue has been demonstrated. This paragraph applies, but is not limited to, use by, and disclosure by or to, the FBI’s INSD or OPR;
      • (4) Further use or disclosure of Page FISA information by DOJ OIG is permitted only insofar as necessary to assess the implementation of Recommendation 9 of the OIG Report;
      • (5) Further use or disclosure of Page FISA information is permitted only insofar as necessary to investigate or prosecute potential crimes relating to the conduct of the Page or Crossfire Hurricane investigations, provided that any such use or disclosure of raw information is permitted only insofar as a particular need to use or disclose the specific information at issue has been demonstrated. This paragraph applies, but is not limited to, use by, and disclosure by or to, personnel engaged in the review being led by United States Attorney Durham. See supra p. 17; and
      • (6) By January 29, 2021, and at intervals of no more than six months thereafter, the government shall submit under oath a written report on the retention, and any use or disclosure, of Page FISA information
  • Portland, Oregon has passed bans on the use of facial recognition technology (FRT) by its government agencies and by private entities, which are being characterized as the most stringent such measures in the United States. Effective immediately, no city agency may use FRT, and come 1 January 2021, no private company may do so. In contrast, FRT bans in Boston, San Francisco, and Oakland bar only government entities from using the technology. However, Portland residents will still be permitted to use FRT; for example, using FRT to unlock one’s phone would remain legal. The legislation explains:
    • The purpose of this Chapter is to prohibit the use of Face Recognition Technologies in Places of Public Accommodation by Private Entities within the boundaries of the City of Portland.
    • Face Recognition Technologies have been shown to falsely identify women and People of Color on a routine basis. While progress continues to be made in improving Face Recognition Technologies, wide ranges in accuracy and error rates that differ by race and gender have been found in vendor testing.
    • Community members have raised concerns on the impacts of Face Recognition Technologies on civil liberties and civil rights. In addition, the collection, trade, and use of face biometric information may compromise the privacy of individuals even in their private setting. While these claims are being assessed, the City is creating safeguards aiming to protect Portlanders’ sensitive information until better infrastructure and policies are in place.
    • Portland’s commitment to equity means that we prioritize the safety and well-being of communities of color and other marginalized and vulnerable community members.
    • However, the ban does not apply
      • To the extent necessary for a Private Entity to comply with federal, state, or local laws;
      • For user verification purposes by an individual to access the individual’s own personal or employer issued communication and electronic devices; or
      • In automatic face detection services in social media applications.
  • President Donald Trump has nominated Nathan Simington to replace Federal Communications Commission (FCC) Commissioner Michael O’Rielly. Reports indicate Trump was displeased that O’Rielly was not receptive to Executive Order (EO) 13925 “Preventing Online Censorship” and so declined to renominate O’Rielly for another term. Simington is currently serving as Senior Advisor in the National Telecommunications and Information Administration (NTIA) and is reported to have been deeply involved in the drafting of the EO. A White House press release provided this biography:
    • Among his many responsibilities across the telecommunications industry, he works on 5G security and secure supply chains, the American Broadband Initiative, and is NTIA’s representative to the Government Advisory Committee of the Internet Corporation for Assigned Names and Numbers.
    • Prior to his appointment at NTIA, Mr. Simington was Senior Counsel to Brightstar Corporation, a leading international company in the wireless industry.  In this role, he negotiated deals with companies across the spectrum of the telecommunications and internet industry, including most of the world’s leading wireless carriers. As the head lawyer on the advanced mobility products team, he spearheaded numerous international transactions in the devices, towers and services fields and forged strong relationships with leading telecom equipment manufacturers.  Prior to his career with Brightstar, Mr. Simington was an attorney in private practice with prominent national and international law firms.
    • Following the directive in the EO, on 27 July, the NTIA filed a petition with the FCC, asking the agency to start a rulemaking to clarify alleged ambiguities in 47 USC 230 regarding the limits of the liability shield for the content others post online versus the liability protection for “good faith” moderation by the platform itself.
    • In early August, the FCC asked for comments on the NTIA petition, and comments were due by 2 September. Over 2500 comments have been filed, and a cursory search turned up numerous form letter comments drafted by a conservative organization that were then submitted by members and followers.

Further Reading

  • “I Have Blood on My Hands”: A Whistleblower Says Facebook Ignored Global Political Manipulation” By Craig Silverman, Ryan Mac, and Pranav Dixit — BuzzFeed News. In a blistering memo on her way out the door, a Facebook engineer charged with moderating fake content around the world alleged the company is unconcerned about how the manipulation of its platform is benefitting regimes throughout the world. There is also the implication the company is much more focused on content moderation in the United States (U.S.) and western Europe, possibly because of political pressure from those nations. Worse than allowing repressive and anti-democratic governments to target news organizations and opposition figures, the company was slow to respond when human rights advocates’ accounts were falsely flagged as violating terms of service. The engineer finally quit after sleepless nights of worrying about how her time and efforts may be falling short of protecting people in many nations. She further claimed “[i]t’s an open secret within the civic integrity space that Facebook’s short-term decisions are largely motivated by PR and the potential for negative attention.”
  • “Online learning’s toll on kids’ privacy” By Ashley Gold — Axios. With the shift to online education for many students in the United States, the privacy and data security practices of companies in this space are starting to be examined. But schools and parents may be woefully underinformed about or lack the power to curb some data collection and usage practices. The Federal Trade Commission (FTC) enforces the Children’s Online Privacy Protection Act (COPPA), which critics claim is not strong enough, and to the extent the FTC enforces the law, its enforcement is “woefully insufficient.” Moreover, the differences between richer schools and poorer schools play out with respect to privacy and data security, and the latter group of schools likely cannot afford to vet and use the best companies.
  • “Unlimited Information Is Transforming Society” By Naomi Oreskes and Erik M. Conway — Scientific American. This comprehensive article traces the field of information alongside other technological advances like electricity, nuclear power, and space travel. The authors posit that we are at a new point with information in that creation and transmission of it now flows in two directions whereas for much of history it flowed one way, often from the elites to everyone else.
  • “First death reported following a ransomware attack on a German hospital” By Catalin Cimpanu — ZDNet. The first fatality associated with a ransomware attack happened in Germany when a patient in an ambulance was diverted from a hospital struggling with ransomware. Apparently, the hackers did not even mean to target the hospital in Dusseldorf and instead were aiming to infect and extort a university hospital nearby. Nonetheless, Germany’s Bundesamt für Sicherheit in der Informationstechnik thereafter issued a warning advising entities to patch the CVE-2019-19871 vulnerability in Citrix network gateways.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Peggy und Marco Lachmann-Anke from Pixabay

“Paper” Hearing on COVID-19 and Big Data

On April 9, the Senate Commerce, Science, and Transportation Committee held a virtual hearing of sorts, as all the proceedings would occur through the written word, with the chair, ranking member, and witnesses all submitting statements. All members were then to submit written questions to the witnesses, who would have 96 business hours (roughly 12 business days) to respond. The questions posed to each witness by each member of the committee have been posted on the hearing webpage as well.

In his written statement, Chair Roger Wicker (R-MS) stated “[a]s the public and private sectors race to develop a vaccine for [COVID-19], government officials and health-care professionals have turned to what is known as “big data” to help fight the global pandemic.” He stated “[i]n recognition of the value of big data, Congress recently authorized the CDC, through the bipartisan coronavirus relief package, to develop a modern data surveillance and analytics system,” a reference to the $500 million appropriated “for public health data surveillance and analytics infrastructure modernization.” Wicker said “[t]his system is expected to use public health data inputs – including big data – to track the coronavirus more effectively and reduce its spread.” He added “[s]tate governments are also using big data to monitor the availability of hospital resources and manage supply chains for the distribution of masks and other personal protective medical equipment.”

Wicker remarked,

  • Recent media reports revealed that big data is being used by the mobile advertising industry and technology companies in the United States to track the spread of the virus through the collection of consumer location data.  This location data is purported to be in aggregate form and anonymized so that it does not contain consumers’ personally identifiable information.  It is intended to help researchers identify where large crowds are forming and pinpoint the source of potential outbreaks.  The data may also help predict trends in the transmission of COVID-19 and serve as an early warning system for individuals to self-isolate or quarantine.
  • In addition to these uses, consumer location data is being analyzed to help track the effectiveness of social distancing and stay-at-home guidelines.  Data scientists are also seeking ways to combine artificial intelligence and machine learning technologies with big data to build upon efforts to track patterns, make diagnoses, and identify other environmental or geographic factors affecting the rate of disease transmission.
  • The European Union is turning to big data to stop the spread of the illness as well. Italy, Germany, and others have sought to obtain consumer location data from telecommunications companies to track COVID-19.  To protect consumer privacy, EU member states have committed to using only anonymized and aggregate mobile phone location data.  Although the EU’s General Data Protection Regulation does not apply to anonymized data, EU officials have committed to deleting the data once the public health crisis is over.  

Wicker asserted, “[t]he potential benefits of big data to help contain the virus and limit future outbreaks could be significant.” He stated “[r]educing privacy risks begins with understanding how consumers’ location data – and any other information – is being collected when tracking compliance with social distancing measures.” He contended that “[e]qually important is understanding how that data is anonymized to remove all personally identifiable information and prevent individuals from being re-identified…[and] I look forward to hearing from our witnesses about how consumer privacy can be protected at every stage of the data collection process.”
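
To make the aggregation and anonymization Wicker describes slightly more concrete, the Python sketch below buckets raw location pings into coarse grid cells and hour-long windows, counts distinct devices, and suppresses any bucket with too few devices. It is a deliberately simplified illustration using assumed inputs, not a description of how any particular company actually processes location data, and research on re-identification suggests such coarsening alone is not always enough.

    from datetime import datetime

    def aggregate_pings(pings, grid=0.01, min_devices=10):
        """Report only device counts per coarse grid cell and hour.

        `pings` is an iterable of (device_id, latitude, longitude, timestamp) tuples;
        device identifiers are used solely to count distinct devices and are never emitted.
        """
        devices_per_cell = {}
        for device_id, lat, lon, ts in pings:
            cell = (round(lat / grid) * grid,
                    round(lon / grid) * grid,
                    ts.replace(minute=0, second=0, microsecond=0))
            devices_per_cell.setdefault(cell, set()).add(device_id)
        # Dropping sparsely populated cells is a crude guard against singling out one person.
        return {cell: len(devices)
                for cell, devices in devices_per_cell.items()
                if len(devices) >= min_devices}

    # Example: two pings from the same device in the same hour count as one device.
    counts = aggregate_pings([
        ("a", 45.5231, -122.6765, datetime(2020, 4, 1, 9, 15)),
        ("a", 45.5233, -122.6760, datetime(2020, 4, 1, 9, 45)),
    ], min_devices=1)

Reports built this way publish only the counts, never the device identifiers, which is what allows them to indicate crowding trends without directly tracking individuals.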

Wicker stated, “I also look forward to exploring how consumers are notified about the collection of their location information and their ability to control or opt out of this data collection if desired.” He explained “[g]iven the sensitivity of geolocation data, increased transparency into these practices will help protect consumers from data misuse and other unwanted or unexpected data processing.” Wicker added “I hope to learn more about how location data is being publicly disclosed, with whom it is being shared, and what will be done with any identifiable data at the end of this global pandemic.”

Wicker concluded,

Strengthening consumer data privacy through the development of a strong and bipartisan federal data privacy law has been a priority for this Committee.  The collection of consumer location data to track the coronavirus, although well intentioned and possibly necessary at this time, further underscores the need for uniform, national privacy legislation.  Such a law would provide all Americans with more transparency, choice, and control over their data, as well as ways to keep businesses more accountable to consumers when they seek to use their data for unexpected purposes.  It would also provide certainty and clear, workable rules of the road for businesses in all 50 states, and preserve Americans’ trust and confidence that their data will be protected and secure no matter where they live.

Ranking Member Maria Cantwell (D-WA) asserted, “[r]ight now, we must ensure there are enough hospital beds, enough personal protective equipment, and enough ventilators and medical supplies to withstand the full force of this virus as it peaks in communities across our country” in her opening statement. She stated, “[w]e need robust testing, and as the virus finally fades, we’ll need to deploy contact tracing systems so that we can respond quickly to outbreaks and stamp it out for good.” Cantwell claimed, “[d]ata provides incredible insights that can assist us in these efforts, and we should be doing everything possible to harness information in a manner that upholds our values.” She remarked, “[t]o gain and keep the public’s trust about the use of data, a defined framework should be maintained to protect privacy rights…[that] at a minimum, should ensure that information is used:

(1) for a specific limited purpose, with a measurable outcome and an end date,

(2) in a fully transparent manner with strong consumer rights, and

(3) under strict accountability measures.

Cantwell stated, “[w]e must always focus on exactly how we expect technology to help, and how to use data strategically to these ends…[and] [w]e must resist hasty decisions that will sweep up massive, unrelated data sets.” She further argued, “we must guard against vaguely defined and non-transparent government initiatives with our personal data…[b]ecause rights and data surrendered temporarily during an emergency can become very difficult to get back.”

Cantwell expressed her belief that “there are three advantages to data that need to be harnessed at this time: the power to predict, the power to discover, and the power to persuade.” She remarked, “[d]ata helps us build models based on what has come before…[and] [w]e can use these models to identify patterns to help us prepare for what might be next, whether those are predictions of where disease is spreading, estimations of community needs, or coordination of scarce resources.” Cantwell said, “[l]arge publically available data sets also help us identify patterns and solutions that cannot be seen with a more fragmented, less complete picture.” She asserted, “[d]iscoveries and insights that once were hidden can now be brought to light with the help of advanced data analysis techniques.” She said, “[a]nd when there are vital messages to share, data allows us to get those messages out to everyone who needs to hear them…[and] [m]essages about social distancing, exposure risks, and treatment options are just a few of the many types of essential communications that can be informed and enhanced by data analysis.”

Cantwell summed up:

  • The world is now confronting a challenge of tremendous urgency and magnitude. At some point, we will be opening up our society and our economy again. First, we’re going to need robust testing. And when that time comes, we’re also going to need technology, powered by data, to help us safely transition back to a more normal way of life.
  • Our job in Congress is to help provide the tools needed to turn back this disease, and to understand how we marshal innovation and technology in a responsible way to respond to this challenge, both in the short term and for what we are starting to understand may be a very long fight ahead.
  • We are only at the beginning of this fight. We urgently need to plan for the days and, yes, the years ahead; we must discover, test, and distribute new cures faster than ever before; we need our greatest minds, wherever they may be, to collaborate and work together; and we must build unity because ultimately, that is our greatest strength.

University of Washington Professor of Law Ryan Calo explained

In this testimony, I will address some of the ways people and institutions propose to use data analytics and other technology to respond to coronavirus. The first set of examples involves gaining a better understanding of the virus and its effects on American life. By and large I support these efforts; the value proposition is clear and the privacy harms less pronounced. The second set of examples involves the attempt to track the spread of COVID-19 at an individual level using mobile software applications (“apps”). I am more skeptical of this approach as I fear that it threatens privacy and civil liberties while doing little to address the pandemic. Finally, I conclude with the recommendation that, however we leverage data to fight this pandemic, policymakers limit use cases to the emergency itself, and not permit mission creep or downstream secondary uses that surprise the consumer.

Calo said

I am not opposed to leveraging every tool in our technical arsenal to address the current pandemic. We are facing a near unprecedented global crisis. I note in conclusion that there will be measures that are appropriate in this context, but not beyond it. Americans and their representatives should be vigilant that whatever techniques we use today to combat coronavirus do not wind up being used tomorrow to address other behaviors or achieve other goals. To paraphrase the late Justice Robert Jackson, a problem with emergency powers is that they tend to kindle emergencies.

Calo asserted

In national security, critics speak in terms of mission creep, as when vast surveillance powers conferred to fight terrorism end up being used to enforce against narcotics trafficking or unlawful immigration. In consumer privacy, much thought is given to the prospect of secondary use, i.e., the possibility that data collected for one purpose will be used by a company to effectuate a second, more questionable purpose without asking the data subject for additional permissions. No consumer would or should expect that the absence of certain antibodies in their blood, gathered for the purpose of tracing a lethal disease, could lead to higher health insurance premiums down the line. There is also a simpler danger that Americans will become acclimated to more invasive surveillance partnerships between industry and government. My hope is that policymakers will expressly ensure that any accommodations privacy must concede to the pandemic will not outlive the crisis.

ACT | The App Association Senior Director for Public Policy Graham Dufault explained some of the big data privacy concerns in the COVID-19 crisis:

  • Creating and Using Big Data Sets Consistent with Privacy Expectations. Beyond the Taiwan example described above, other nations are engaging in their own versions of highly targeted surveillance. Israel is tracking citizens’ movements using smartphone location data and even sending text messages to people who were recently near a person known to have been infected with COVID-19, with an order to self-quarantine. While Israeli courts blocked the use of this data to enforce quarantines, even the use of it to send unsolicited text messages and swiftly apply impromptu quarantines raises some questions.
  • By contrast, in the United States, private companies are leading the charge on big data sets about location, with persistent privacy oversight by policymakers. For example, Google is producing reports on foot traffic patterns using smartphone location data. However, there are limitations to the reports because they only use high-level data indicating a percentage decrease or increase in foot traffic in six different types of locations (e.g., workplaces, retail, and recreation sites) over a given period of time. Their vagueness is in part the result of federal and state privacy laws, which generally prohibit deceptive practices, including the disclosure of private data in a manner that is inconsistent with a company’s own privacy policies or where the individual never consented to the disclosure. News articles variously describe these kinds of high-level reports as tracking compliance with stay-at-home orders, but they only do so in an indirect sense and certainly not to the degree to which Taiwan or Israel track compliance, which involves the use of individual location data.
  • With Location Data, Privacy is Possible. Ideally, federal, state, and local governments could enact targeted measures that significantly stem the spread of COVID-19 in high-risk areas and at high-risk times, while enabling certain parts of the economy to open back up where there is mitigation of risk—all with anonymous data. The Private Kit app takes privacy-protective steps that may help provide both actionable data and effective anonymity. For example, when a user downloads the app, it clarifies that location data stays on the user’s phone and does not go to a centralized server. Instead, when turned on, the app tracks the user’s location and stores it in an encrypted format—which it apparently sends, again encrypted, directly to other phones when queried. Theoretically, it would be difficult for any single user of the app to discern the identity of the person signified by one of the dots on the map. The problem Private Kit encounters is whether enough people will download this app quickly enough for it to be useful for policymakers and users. Similar ideas, like NextTrace, have also cropped up, but the effectiveness of these tools may be limited if a single, popular choice does not soon emerge.
  • The COVID-19 Pandemic Underscores the Need for a National Privacy Law. National privacy legislation should ensure companies are using default privacy measures like those described above. Animating some of the privacy concerns policymakers have expressed about the use of big data to address the COVID-19 pandemic is a (not entirely unfair) lack of trust in how tech-driven companies are using sensitive personal data, especially location data. While many of us worry that governmental intrusions to address the COVID-19 pandemic would be difficult to pull back, policymakers also worry that corporate surveillance efforts could later turn into unexpected uses of sensitive data and exposure to additional risk of unauthorized access. The passage of a strong, national privacy framework could help alleviate the stated concerns with private sector use of data.
  • Healthcare Data Remains Siloed. Through the Connected Health Initiative (CHI), we advocate for patients to be able to share their healthcare data with digital health companies that can help them make use of it. But in general, electronic health records (EHR) companies decline to transfer that data except inside their own network of providers and business associates (BAs), citing Health Insurance Portability and Accountability Act (HIPAA) compliance concerns. The problem with this, of course, is that HIPAA is supposed to make data portable, as the name suggests. And EHRs have emerged as a chokepoint for healthcare data that patients should otherwise be able to use as they wish. Besides harming big data competencies, outdated healthcare policies have also directly harmed patients. It would be a great tragedy if we yanked telehealth and remote physiologic monitoring (RPM) away from patients just as the general public begins to realize their potential. Certainly, the ability to rely on telehealth (defined in Medicare as live voice or video visits between patients and caregivers) is a sudden necessity during the pandemic as caregivers must screen and monitor patients from a distance. Avoiding such basic communications technologies because of fraud or abuse concerns when public health demands patients stay at home would be nothing short of a catastrophic win for red tape. What surprises many of us, however, is just how unprepared our relative inability to make use of digital health has made us for pandemics like COVID-19.
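
The decentralized design Dufault describes for Private Kit, in which location data is encrypted and kept on the user’s device rather than sent to a central server, can be sketched in a few lines of Python. The class below is a hypothetical illustration of that general pattern, not Private Kit’s actual code; the class and method names are invented for this example, and it assumes the third-party `cryptography` package for the Fernet cipher.

```python
# Hypothetical sketch of an on-device, encrypted location log.
# Not Private Kit's implementation; illustrates the general pattern only.

import json
import time

from cryptography.fernet import Fernet


class OnDeviceLocationLog:
    """Keeps location fixes only on the device, stored as ciphertext."""

    def __init__(self):
        # The key is generated and held on the device; it never leaves it
        # in this sketch, so shared blobs are opaque to anyone else.
        self._key = Fernet.generate_key()
        self._fernet = Fernet(self._key)
        self._entries = []  # encrypted blobs only, never plaintext

    def record(self, lat: float, lon: float) -> None:
        """Encrypt a location fix before storing it locally."""
        point = json.dumps({"lat": lat, "lon": lon, "ts": time.time()})
        self._entries.append(self._fernet.encrypt(point.encode()))

    def share_encrypted(self) -> list:
        """Return only ciphertext for peer-to-peer exchange."""
        return list(self._entries)

    def decrypt_locally(self) -> list:
        """Only the device holding the key can read its own history."""
        return [json.loads(self._fernet.decrypt(blob)) for blob in self._entries]


if __name__ == "__main__":
    log = OnDeviceLocationLog()
    log.record(47.6062, -122.3321)
    print(log.share_encrypted()[0][:40], "...")  # unreadable without the key
    print(log.decrypt_locally())                 # readable only on this device
```

The design choice that matters for the privacy argument is where the key lives: because it is generated and kept on the device, anything shared off-device stays opaque unless the user chooses to decrypt it.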

Interactive Advertising Bureau Executive Vice President for Public Policy Dave Grimaldi stated

While self-regulation has been a useful mechanism to encourage responsible data use, federal leadership is now needed to ensure that robust consumer privacy protections apply consistently throughout the country. The time is right for the creation of a new paradigm for data privacy in the United States. To this end, IAB is a key supporter of Privacy for America, a broad industry coalition of top trade organizations and companies representing a wide cross-section of the American economy that advocates for federal omnibus privacy legislation. Privacy for America has released a detailed policy framework to provide members of Congress with a new option to consider as they develop data privacy legislation for the United States. Participants in Privacy for America have met with leaders of Congress, the FTC, the Department of Commerce, the White House, and other key stakeholders to discuss the ways the framework protects consumers while also ensuring that beneficial uses of data can continue to provide vast benefits to the economy and mankind.

Grimaldi claimed

The Privacy for America framework would prohibit, rather than allow consent for, a range of practices that make personal data vulnerable to misuse. Many of these prohibitions would apply not only to companies that engage in these harmful practices directly, but to suppliers of data who have reason to know that the personal information will be used for these purposes.

  • Eligibility Determinations. Determining whether individuals are eligible for benefits like a job or credit is among the most important decisions that companies make. Although many of these decisions are currently regulated by existing sectoral laws (e.g., the Fair Credit Reporting Act), companies can easily purchase data on the open market to evade compliance with these laws. Privacy for America’s framework would prevent this abuse by banning the use of data to make eligibility decisions—about jobs, credit, insurance, healthcare, education, financial aid, or housing—outside these sectoral laws, thereby bolstering and clarifying the protections already in place. It also would provide new tools to regulators to cut off the suppliers of data that undermine these protections. To the extent that companies are unsure about whether a practice is permitted under existing law, they would be able to seek guidance from the FTC.
  • Discrimination. The widespread availability of detailed personal information has increased concerns that this data will be used to discriminate against individuals. The new framework envisioned by Privacy for America would supplement existing anti-discrimination laws by banning outright a particularly pernicious form of discrimination—using data to charge higher prices for goods or services based on personal traits such as race, color, religion, national origin, sexual orientation, or gender identity. As discussed below, the framework also would allow individuals to opt out of data personalization, which can contribute to discrimination.
  • Fraud and Deception. For decades, the FTC and the states have pursued cases against companies that engage in fraud and deception. The new framework would focus specifically on the use and supply of data for these purposes. Thus, it would ban a range of fraudulent practices designed to induce the disclosure of personal information and, more generally, material misrepresentations about data privacy and security.
  • Stalking. In recent years, the proliferation of data has made it easier to track the location and activities of individuals for use in stalking. Of note, mobile apps designed for this very purpose have been identified in the marketplace. The framework would outlaw the use of personal information for stalking or other forms of substantial harassment, and would hold these types of apps accountable.
  • Use of Sensitive Data Without Express Consent. Consumers care most about their sensitive data, and companies should have an obligation to protect it. The new framework would prohibit companies from obtaining a range of sensitive information—including health, financial, biometric, and geolocation information, as well as call records, private emails, and device recordings and photos—without obtaining consumers’ express consent.
  • Special Protections for Individuals Over 12 and Under 16 (Tweens). The Privacy for America framework includes a robust set of safeguards for data collected from tweens, an age group that needs protection but is actively engaged online and not subject to constant parental oversight. Specifically, the framework would prohibit companies from transferring tween data to third parties when they have actual knowledge of age. It also would ban payment to tweens for personal data, except under a contract to which a parent or legal guardian is a party. Finally, companies would be required to implement data eraser requirements allowing individuals to delete data posted online when they were tweens.

Center for Democracy and Technology Data and Privacy Project Director Michelle Richardson advised

When deciding what types of data practices are appropriate, Congress should remember that privacy is a balancing of equities. We no longer think of privacy as an on-off switch, or something that can be dismissed after a person agrees to a lengthy privacy policy. It instead weighs the intrusion of any product or program against the benefit of the data use, the secondary effects on individuals, and any mitigating steps that can be taken to minimize harms. As policymakers review data collection, use and sharing, they should:

  • Focus on prevention and treatment, not punishment: Past epidemics have demonstrated that fear is not as effective as clear, meaningful information from a reliable source and the ability to voluntarily comply with medical and governmental directives. Successfully fighting the coronavirus will mean ensuring that a government response does not evolve into law enforcement and broad surveillance functions.
  • Ensure accuracy and effectiveness: There does not appear to be a universally accepted definition of “accurate” or “effective” when it comes to predicting, preventing, or responding to the coronavirus. Nevertheless, if a tool or practice is unlikely to provide meaningful and measurable contributions to the coronavirus response, companies and governments should consider alternatives. This is not only because the privacy risks may not be justified but because people may rely on these measures in lieu of those that actually work.
  • Provide actionable information: In a time of crisis, more information isn’t always better. New data collection or novel data uses should inform individual, corporate, or government behavior in a constructive way. Symptom trackers, for example, may tell a person whether he or she should seek medical care. Contact tracing, on the other hand, when it relies on insufficiently granular data, may result in unnecessary or unproductive quarantine, testing, and fear.
  • Require corporate and government practices that respect privacy: People are reasonably fearful for their own health and the health of their loved ones. The burden for constructing privacy-protective products and responses must not be on concerned citizens but on companies and governments. That includes:
    • A preference for aggregated data. Individually identifiable information should not be used when less intrusive measures will suffice. If aggregated data will not do, industry best practices in anonymization and de-identification must be applied.
    • Minimizing collection, use, and sharing. When identifiable information is necessary, data processing should be limited when possible.
    • Purpose limitations. Data collected or used for the coronavirus response should not be used for secondary purposes. For corporate actors, this means advertising for commercial purposes or unrelated product development. For government actors, that means any function not directly related to their public health functions.
    • Deletion. Data should be deleted when it is no longer necessary for responding to the coronavirus epidemic or conducting public health research, especially if it is personally identifiable.
  • Build services that serve all populations: Newly released data is confirming that minorities are contracting the coronavirus at a higher rate and are more likely to die from it. There are also legitimate questions about how actionable mobility tracking data is for rural, poor, and working-class communities that must travel for work or to secure food and medical care. As technology seeks to find solutions to the coronavirus, it is crucial that it does so in a way that serves all demographics and does not exacerbate existing inequalities.
  • Empower individuals when possible: Epidemic response may not always allow for individualized opt-ins or opt-outs of data collection and use. To the extent possible, participation in data-based programs should be voluntary and individuals should maintain traditional rights to control one’s data.
  • Be transparent to build trust: People will hesitate to participate in programs that involve their personal information but that are not transparent in how that information will be used. Companies that provide data, or inferences from data, and the governmental entities that use such information, must be transparent to users and residents about how data will be used.
  • Be especially rigorous when considering government action: A coordinated government response is necessary for successfully fighting the coronavirus epidemic, but the United States has an important tradition of recognizing that the powers of the state pose unique threats to privacy and liberty.
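
Richardson’s stated preference for aggregated data can be made concrete with a small, purely illustrative Python sketch: individual records are collapsed into coarse per-region counts, and groups below a minimum size are suppressed before anything is shared. The function name and the suppression threshold below are assumptions made for illustration and do not come from the testimony or from any particular anonymization standard.

```python
# Illustrative aggregation with small-count suppression.
# Individual (user, region) records are reduced to regional totals, and
# regions with too few records are dropped so small groups cannot be
# singled out. Threshold and names are hypothetical.

from collections import Counter

SUPPRESSION_THRESHOLD = 10  # hypothetical minimum count to report


def aggregate_visits(visits):
    """Collapse (user_id, region) records into per-region counts,
    suppressing any region with fewer records than the threshold."""
    counts = Counter(region for _user, region in visits)
    return {region: n for region, n in counts.items() if n >= SUPPRESSION_THRESHOLD}


if __name__ == "__main__":
    raw = [("u1", "downtown"), ("u2", "downtown"), ("u3", "parks")] * 5
    # "downtown" has 10 records and is reported; "parks" has 5 and is suppressed.
    print(aggregate_visits(raw))
```

Suppressing small groups is only one common de-identification step; in practice it would sit alongside the minimization, purpose-limitation, and deletion practices Richardson lists above.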