Further Reading, Other Developments, and Coming Events (16 November)

Further Reading

  • “Trump’s refusal to begin the transition could damage cybersecurity” By Joseph Marks — The Washington Post. Former executive branch officials, some of whom served at the Department of Homeland Security (DHS), are warning that the Trump Administration’s refusal to start the transition to the Biden Administration may harm the United States’ (U.S.) ability to manage cyber risks if it stretches on too long.
  • “Biden will get tougher on Russia and boost election security. Here’s what to expect.” By Joseph Marks — The Washington Post. Expect a Biden Administration to restore cybersecurity policy to the prominence it had in the Obama Administration, with renewed diplomatic efforts to foster international consensus against nations like the Russian Federation and the People’s Republic of China. A Biden Presidency will likely continue to pursue the Trump Administration’s larger objectives on the People’s Republic of China, but without the capriciousness of the current President introducing an element of uncertainty. And election security and funding will naturally be a focus, too.
  • “Taking Back Our Privacy” By Anna Wiener — The New Yorker. This fascinating profile of Moxie Marlinspike (yes, that’s really his name), the prime mover behind end-to-end encryption in WhatsApp and his application, Signal (hands down the best messaging app, in my opinion), is worth your time.
  • “Biden’s Transition Team Is Stuffed With Amazon, Uber, Lyft, and Airbnb Personnel” By Edward Ongweso Jr — Vice’s Motherboard. This piece casts a critical eye on a number of members of the Biden-Harris transition team who have been instrumental in policy changes desired by their employers that are seemingly at odds with the President-elect’s policies. It remains to be seen how such personnel may affect the new Administration’s policies.
  • “Officials say firing DHS cyber chief could make U.S. less safe as election process continues” By Joseph Marks — The Washington Post. The head of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) may well be among those purged by the Trump Administration regardless of the costs to national security. CISA Director Christopher Krebs has deftly navigated some of the most fraught, partisan territory in the Trump Administration in leading efforts on election security, but his agency’s webpage, Rumor Control, may have been too much for the White House. Consequently, Krebs has said he expects to be fired, as CISA Assistant Director Bryan Ware was this past week.

Other Developments

  • The Democratic leadership on a key committee wrote the chairs of both the Federal Trade Commission (FTC) and the Federal Communications Commission (FCC), “demanding that the two commissions stop work on all partisan or controversial items currently under consideration in light of the results of last week’s presidential election” per the press release. House Energy and Commerce Committee Chair Frank Pallone Jr. (D-NJ), Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL), and Communications and Technology Subcommittee Chair Mike Doyle (D-PA) argued that FTC Chair Joseph Simons and FCC Chair Ajit Pai should “only pursue consensus and administrative matters that are non-partisan for the remainder of your tenure.” The agencies are, of course, free to dismiss the letters and the request and may well do so, especially in the case of the FCC and its rulemaking on 47 U.S.C. 230. Additionally, as rumored, the FTC may soon file an antitrust case against Facebook for its dominance of the social networking market when Democrats on the FTC and elsewhere might prefer a broader case.
  • The Office of Personnel Management’s (OPM) Office of the Inspector General (OIG) released a pair of audits on the agency’s information security practices and procedures and found continued weaknesses in the agency’s systems. The OPM was breached by People’s Republic of China (PRC) hackers during the Obama Administration and massive amounts of information about government employees was exfiltrated. Since that time, the OPM has struggled to mend its information security and systems.
    • In “Audit of the Information Technology Security Controls of the U.S. Office of Personnel Management’s Agency Common Controls,” the OIG explained that its “audit of the agency common controls listed in the Common Security Control Collection (CSCC) determined that:
      • Documentation assigning roles and responsibilities for the governance of the CSCC does not exist.
      • Inconsistencies in the risk assessment and reporting of deficient controls were identified in the most recent assessment results documentation of the CSCC.
      • Weaknesses identified in an assessment of the CSCC were not tracked through a plan of actions and milestones.
      • Weaknesses identified in an assessment of the CSCC were not communicated to the Information System Security Officers, System Owners or Authorizing Officials of the systems that inherit the controls.
      • We tested 56 of the 94 controls in the CSCC. Of the 56 controls tested, 29 were either partially satisfied or not satisfied. Satisfied controls are fully implemented controls according to the National Institute of Standards and Technology.”
    • And, in the annual Federal Information Security Modernization Act (FISMA) audit, the OIG found middling progress. Specifically, with respect to the FISMA IG Reporting Metrics, the OIG found:
      • Risk Management – OPM has defined an enterprise-wide risk management strategy through its risk management council. OPM is working to implement a comprehensive inventory management process for its system interconnections, hardware assets, and software.
      • Configuration Management – OPM continues to develop baseline configurations and approve standard configuration settings for its information systems. The agency is also working to establish routine audit processes to ensure that its systems maintain compliance with established configurations.
      • Identity, Credential, and Access Management (ICAM) – OPM is continuing to develop its agency ICAM strategy, and acknowledges a need to implement an ICAM program. However, OPM still does not have sufficient processes in place to manage contractors in its environment.
      • Data Protection and Privacy – OPM has implemented some controls related to data protection and privacy. However, there are still resource constraints within OPM’s Office of Privacy and Information Management that limit its effectiveness.
      • Security Training – OPM has implemented a security training strategy and program, and has performed a workforce assessment, but is still working to address gaps identified in its security training needs.
      • Information Security Continuous Monitoring – OPM has established many of the policies and procedures surrounding continuous monitoring, but the agency has not completed the implementation and enforcement of the policies. OPM also continues to struggle to conduct security controls assessments on all of its information systems.
      • Incident Response – OPM has implemented many of the required controls for incident response. Based upon our audit work, OPM has successfully implemented all of the FISMA metrics at the level of “consistently implemented” or higher.
      • Contingency Planning – OPM has not implemented several of the FISMA requirements related to contingency planning, and continues to struggle to maintain its contingency plans as well as conducting contingency plan tests on a routine basis.
  • The Australian Competition and Consumer Commission (ACCC) announced “amendments to the Consumer Data Right Rules…[that] permit the use of accredited intermediaries to collect data, through an expansion of the rules relating to outsourced service providers” per the press release. The ACCC stated “The amendments expand the Consumer Data Right system by allowing for accredited businesses to rely on other accredited businesses to collect Consumer Data Right data on their behalf, so they can provide goods and services to consumers.” The ACCC stated “[t]he Competition and Consumer (Consumer Data Right) Amendment Rules (No. 2) 2020 (Accredited Intermediary Rules) commenced on 2 October 2020 and are available on the Federal Register of Legislation.”
  • Singapore’s central bank called on financial institutions to ramp up cybersecurity because of increased threats during the COVID-19 pandemic. The Monetary Authority of Singapore (MAS)’s Cyber Security Advisory Panel (CSAP) held “its fourth annual meeting with MAS management…[and] shared its insights on cyber risks in the new operating environment and made several recommendations:”
    • Reviewing risk profiles and adequacy of risk mitigating measures. The Panel discussed the risks and vulnerabilities arising from the rapid adoption of remote access technologies and work processes that could affect FIs’ cyber risk profiles. The meeting highlighted the need for FIs to assess if their existing risk profiles have changed and remain acceptable. This is to ensure that in the long run appropriate controls are implemented to mitigate any new risks.  
    • Maintaining oversight of third-party vendors and their controls. With the increased reliance on third-party vendors, the Panel emphasised the need for FIs to step up their oversight of these counterparts and to monitor and secure remote access by third-parties to FIs’ systems. This is even more important during the COVID-19 pandemic where remote working has become pervasive.
    • Strengthening governance over the use of open-source software (OSS). Vulnerabilities in OSS are typically targeted and exploited by threat actors. The Panel recommended that FIs establish policies and procedures on the use of OSS and to ensure these codes are robustly reviewed and tested before they are deployed in the FIs’ IT environment.
  • Washington State Attorney General Bob Ferguson issued his fifth annual Data Breach Report, which “showed that the number of Washingtonians affected by breaches nearly doubled in the last year and ransomware attacks tripled” according to his press release. Ferguson asserted:
    • The total number of Washingtonians affected by a data breach increased significantly, from 351,000 in 2019 to 651,000 in 2020. Overall, there were fewer breaches reported to the Attorney General’s Office in 2020, decreasing from 60 reported breaches last year to 51 this year.
    • Ferguson made the following recommendations:
      • 1. Bring RCW 19.255.005 and RCW 42.56.590 into alignment by making sure that private entities also have to provide notice to consumers for breaches of a consumer’s name and the last four digits of their Social Security number.
      • SB 6187, which was signed by Governor Inslee on March 18, 2020, and went into effect on June 11, 2020, modified the definition of personal information for breaches that occur at local and state agencies. Specifically, the bill modified the definition of personal information in RCW 42.56.590 to include the last four digits of a SSN in combination with a consumer’s name as a stand-alone element that will trigger the requirement for consumer notice. This change should be extended to RCW 19.255.005 as well, to bring both laws into alignment, and provide consumers with the most robust protections possible, regardless of the type of entity that was breached.
      • 2. Expand the definition of “personal information” in RCW 19.255.005 and RCW 42.56.590 to include Individual Tax Identification numbers (ITINs).
      • ITINs are assigned by the IRS to foreign-born individuals who are unable to acquire a Social Security number for the purposes of processing various tax related documents. In other words, they are a unique identifier equivalent in sensitivity to a Social Security number. At present, ten states include ITINs in their definition of “personal information.” In 2018, Washington State was home to just over 1.1 million foreign born individuals, representing approximately 15% of the state’s population.
      • 3. Establish a legal requirement for persons or businesses that store personal information to maintain a risk-based information security program, and to ensure that information is not retained for a period longer than is reasonably required.
      • As this report discussed last year, it is imperative that entities who handle the private information of Washingtonians take the steps necessary to keep it safe, and be prepared to act if they cannot. Such precautions are beneficial for both consumers and the organizations collecting their data. In 2019, a Ponemon Institute report indicated that 48% of the companies surveyed lacked any form of security automation – security technologies used to detect breaches more efficiently than humans can. In 2020, that number dropped by only 7%.
      • In 2019, the average cost of a data breach for companies without automation was nearly twice as expensive as for those who implemented security automation. That cost has only grown since, with data breaches in 2020 costing companies without security automation nearly triple that of businesses that have automation. Similarly, the formation of a dedicated Incident Response Team and testing of an Incident Response Plan reduced the average total cost of breaches in 2020 by more than $2 million.
      • Requiring data collectors to maintain an appropriately sized security program and incident response team and to dispose of consumer information that is no longer needed is a critical next step in mitigating the size and cost of breaches in our state.
  • Four former Secretaries of Homeland Security and two acting Secretaries wrote the leadership of the Congress regarding “the need to consolidate and strengthen Congressional oversight of the Department of Homeland Security (DHS) in order to make possible the fundamental changes that DHS urgently needs to protect the American people from the threats we face in 2021.” They noted “more than 90 different committees or subcommittees today have jurisdiction over DHS—far more than any other cabinet department.” They asserted:
    • DHS urgently needs to make major reforms, improvements, and enhancements to ensure the Department can protect the nation in the way Congress envisioned nearly two decades ago. DHS’s leadership, whether Democratic or Republican, needs to work with a single authorizing committee with broad subject matter authority to enact the changes and authorize the programs that DHS needs to address the threats of 2021.
  • Privacy International (PI) and 13 other groups from the European Union (EU) and Africa wrote the European Commission (EC), arguing the EU’s policies are supporting “the funding and development of projects and initiatives which threaten the right to privacy and other fundamental rights, such as freedom of expression and freedom of assembly.” These groups contended:
    • that by sponsoring such activities, the EU drives the adoption and use of surveillance technologies that, if abused by local actors, can potentially violate the fundamental rights of people residing in those countries. In the absence of rule of law and human rights safeguards enshrined in law, which seek to limit the state’s powers and protect people’s rights, these technologies can be exploited by authorities and other actors with access and result in onerous implications not just for the rights of privacy and data protection but also for other rights, such as freedom of expression and freedom of assembly.
    • In their press release, these groups stated the letter “comes following the public release of hundreds of documents obtained by PI after a year of negotiating with EU bodies under access to documents laws, which show:
      • How police and security agencies in Africa and the Balkans are trained with the EU’s support in spying on internet and social media users and using controversial surveillance techniques and tools; Read PI’s report here.
      • How EU bodies are training and equipping border and migration authorities in non-member countries with surveillance tools, including wiretapping systems and other phone surveillance tools, in a bid to ‘outsource’ the EU’s border controls; Read PI’s report here.
      • How Civipol, a well-connected French security company, is developing mass biometric systems with EU aid funds in Western Africa in order to stop migration and facilitate deportations without adequate risk assessments. Read PI’s report here.
    • They stated “we call on the European Commission, in coordination with the European Parliament and EU member states to:
      • Ensure no support is provided for surveillance or identity systems across external assistance funds and instruments to third countries that lack a clear and effective legal framework governing the use of the surveillance equipment or techniques.
      • Only provide support for surveillance or identity systems after an adequate risk assessment and due diligence are carried out.
      • Provide Parliament greater capabilities of scrutiny and ensuring accountability over funds.
      • All future projects aimed at addressing “the root causes of instability, forced displacement, and irregular migration” should be mainstreamed into the NDICI. In turn, discontinue the EUTF for Africa when the current fund comes to its end in 2020.
      • Ensure that EC Directorate-General for International Cooperation and Development (DEVCO), the EU body in charge of development aid, establishes a new Fund aimed at improving governance and legal frameworks in non-EU countries to promote the right to privacy and data protection. Priorities of the Fund should include:
        • Revising existing privacy and data protection legal frameworks, or where there is none developing new ones, that regulate surveillance by police and intelligence agencies, aimed at ensuring they are robust, effectively implemented, and provide adequate redress for individuals;
        • Strengthening laws or introducing new ones that set out clear guidelines within which the government authorities may conduct surveillance activities;
        • Focusing on promotion and strengthening of democratisation and human rights protections;
        • Strengthening the independence of key monitoring institutions, such as the judiciary, to ensure compliance with human rights standards.

Coming Events

  • On 17 November, the Senate Judiciary Committee will hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • The Senate Homeland Security and Governmental Affairs Committee’s Regulatory Affairs and Federal Management Subcommittee will hold a hearing on how to modernize telework in light of what was learned during the COVID-19 pandemic on 18 November.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by cottonbro from Pexels

House Hearing On Social Media and Extremism

The House Energy and Commerce Committee looked at the effects of social media on the U.S., especially its role in growing radicalization.

The House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee held a hearing on the role social media plays in the proliferation of extremism. Democrats and Republicans continued to articulate different views of the causes and effects of social media in stoking hate and violence. Democrats focused more on white supremacist, racist, and anti-Semitic hate speech, while Republicans focused on hate speech from the left, particularly against law enforcement and Republican government officials. This hearing could serve as a precursor for legislation to reform 47 U.S.C. 230 (aka Section 230 of the Communications Act of 1934).

Chair Jan Schakowsky (D-IL) stated:

  • Throughout our nation’s history, we have seen extremism undermine public faith in our institutions, incite violence, sow division, and spread hate speech. Whether it be the Ku Klux Klan, Neo Nazis or bullying of vulnerable individuals, these attacks are not new to Americans. What is different today is the way social media algorithms can amplify hate speech.
  • Despite many conveniences and benefits for communication, over time social media’s dark side has grown and divided Americans at a time when we need to pull together. As Big Tech developed the online ecosystem and monetized its functions to their enormous benefit, these companies have done little to protect Americans from the dangers lurking in the dark corners. Driven by profit and power, and in the face of obvious harms, these mega companies successfully convinced governments all over the world to leave them alone, lest we disturb the delicate garden they are tending.
  • Big Tech has helped divide our nation, and stoke genocide in others. Consider Myanmar, where we saw mass murder of the Rohingya people.  And these companies have profited at every turn. Consider the QAnon conspiracy theory that has thrived online for years now. Q followers believe that the entire world is controlled by a secret cabal of child abusers who will eventually drink the blood of victims. The FBI has linked the group to domestic terror and considers it a continuing terrorism threat.
  • I would like to commend our colleagues Adam Kinzinger, Tom Malinowski, and Denver Riggleman for confronting this threat head-on.
  • There is no doubt that controversy and extremism drive engagement, and therefore profits. Algorithms that amplify extremist views also amplify profit for these platforms. A 9/11 conspiracy video has been seen 22 million times on Facebook in the last week. Each view keeps eyeballs on the platform and dollars rolling in. Nowhere has Facebook been more negligent than in its oversight of its Group function.
  • Facebook Groups promoting misogyny have grown by 10%, anti-LGBT groups have grown by 22%, and groups promoting antisemitism have grown by 27%…in the past week.
  • In a recent interview, an engineer at Facebook said the group recommendation algorithm is “the scariest feature of the platform – the darkest manifestation.” And he continued, “A user enters one group, and Facebook essentially pigeonholes them into a lifestyle they can never really get out of.”
  • Next week I’ll be circulating draft legislation that aims to fundamentally alter these companies’ business models and give consumers and regulators recourse when these companies fail in their basic commitments to consumers. I hope you’ll all take a look.

Ranking Member Cathy McMorris Rodgers (R-WA) said that as with any disruptive technology, social media has its faults and can cause real harm, especially if companies are not fulfilling their responsibilities. She hoped people would still recognize the internet as an overwhelming force for good, especially in challenging times. Rodgers said social media gives Americans a platform for their voices to be heard and keep people connected to their loved ones. She asserted it offers unlimited access to information and unlimited opportunities for innovation.

Rodgers asserted freedom of speech is central to American democracy, which is what sets the United States apart from nearly every other nation. She argued this bedrock principle is increasingly under attack. Rodgers conceded free speech is not an absolute right and there are exceptions, such as speech harming others or one’s self. She expressed her extreme concern about platforms applying inconsistent moderating policies for their own purposes. Rodgers said that whether platforms are failing to enforce content standards fairly or altering speech to settle scores with political or competitive opponents, there is no clearer example of a platform using its power for political purposes than Twitter, which has singled out President Donald Trump. Twitter, at the same time, has left up blatant violent threats by activists, Democratic candidates, and authoritarian leaders.

Rodgers noted that Twitter’s rules say they are intended to ensure all people can participate in public conversation freely and safely, but argued that is not what is happening. Rodgers stated that to further its leadership’s political agenda, Twitter has instead embraced an inconsistent application of its standards. She argued that for political speech one may disagree with, the answer should not be censorship. Rodgers contended the answer should always be more speech, and harmful speech should be removed regardless of the political leanings of the speaker or the moderator.

Rodgers claimed Twitter has fallen well short of encouraging healthy discourse online. She said the following are on Twitter today, and unlike Trump, they have not been fact-checked or tagged for violation of standards by @Jack. The World Health Organization shared this propaganda from the CCP, quote: ‘Preliminary investigations conducted by the Chinese authorities have found no clear evidence of human-to-human transmission of the novel #coronavirus.’ Rodgers declared that this is false.

Rodgers stated that another example is a well-known online activist on the left who has repeatedly doxxed people and falsely accused innocent people of heinous crimes, such as falsely accusing a Texas state trooper of rape. He also accused an innocent man of murdering a 7-year-old girl. She added that death threats were sent to the man’s family and he ultimately took his own life. Rodgers stated this online activist also used Twitter to threaten the lives of innocent police officers in Kenosha, Wisconsin, but Twitter said this did not violate its standards. Rodgers also pointed to a tweet from a candidate challenging Republican Congressman Brian Mast, quoting it: ‘is that really the new rule they want? Killing is okay if it’s a bad guy? Is it now open season on… Trump… Barr… Kavanaugh… Pompeo.’ Rodgers asked what would happen if the President or any Republican said that about Democrats. She expressed her hope that the Secret Service took this threat more seriously than Jack Dorsey.

Rodgers stated that, bottom line, Twitter continues to tag the President’s tweets with increased frequency as the election approaches, but it has ignored violent threats against Republicans, allowed propaganda pushed by the Chinese Communist Party, tolerated doxxing and the incitement of violence against police officers, and let clear threats by the Supreme Leader of Iran go unchallenged. She claimed this raises the question of what the point of Twitter’s terms of service and content policies is if Jack Dorsey intentionally applies them differently depending on who the user is. Rodgers argued that if Democrats actually cared about extremism, they would call for Twitter to address threats from all political viewpoints.

Rutgers University Miller Center for Community Protection and Resilience Fellow John Donohue argued:

  • The problem we face is that social unrest is being effectively organized in the social-cyber domain, into potential insurgencies, on the basis of memes and short messages hosted and fed by social media companies. This fact is both highly visible on the one hand and fundamentally invisible on the other because though it is ubiquitous, no single entity can contextualize the sheer scale of coded language and memes. Layered on that technical complexity is the legal obligations on law enforcement to ensure the constitutional protections on citizens’ free speech and assembly, at the same time as it tries to distinguish imminent threats to life and destruction of property from jokes.
  • America is at a crossroads: the intersection of constitutional rights and legitimate law enforcement public safety and civil society has never been more at risk by domestic actors as it is now as seditionists actively promote a revolution. However, I remain confident that America remains strong to its founding principles and recommend the following as possible paths forward.
  • Social media companies were slow to act during the rise of ISIS message amplification and recruitment activities. These companies cannot be alone in combatting extremist ideologies and accelerationists, but they are part of the solution. And legislation is needed to ensure those companies work collaboratively with civic leaders across the spectrum for a civil society.
  • Just as the internet is diffuse, the solution cannot reside in singular entity. With regard to extremist actions there needs to be better coordination among law enforcement intelligence capacities, supported by appropriate Department of Justice entities and willing or forced social media companies to rapidly respond to hate driven seditious rhetoric where the content and context clearly demonstrates unlawful activity is about to occur, is occurring or is being planned. Moreover, when there is an imminent threat to life social media platforms cannot be the sole arbiter of what is ‘in’ or ‘out’ of community standards, nor what is appropriate to share with law enforcement or not. Lives depend on it. At the same time, as we protect democracy at a strategic level, our communities are routinely confronted with actual events, unfolding in real time. Those events, such as the Tree of Life Synagogue massacre sometimes have enough lead-time to have law enforcement intervene, and protect life. Structures and policies must be strengthened and supported by the federal government and the social media platforms equally. Those efforts as I mentioned with the national fusion centers are a strong foundation for that effort.
  • These are fundamentally the traditions that now find themselves under direct attack by the extreme left and extreme right alike. How we ultimately move forward together as a country, as Americans, depends how we negotiate this moment in history.

Coalition for a Safer Web President Marc Ginsberg stated

  • Coalition for a Safer Web is on record urging Congress, for the sake of our democracy and the safety of the American people, to end the immunity from content liability accorded social media companies, which has, by judicial extension, enabled fringe web-supported radical and extremist websites to also claim the same immunity from content liability.
  • But we are realists, and we simply do not envision in the foreseeable future a bipartisan agreement to achieve this objective – even splitting off from content immunity the extremist incitement rags which pass as websites, including GAB, 4chan, 8kun, and the other scum of social media catering to terrorists and funneling Russian disinformation and misinformation into our political discourse.
  • That is why CSW developed a public/private sector solution to tackle this dilemma – a new Social Media Standards Board (SMSB). The SMSB would serve as a:
    • Transparent content moderation auditing organization to monitor compliance by social media companies of a new industry “code of conduct” developed with the participation of concerned citizens groups, social media companies, and the advertising industry – which is, after all, the industry with the most leverage over social media and which created a new Global Alliance for Responsible Media (GARM) to accomplish this goal.
    • Forum to incubate and promote new technologies to assist social media companies to fulfill their own customer and vendor obligations to better manage and achieve verifiable commitments to de-platform extremist incitement, disinformation, and misinformation.
  • The SMSB is loosely modeled on the banking industry’s successful Financial Accounting Standards Board (FASB), created in 1973 precisely to harmonize the various standards (think customer terms of service) of banks and develop private sector regulatory mechanisms to hold banks to their industry and regulatory commitments.
  • The following is extracted from CSW’s SMSB proposal dated August 20, 2020 and attached to my testimony, which was also the subject of an article published in The Hill.
    • It envisions passage by Congress of an amendment to Section 230 delegating to the SMSB the power to suspend Section 230 immunity until a violating social media company restores its compliance with the new industry code of conduct. The loss of Section 230 immunity would represent the ultimate penalty imposed on code violators for sustained violations. Lesser sanctions against social media companies imposed by the SMSB code could conceivably include: 1) de-certification from code compliance; 2) forfeiture of digital ad revenue; and 3) a referral by the SMSB for administrative action to the Federal Trade Commission.
    • Should the Subcommittee consider a SMSB worthy of further consideration, I hope it will invite representatives of GARM to testify about how the digital advertising industry intends to use the undeniable financial leverage it has to compel social media companies to abide by verifiable standards which protect their brand safety both in the United States and abroad.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Merakist on Unsplash

Further Reading, Other Developments, and Coming Events (21 August)

Here are Further Reading, Other Developments, and Coming Events.

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.
  • On 22 September, the Federal Trade Commission (FTC) will hold a public workshop “to examine the potential benefits and challenges to consumers and competition raised by data portability.” Comments are due by 21 August; the FTC “is seeking comment on a range of issues including:
    • How are companies currently implementing data portability? What are the different contexts in which data portability has been implemented?
    • What have been the benefits and costs of data portability? What are the benefits and costs of achieving data portability through regulation?
    • To what extent has data portability increased or decreased competition?
    • Are there research studies, surveys, or other information on the impact of data portability on consumer autonomy and trust?
    • Does data portability work better in some contexts than others (e.g., banking, health, social media)? Does it work better for particular types of information over others (e.g., information the consumer provides to the business vs. all information the business has about the consumer, information about the consumer alone vs. information that implicates others such as photos of multiple people, comment threads)?
    • Who should be responsible for the security of personal data in transit between businesses? Should there be data security standards for transmitting personal data between businesses? Who should develop these standards?
    • How do companies verify the identity of the requesting consumer before transmitting their information to another company?
    • How can interoperability among services best be achieved? What are the costs of interoperability? Who should be responsible for achieving interoperability?
    • What lessons and best practices can be learned from the implementation of the data portability requirements in the GDPR and CCPA? Has the implementation of these requirements affected competition and, if so, in what ways?”
  • The Federal Communications Commission (FCC) will hold an open meeting on 30 September, but an agenda is not available at this time.

Other Developments

  • The National Institute of Standards and Technology (NIST) published for input Four Principles of Explainable Artificial Intelligence (Draft NISTIR 8312) in which the authors stated:
    • We introduce four principles for explainable artificial intelligence (AI) that comprise the fundamental properties for explainable AI systems. They were developed to encompass the multidisciplinary nature of explainable AI, including the fields of computer science, engineering, and psychology. Because one-size-fits-all explanations do not exist, different users will require different types of explanations. We present five categories of explanation and summarize theories of explainable AI. We give an overview of the algorithms in the field that cover the major classes of explainable algorithms. As a baseline comparison, we assess how well explanations provided by people follow our four principles. This assessment provides insights to the challenges of designing explainable AI systems.
    • NIST said “our four principles of explainable AI are:
      • Explanation: Systems deliver accompanying evidence or reason(s) for all outputs.
      • Meaningful: Systems provide explanations that are understandable to individual users.
      • Explanation Accuracy: The explanation correctly reflects the system’s process for generating the output.
      • Knowledge Limits: The system only operates under conditions for which it was designed or when the system reaches a sufficient confidence in its output.
    • A year ago, NIST published “U.S. LEADERSHIP IN AI: A Plan for Federal Engagement in Developing Technical Standards and Related Tools,” as required by Executive Order (EO) 13859, Maintaining American Leadership in Artificial Intelligence, which set an August 10, 2019 due date.
      • NIST explained that “[t]here are a number of cross-sector (horizontal) and sector-specific (vertical) AI standards available now and many others are being developed by numerous standards developing organizations (SDOs)…[and] [s]ome areas, such as communications, have well-established and regularly maintained standards in widespread use, often originally developed for other technologies. Other aspects, such as trustworthiness, are only now being considered.” NIST explained that its AI plan “identifies the following nine areas of focus for AI standards: 
        • Concepts and terminology
        • Data and knowledge 
        • Human interactions 
        • Metrics
        • Networking
        • Performance testing and reporting methodology
        • Safety
        • Risk management
        • Trustworthiness
      • NIST asserted that “[i]n deciding which standards efforts merit strong Federal government involvement, U.S. government agencies should prioritize AI standards efforts that are:
        • Consensus-based, where decision-making is based upon clearly established terms or agreements that are understood by all involved parties, and decisions are reached on general agreement.
        • Inclusive and accessible, to encourage input reflecting diverse and balanced communities of users, developers, vendors, and experts. Stakeholders should include representatives from diverse technical disciplines as well as experts and practitioners from non-traditional disciplines of special importance to AI such as ethicists, economists, legal professionals, and policy makers: essentially, accommodating all desiring a “seat at the table.”
        • Multi-path, developed through traditional and novel standards-setting approaches and organizations that best meet the needs of developers and users in the marketplace as well as society at large.
        • Open and transparent, operating in a manner that: provides opportunity for participation by all directly- and materially- affected; has well-established and readily accessible operating rules, procedures, and policies that provide certainty about decision making processes; allows timely feedback for further consideration of the standard; and ensures prompt availability of the standard upon adoption.
        • Result in globally relevant and non-discriminatory standards, where standards avoid becoming non-tariff trade barriers or locking in particular technologies or products.
  • Consumer Watchdog has sued Zoom Video Communications “for making false and deceptive representations to consumers about its data security practices in violation of the District of Columbia Consumer Protection Procedures Act (CPPA).” The advocacy organization asserted
    • To distinguish itself from competitors and attract new customers, Zoom began advertising and touting its use of a strong security feature called “end-to-end encryption” to protect communications on its platform, meaning that the only people who can access the communicated data are the sender and the intended recipient. Using end-to-end encryption prevents unwanted third parties—including the company that owns the platform (in this case, Zoom)—from accessing communications, messages, and data transmitted by users.
    • Unfortunately, Zoom’s claims that communications on its platform were end-to-end encrypted were false. Zoom only used the phrase “end-to-end encryption” as a marketing device to lull consumers and businesses into a false sense of security.
    • The reality is that Zoom is, and has always been, capable of intercepting and accessing any and all of the data that users transmit on its platform—the very opposite of end-to-end encryption. Nonetheless, Zoom relied on its end-to-end encryption claim to attract customers and to build itself into a publicly traded company with a valuation of more than $70 billion.
    • Consumer Watchdog is seeking the greater of treble damages or $1,500 per violation along with other relief.
    • Zoom is being sued in a number of other cases, including two class action suits in United States courts in Northern California (#1 and #2).
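The distinction at the heart of the complaint can be made concrete with a minimal toy sketch in Python. The XOR one-time-pad scheme, the key size, and the variable names below are illustrative assumptions only; this is not Zoom’s (or anyone’s) actual protocol, which would use vetted ciphers and key-exchange mechanisms.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad-style XOR; illustration only, not production cryptography.
    return bytes(a ^ b for a, b in zip(data, key))

# The two endpoints establish a key between themselves; in a genuinely
# end-to-end design, the relaying platform never possesses this key.
shared_key = secrets.token_bytes(64)

plaintext = b"meeting notes"
ciphertext = xor_bytes(plaintext, shared_key)  # all the platform ever relays

# Only the intended recipient, holding the key, can recover the message.
assert xor_bytes(ciphertext, shared_key) == plaintext
```

The sketch illustrates why key custody is the crux: if the platform itself holds or can obtain the key, as the complaint alleges Zoom could, the result is transport encryption at best, not end-to-end encryption.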
  • The United States (U.S.) Government Accountability Office (GAO) decided the Trump Administration violated the order of succession at the U.S. Department of Homeland Security by naming Customs and Border Protection (CBP) Commissioner Kevin McAleenan the acting Secretary after former Secretary Kirstjen Nielsen resigned early in 2019. The agency’s existing order of succession made clear that Cybersecurity and Infrastructure Security Agency (CISA) Director Christopher Krebs was next in line to lead DHS. The GAO added “[a]s such, the subsequent appointments of Under Secretary for Strategy, Policy, and Plans, Chad Wolf and Principal Deputy Director of U.S. Citizenship and Immigration Services (USCIS) Ken Cuccinelli were also improper because they relied on an amended designation made by Mr. McAleenan.”
    • However, the GAO is punting on the question of the implications of its findings:
      • In this decision we do not review the consequences of Mr. McAleenan’s service as Acting Secretary, other than the consequences of the November delegation, nor do we review the consequences of Messrs. Wolf and Cuccinelli’s service as Acting Secretary and Senior Official Performing the Duties of Deputy Secretary, respectively.
      • We are referring the question as to who should be serving as the Acting Secretary and the Senior Official Performing the Duties of Deputy Secretary to the DHS Office of Inspector General for its review.
      • We also refer to the Inspector General the question of consequences of actions taken by these officials, including consideration of whether actions taken by these officials may be ratified by the Acting Secretary and Senior Official Performing the Duties of Deputy Secretary as designated in the April Delegation.
    • The GAO also denied DHS’s request to rescind this opinion because “DHS has not shown that our decision contains either material errors of fact or law, nor has DHS provided information not previously considered that warrants reversal or modification of the decision.”
    • The chairs of the House Homeland Security and Oversight and Reform Committees had requested the GAO legal opinion and claimed in their press release the opinion “conclud[es] that President Donald Trump’s appointments to senior leadership positions at the Department of Homeland Security were illegal and circumvented both the Federal Vacancies Reform Act and the Homeland Security Act.”
  • Top Democrats on the House Energy and Commerce Committee wrote the members of the Facebook Oversight Board expressing their concern the body “does not have the power it needs to change Facebook’s harmful policies.” Chair Frank Pallone, Jr. (D-NJ), Communications and Technology Subcommittee Chair Mike Doyle (D-PA) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) “encouraged the newly appointed members to exert pressure on Facebook to listen to and act upon their policy recommendations, something that is not currently included in the Board Members’ overall responsibilities.” They asserted:
    • The Committee leaders believe Facebook is intentionally amplifying divisive and conspiratorial content because such content attracts more customer usage and, with it, advertising revenue. Pallone, Doyle and Schakowsky were also troubled by recent reports that Facebook had an opportunity to retune its systems responsible for the amplification of this content, but chose not to. 
    • The three Committee leaders wrote that the public interest should be the Oversight Board’s priority and that it should not be influenced by the profit motives of Facebook executives. Pallone, Doyle and Schakowsky also requested the board members answer a series of questions in the coming weeks.
  • The United States (U.S.) Government Accountability Office (GAO) examined how well the United States Department of Homeland Security and selected federal agencies are implementing a cybersecurity program designed to give the government better oversight and control of their networks. In auditing the Continuous Diagnostics and Mitigation (CDM) program, the GAO found limited success and ongoing, systemic roadblocks preventing increased levels of security. DHS has estimated the program will cost $10.9 billion over ten years.
    • The GAO concluded
      • Selected agencies reported that the CDM program had helped improve their awareness of hardware on their networks. However, although the program has been in existence for several years, these agencies had only implemented the foundational capability for managing hardware to a limited extent, including not associating hardware devices with FISMA systems. In addition, while most agencies implemented requirements for managing software, all of them inconsistently implemented requirements for managing configuration settings. Moreover, poor data quality resulting from these implementation shortcomings diminished the usefulness of agency dashboards to support security-related decision making. Until agencies fully and effectively implement CDM program capabilities, including the foundational capability of managing hardware on their networks, agency and federal dashboards will not accurately reflect agencies’ security posture. Part of the reason that agencies have not fully implemented key CDM requirements is that DHS had not ensured integrators had addressed shortcomings with integrators’ CDM solutions for managing hardware and vulnerabilities. Although DHS has taken various actions to address challenges identified by agencies, without further assistance from DHS in helping agencies overcome implementation shortcomings, the program—costing billions of dollars— will likely not fully achieve expected benefits.
    • The chairs and ranking members of the Senate Homeland Security & Governmental Affairs and House Homeland Security Committees, the chair of the House Oversight and Reform Committee, and other Members requested that the GAO study and report on this issue.
  • Google and the Australian Competition and Consumer Commission (ACCC) have exchanged public letters, fighting over the latter’s proposal to ensure that media companies are compensated for articles and content the former uses.
    • In an Open Letter to Australians, Google claimed:
      • A proposed law, the News Media Bargaining Code, would force us to provide you with a dramatically worse Google Search and YouTube, could lead to your data being handed over to big news businesses, and would put the free services you use at risk in Australia.
      • You’ve always relied on Google Search and YouTube to show you what’s most relevant and helpful to you. We could no longer guarantee that under this law. The law would force us to give an unfair advantage to one group of businesses – news media businesses – over everyone else who has a website, YouTube channel or small business. News media businesses alone would be given information that would help them artificially inflate their ranking over everyone else, even when someone else provides a better result. We’ve always treated all website owners fairly when it comes to information we share about ranking. The proposed changes are not fair and they mean that Google Search results and YouTube will be worse for you.
      • You trust us with your data and our job is to keep it safe. Under this law, Google has to tell news media businesses “how they can gain access” to data about your use of our products. There’s no way of knowing if any data handed over would be protected, or how it might be used by news media businesses.
      • We deeply believe in the importance of news to society. We partner closely with Australian news media businesses — we already pay them millions of dollars and send them billions of free clicks every year. We’ve offered to pay more to license content. But rather than encouraging these types of partnerships, the law is set up to give big media companies special treatment and to encourage them to make enormous and unreasonable demands that would put our free services at risk.
    • In its response, the ACCC asserted:
      • The open letter published by Google today contains misinformation about the draft news media bargaining code which the ACCC would like to address. 
      • Google will not be required to charge Australians for the use of its free services such as Google Search and YouTube, unless it chooses to do so.
      • Google will not be required to share any additional user data with Australian news businesses unless it chooses to do so.
      • The draft code will allow Australian news businesses to negotiate for fair payment for their journalists’ work that is included on Google services.
      • This will address a significant bargaining power imbalance between Australian news media businesses and Google and Facebook.
    • Late last month, the ACCC released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” The government in Canberra had asked the ACCC to draft this code earlier this year after talks broke down between the Australian Treasury and the companies.
    • The ACCC explained
      • The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.
    • This is not the ACCC’s first interaction with the companies. Late last year, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law, and that Android users were harmed because those who switched off Location Services were unaware that their location information was still being collected and used by Google, for it was not readily apparent that Web & App Activity also needed to be switched off.
    • A year ago, the ACCC released its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”
  • The United States Coast Guard is asking for information on “the introduction and development of automated and autonomous commercial vessels and vessel technologies subject to U.S. jurisdiction, on U.S. flagged commercial vessels, and in U.S. port facilities.” The Coast Guard is particularly interested in the “barriers to the development of autonomous vessels.” The agency stated
    • On February 11, 2019, the President issued Executive Order (E.O.) 13859, “Maintaining American Leadership in Artificial Intelligence.” The executive order announced the policy of the United States Government to sustain and enhance the scientific, technological, and economic leadership position of the United States in artificial intelligence (AI) research and development and deployment through a coordinated Federal Government strategy. Automation is a broad category that may or may not incorporate many forms of technology, one of which is AI. This request for information (RFI) will support the Coast Guard’s efforts to accomplish its mission consistent with the policies and strategies articulated in E.O. 13859. Input received from this RFI will allow the Coast Guard to better understand, among other things, the intersection between AI and automated or autonomous technologies aboard commercial vessels, and to better fulfill its mission of ensuring our Nation’s maritime safety, security, and stewardship.

Further Reading

  • “‘Boring and awkward’: students voice concern as colleges plan to reopen – through Minecraft” By Kari Paul – The Guardian. A handful of universities in the United States (U.S.) are offering students access to customized Minecraft, an online game that allows players to build worlds. The aim seems to be to allow students to socialize online in replicas of their campuses. The students interviewed for this story seemed underwhelmed by the effort, however.
  • “When regulators fail to rein in Big Tech, some turn to antitrust litigation” – By Reed Albergotti and Jay Greene – The Washington Post. This article places Epic Games’ suits against Apple and Google into the larger context of companies availing themselves of the right to sue under antitrust laws in the United States. However, for a number of reasons, these suits have not often succeeded, and one legal commentator opined that judges tend to see these actions as sour grapes. Still, revelations turned up during discovery can lead antitrust regulators to jump into proceedings, giving the suit additional heft.
  • “What Can America Learn from Europe About Regulating Big Tech?” By Nick Romeo – The New Yorker. A former Member of the European Parliament, Marietje Schaake, from the Netherlands is now a professor at Stanford and is trying to offer a new path on regulating big tech that would rein in the excesses and externalities while allowing new technologies and competition to flourish. The question is whether there is a wide enough appetite for her vision in the European Union, let alone the United States.
  • “Facebook employees internally question policy after India content controversy – sources, memos” By Aditya Kalra and Munsif Vengattil – Reuters. The tech giant is also facing an employee revolt in the world’s largest democracy. Much like in the United States and elsewhere, employees are pressing leadership to explain why they are seemingly not applying the platform’s rules on false and harmful material to hateful speech by leaders. In this case, it was posts by a member of the ruling Bharatiya Janata Party (BJP) calling Indian Muslims traitors. And, much as accusations have been leveled at a top Facebook lobbyist in Washington who has allegedly interceded on behalf of Republicans and far-right interests on questionable material, a lobbyist in New Delhi has done the same for the BJP.
  • “List of 2020 election meddlers includes Cuba, Saudi Arabia and North Korea, US intelligence official says” By Shannon Vavra – cyberscoop. At a virtual event this week, National Counterintelligence and Security Center (NCSC) Director William Evanina claimed that even more nations are trying to disrupt the United States election this fall, including Cuba, Saudi Arabia, and North Korea. Evanina cautioned, however, against thinking the capabilities of these nations rise to the level of the Russian Federation, People’s Republic of China, and Iran. Earlier this month, Evanina issued an update to his late July statement “100 Days Until Election 2020” through “sharing additional information with the public on the intentions and activities of our adversaries with respect to the 2020 election…[that] is being released for the purpose of better informing Americans so they can play a critical role in safeguarding our election.” Evanina offered more in the way of detail on the three nations identified as those being most active in and capable of interfering in the November election: the Russian Federation, the PRC, and Iran. This additional detail may well have been provided given the pressure from Democrats in Congress to do just this. Members like Speaker of the House Nancy Pelosi (D-CA) argued that Evanina was not giving an accurate picture of the actions by foreign nations to influence the outcome and perception of the 2020 election. Republicans in Congress pushed back, claiming Democrats were seeking to politicize the classified briefings given by the Intelligence Community (IC).


Image by Silentpilot from Pixabay

Further Reading and Other Developments (11 July)

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Other Developments

  • The United States District Court for the District of Maine denied a motion by a number of telecommunications trade associations to enjoin enforcement of a new Maine law instituting privacy practices for internet service providers (ISPs) in the state that limits information collection and processing. The plaintiffs claimed the 2017 repeal of the Federal Communications Commission’s (FCC) 2016 ISP Privacy Order preempted states from implementing their own privacy rules for ISPs. In its decision, the court denied the plaintiffs’ motion and will proceed to decide the merits of the case.
  • The European Data Protection Board (EDPB) has debuted a “One-Stop-Shop” register “containing decisions taken by national supervisory authorities following the One-Stop-Shop cooperation procedure (Art. 60 GDPR).” The EDPB explained “[u]nder the GDPR, Supervisory Authorities have a duty to cooperate on cases with a cross-border component to ensure a consistent application of the regulation – the so-called one-stop-shop (OSS) mechanism…[and] [u]nder the OSS, the Lead Supervisory Authority (LSA) is in charge of preparing the draft decisions and works together with the concerned SAs to reach consensus.” Hence this new repository will contain the decisions on which EU data protection authorities have cooperated in addressing alleged GDPR violations that reach across the borders of EU nations.
  • The chair of the House Energy and Commerce Committee and three subcommittee chairs wrote Facebook, Google, and Twitter asking the companies “provide the Committee with monthly reports similar in scope to what you are providing the European Commission regarding your COVID-19 disinformation efforts as they relate to United States users of your platform.” They are also asking that the companies brief them and staff on 22 July on these efforts. Given the Committee’s focus on disinformation, it is quite possible these monthly reports and the briefing could be the basis of more hearings and/or legislation. Chair Frank Pallone, Jr. (D-NJ), Oversight and Investigations Subcommittee Chair Diana DeGette (D-CO), Communications and Technology Subcommittee Chair Mike Doyle (D-PA) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) signed the letters.
  • Reports indicate the Federal Trade Commission (FTC) and Department of Justice (DOJ) are reviewing the February 2019 $5.7 million settlement between the FTC and TikTok for violating the Children’s Online Privacy Protection Act (COPPA). In May 2020, a number of public advocacy groups filed a complaint with the FTC, asking whether the agency has “complied with the consent decree.” If TikTok has violated the order, it could face huge fines as the FTC and DOJ could seek a range of financial penalties. This seems to be another front in the escalating conflict between the United States and the People’s Republic of China.
  • Tech Inquiry, an organization that “seek[s] to combat abuses in the tech industry through coupling concerned tech workers with relevant members of civil society,” revealed “an in-depth analysis of all public US federal (sub)contracting data over the last four and a half years to estimate the rankings of tech companies, both in and out of Silicon Valley, as contractors with the military, law enforcement, and diplomatic arms of the United States.” Tech Inquiry claimed “[o]ur analysis shows a diversity of contracting postures (see Tables 2 and 3), not a systemic divide from Washington. Within a substantial list of namebrand tech companies, only Facebook, Apple, and Twitter look to be staying out of major military and law enforcement contracts.”
  • The United States Secret Service announced the formation of a new Cyber Fraud Task Force (CFTF) which merges “its Electronic Crimes Task Forces (ECTFs) and Financial Crimes Task Forces (FCTFs) into a single unified network.” The rationale given for the merger is “the line between cyber and financial crimes has steadily blurred, to the point today where the two – cyber and financial crimes – cannot be effectively disentangled.”
  • The United States Election Assistance Commission (EAC) held a virtual public hearing, “Lessons Learned from the 2020 Primary Elections” “to discuss the administration of primary elections during the coronavirus pandemic.”
  • The National Council of Statewide Interoperability Coordinators (NCSWIC), a program administered by the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), released its “NCSWIC Strategic Plan and Implementation Guide,” “a stakeholder-driven, multi-jurisdictional, and multi-disciplinary plan to enhance interoperable and emergency communications.” NCSWIC contended “[t]he plan is a critical mid-range (three-year) tool to help NCSWIC and its partners prioritize resources, strengthen governance, identify future investments, and address interoperability gaps.”
  • Access Now is pressing “video conferencing platforms” other than Zoom to issue “regular transparency reports… clarifying exactly how they protect personal user data and enforce policies related to freedom of expression.”

Further Reading

  • “India bans 59 Chinese apps, including TikTok and WeChat, after deadly border clash” – South China Morning Post. As a seeming extension of the military skirmish India and the People’s Republic of China (PRC) engaged in, a number of PRC apps have been banned by the Indian government, raising the question of whether there will be further escalation between the world’s two most populous nations. India is TikTok’s biggest market, with more than 120 million users in the South Asian country, and a range of other apps and platforms also have millions of users. Most of the smartphones used in India are made by PRC entities. Moreover, if New Delhi joins Washington’s war on Huawei, ZTE, and other PRC companies, the cumulative effect could significantly affect the PRC’s global technological ambitions.
  • “Huawei data flows under fire in German court case” – POLITICO. A former Huawei employee in Germany has sued the company alleging violations of the General Data Protection Regulation (GDPR) through the company’s use of standard contractual clauses. This person requested the data the company had collected from him and the reasons for doing so. Huawei claimed it had deleted the data. A German court’s decision that Huawei had violated the GDPR is being appealed. However, some bigger issues are raised by the case, including growing unease within the European Union that firms from the People’s Republic of China may be illegally transferring and processing EU citizens’ data, and a case before Europe’s highest court in which the legality of standard contractual clauses may be determined as early as this month.
  • “Deutsche Telekom under pressure after reports on Huawei reliance” – Politico. A German newspaper reported on confidential documents showing that Deutsche Telekom deepened its relationship with Huawei as the United States government was pressuring its allies and other nations to stop using the equipment and services of the company. The German telecommunications company denied the claims, and a number of German officials expressed surprise and dismay, opining that the government of Chancellor Angela Merkel should act more swiftly to implement legislation to secure Germany’s networks.
  • “Inside the Plot to Kill the Open Technology Fund” – Vice. According to critics, the Trump Administration’s remaking of the United States (US) Agency for Global Media (USAGM) is threatening the mission and effectiveness of the Open Technology Fund (OTF), a US government non-profit designed to help dissidents and endangered populations throughout the world. The OTF has funded a number of open technology projects, including the Signal messaging app, but the new USAGM head, Michael Pack, is pressing for closed source technology.
  • “How Police Secretly Took Over a Global Phone Network for Organized Crime” – Vice. European law enforcement agencies penetrated and compromised an encrypted messaging service in Europe, leading to a number of arrests and seizures of drugs. Encrochat had billed itself as completely secure, but hackers with the French government broke into the system and laid bare the details of numerous crimes. And, this is only the latest encrypted app that is marketed to criminals, meaning others will soon step into the void created when Encrochat shut down.
  • “Virus-Tracing Apps Are Rife With Problems. Governments Are Rushing to Fix Them.” – The New York Times. In numerous nations around the world, the rush to design and distribute contact tracing apps to fight COVID-19 has resulted in a host of problems predicted by information technology professionals and privacy, civil liberties, and human rights advocates. Some apps collect too much information, many are not secure, and some do not seem to perform their intended tasks. Moreover, without mass adoption, the utility of an app is questionable at best. Some countries have sought to improve and perfect their apps in response to criticism, but others are continuing to use and even mandate their citizens and residents use them.
  • “Hong Kong Security Law Sets Stage for Global Internet Fight” – The New York Times. After the People’s Republic of China (PRC) passed a new law that strips many of the protections Hong Kong enjoyed, technology companies are caught in a bind, for now Hong Kong may well start demanding they hand over data on people living in Hong Kong or employees could face jail time. Moreover, the data demands made of companies like Google or Facebook could pertain to people anywhere in the world. Companies that comply with Beijing’s wishes would likely face turbulence in Washington and vice versa. TikTok said it would withdraw from Hong Kong altogether.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Gino Crescoli from Pixabay

Dueling COVID-19 Privacy Bills Released

Democratic stakeholders answer a Republican proposal on how to regulate privacy issues raised by COVID-19 contact tracing. The proposals have little chance of enactment and are mostly about positioning.  

First things first, if you would like to receive my Technology Policy Update, email me. You can find some of these Updates from 2019 and 2020 here.

Late last week, a group of Democratic stakeholders on privacy and data security issues released the “Public Health Emergency Privacy Act” (S.3749), a bill that serves as a counterpoint to the “COVID-19 Consumer Data Protection Act” (S.3663), legislation introduced a few weeks ago to address the privacy issues raised by COVID-19 contact tracing. However, the Democratic bill contains a number of provisions many Republicans consider non-starters, such as a private right of action for individuals and no preemption of state laws. It is not likely this bill will advance in the Senate even though it may possibly be moved in the House. S. 3749 was introduced by Senators Richard Blumenthal (D-CT) and Mark Warner (D-VA) and Representatives Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Suzan DelBene (D-WA).

In a way, the “Public Health Emergency Privacy Act” makes the case that “Health Insurance Portability and Accountability Act” (HIPAA)/“Health Information Technology for Economic and Clinical Health Act” (HITECH Act) regulations are inadequate to protect the privacy and data of people in the United States because these regulations apply only to healthcare providers, their business associates, and other named entities in the healthcare system. Entities collecting healthcare information, even very sensitive information, are not subject to HIPAA/HITECH regulations and would likely be regulated, to the extent they are at all, by the Federal Trade Commission (FTC) or state agencies and attorneys general.

The “Public Health Emergency Privacy Act” would cover virtually all entities, including government agencies, with exceptions for public health authorities (e.g. a state department of health or the Centers for Disease Control and Prevention), healthcare providers, service providers, and people acting in a household capacity. This remarkable definition likely reflects plans announced by governments around the world to eschew Google and Apple’s contact tracing framework and develop their own apps. It would also touch some efforts aside and apart from contact tracing apps. Moreover, this is the first bill introduced recently that proposes to treat public and private entities the same way with respect to how and when they may collect personal data.

The types of data protected under the act are “emergency health data,” defined as “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” The bill then lists a sweeping, comprehensive set of examples of emergency health data. This term includes “information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual” related to testing for COVID-19 and related genetic and biometric information. Geolocation and proximity information would also be covered by this definition. Finally, emergency health data encompasses “any other data collected from a personal device,” which seems to cover all data collection from a person’s phone, the primary means of contact tracing proposed thus far. However, it would appear that data on people collected at the household or higher level may not be subject to this definition and hence much of the bill’s protections.

The authority provided to covered entities is limited. First, collection, use, and disclosure of emergency health data are restricted to good faith public health purposes. And yet, this term is not defined in the act, and the FTC may need to define it during the expedited rulemaking the bill requires. However, it seems a fairly safe assumption the agency and courts would not construe using emergency health data for advertising as a good faith public health purpose. Next, covered entities must allow people to correct inaccurate information, and the entity itself has the duty to take reasonable efforts on its own to correct this information. Covered entities must implement reasonable safeguards to prevent discrimination on the basis of these data, which is a new obligation placed on covered entities in a privacy or data security bill. This provision may have been included to bar government agencies from collecting and using covered data for discriminatory purposes. However, critics of more expansive bills may see this as the camel’s nose under the tent for future privacy and data security bills. Finally, the only government agencies that can legally be provided emergency health data are public health authorities, and then only “for good faith public health purposes and in direct response to exigent circumstances.” Again, the limits of good faith public health purposes remain unclear, and such disclosure can only occur in exigent circumstances, which likely means when a person has contracted or been exposed to COVID-19.

Covered entities and service providers must implement appropriate measures to protect the security and confidentiality of emergency health data, but the third member of the usual triumvirate, availability, is not identified as requiring protection.

The “Public Health Emergency Privacy Act” outright bars a number of uses for emergency health data:

  • Commercial advertising, recommendations for e-commerce, or machine learning for these purposes;
  • Any discriminatory practices related to employment, finance, credit, insurance, housing, or education opportunities; and
  • Discriminating with respect to goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation

It bears noting that covered data cannot be collected or disclosed for these purposes either.

The act contains very strong consent language. Covered entities may not collect, use, or disclose emergency health data unless a person has provided express affirmative consent, which requires knowing, informed choice that cannot be obtained through the use of deceptive practices nor inferred through a person’s inaction. However, there are significant exceptions to this consent requirement that would allow covered entities to collect, use, or disclose these data, including

  • protecting against malicious, deceptive, fraudulent, or illegal activity;
  • detecting, responding to, or preventing information security incidents or threats; or
  • complying with a legal obligation.

But these purposes are valid only when necessary and solely for the fulfillment of the named purpose.

In a related vein, covered entities must allow people to revoke their consent and it must be effective as soon as practicable but no later than 15 days afterward. Moreover, the person’s emergency health data must be destroyed or rendered not linkable within 30 days of revocation of consent.

Covered entities must provide clear and conspicuous notice before or at the point of data collection that explains how and why the emergency health data is collected, used, and disclosed. Moreover, it must also disclose the categories of recipients to whom a covered entity discloses covered data. This notice must also disclose the covered entity’s data retention policies and data security policies and practices but only with respect to emergency health data. The notice must also inform consumers on how they may exercise rights under the act and how to file a complaint with the FTC.

There would also be a public reporting requirement for those covered entities collecting, using, or disclosing the emergency health data of 100,000 or more people. Every three months, these entities would need to report on aggregate figures on the number of people from whom data was collected, used, and disclosed. These reports must also detail the categories of emergency health data collected, used and disclosed and the categories of third parties with whom such data are shared.

The bill requires covered entities to destroy or make not linkable emergency health data 60 days after the Secretary of Health and Human Services declares the end of the COVID-19 public health emergency, or a state does so, or 60 days after collection.
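The bill names three 60-day triggers but does not rank them. As a rough illustration only, with the "earliest applicable trigger" reading being my interpretive assumption (and the function name my own), the destruction deadline might be computed like this:

```python
from datetime import date, timedelta

RETENTION_DAYS = 60  # the 60-day window named in the bill


def destruction_deadline(collected, federal_end=None, state_end=None):
    """Earliest date by which emergency health data must be destroyed or
    rendered not linkable, under one reading of the bill's triggers:
    60 days after collection, after the federal emergency declaration
    ends, or after a state declaration ends -- whichever applies first
    (an assumption; the bill does not rank the triggers)."""
    candidates = [collected + timedelta(days=RETENTION_DAYS)]
    for emergency_end in (federal_end, state_end):
        if emergency_end is not None:
            candidates.append(emergency_end + timedelta(days=RETENTION_DAYS))
    return min(candidates)


# A record collected 2020-05-01, with no end-of-emergency declaration yet,
# would be due for destruction 60 days after collection.
print(destruction_deadline(date(2020, 5, 1)))  # → 2020-06-30
```

Under this reading, an earlier end-of-emergency declaration would pull the deadline forward; a covered entity tracking only the collection date could miss it.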

The FTC would be given the daunting task of beginning a rulemaking 7 days after enactment “to ensure a covered organization that has collected, used, or disclosed emergency health data before the date of enactment of this Act is in compliance with this Act, to the degree practicable” that must be completed within 45 days. There is also a provision requiring the Department of Health and Human Services (HHS) to issue guidance so that HIPAA/HITECH Act regulated entities do not need to meet duplicative requirements, and, moreover, the bill exempts these entities when operating under the HIPAA/HITECH Act regulations.

The FTC, state attorneys general, and people would be able to enforce this new act through a variety of means. The FTC would treat violations as if they were violations of a regulation barring a deceptive or unfair practice, meaning the agency could levy fines in the first instance of more than $43,000 per violation. The FTC could seek a range of other relief, including injunctions, restitution, disgorgement, and remediation. However, the FTC could go to federal court without having to consult with the Department of Justice, which is a departure from current law and almost all the privacy bills introduced in this Congress. The FTC’s jurisdiction would be broadened to include common carriers and non-profits for purposes of this bill, too. The FTC would receive normal notice and comment rulemaking authority that is very broad, as the agency would be free to promulgate any regulations it sees necessary to effectuate this act.

State attorneys general would be able to seek all the same relief the FTC can so long as the latter is not already bringing an action. Moreover, any other state officials empowered by state statute to bring similar actions may do so for violations of this act.

People would be allowed to sue in either federal or state court to vindicate violations. The bill states that any violation is to be considered an injury in fact, to forestall any court from finding that a violation does not injure the person, meaning her suit cannot proceed. The act would allow people to recover between $100 and $1,000 for any negligent violation and between $500 and $5,000 for any reckless, willful, or intentional violation, with no cap on total damages. People may also ask for and receive reasonable attorney’s fees and costs and any equitable or declaratory relief a court sees fit to grant. Moreover, the bill would disallow all pre-lawsuit arbitration agreements or waivers of rights. Much of the right to sue granted to people by the “Public Health Emergency Privacy Act” will be opposed by many Republicans and industry stakeholders.
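Because there is no cap on total damages, exposure scales linearly with the number of violations. A minimal sketch of the statutory ranges (the function and aggregation are mine, not the bill's):

```python
# Statutory per-violation damage ranges named in the bill (no overall cap)
NEGLIGENT_RANGE = (100, 1_000)   # negligent violations
WILLFUL_RANGE = (500, 5_000)     # reckless, willful, or intentional violations


def statutory_damages(negligent: int, willful: int) -> tuple[int, int]:
    """Aggregate (minimum, maximum) statutory damages for a suit alleging
    the given counts of violations; attorney's fees and equitable relief
    would come on top of these figures."""
    low = negligent * NEGLIGENT_RANGE[0] + willful * WILLFUL_RANGE[0]
    high = negligent * NEGLIGENT_RANGE[1] + willful * WILLFUL_RANGE[1]
    return low, high


# Ten negligent and two willful violations: $2,000 to $20,000 in exposure.
print(statutory_damages(10, 2))  # → (2000, 20000)
```

For a defendant whose app touched millions of users, per-violation ranges with no cap are what make this provision the sticking point for industry stakeholders.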

The enforcement section raises a few questions given that entities covered by the bill include government agencies. Presumably, the FTC cannot enforce this act against government agencies for the very good reason it does not have jurisdiction over them. However, does the private right of action waive the federal and state governments’ sovereign immunity? This is not clear and may need clarification if the bill is acted upon in either chamber of Congress.

This bill would not preempt state laws, which, if enacted, could subject covered entities to meeting more than one regime for collecting, using, and disclosing emergency health data.

Apart from contact tracing, the “Public Health Emergency Privacy Act” also bars the use of emergency health data to abridge a person’s right to vote, and it requires HHS, the United States Commission on Civil Rights, and the FTC to prepare and submit to Congress reports that examine “the civil rights impact of the collection, use, and disclosure of health information in response to the COVID–19 public health emergency.”

Given the continued impasse over privacy legislation, it is little wonder that the bill unveiled a few weeks ago by four key Senate Republicans takes a broadly similar approach that differs in key aspects. Of course, there is no private right of action, and it expressly preempts state laws to the contrary.

Generally speaking, the structure of the “COVID–19 Consumer Data Protection Act of 2020” (S.3663) tracks with the bills that have been released thus far by the four sponsors: Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) (See here for analysis of the “Consumer Data Privacy Act of 2019”) and Senators John Thune (R-SD), Jerry Moran (R-KS) (See here for analysis of the “Consumer Data Privacy and Security Act of 2020” (S.3456)), and Marsha Blackburn (R-TN) (See here for analysis of the “Balancing the Rights Of Web Surfers Equally and Responsibly Act of 2019” (BROWSER Act) (S. 1116)). In short, people would be provided with notice about what information the app collects, how it is processed, and with whom and under what circumstances this information will be shared. Then a person would be free to make an informed choice about whether or not she wants to consent and allow the app or technology to operate on her smartphone. The FTC and state attorneys general would enforce the new protections.

The scope of the information and entities covered is narrower than the Democratic bill. “Covered data” is “precise geolocation data, proximity data, a persistent identifier, and personal health information” but not aggregated data, business contact information, de-identified data, employee screening data, and publicly available information. Those entities covered by the bill are those already subject to FTC jurisdiction along with common carriers and non-profits, but government agencies are not, again unlike the bill put forth by Democrats. Entities must abide by this bill to the extent they “collect[], process[], or transfer[] such covered data, or determine[] the means and purposes for the collection, processing, or transfer of covered data.”

Another key definition is how an “individual” is determined, for it excludes any person acting in her role as an employee and almost all work-related capacities, making clear employers will not need to comply with respect to those working for them.

“Personal health information” is “information relating to an individual that is

  • genetic information of the individual; or
  • information relating to the diagnosis or treatment of past, present, or future physical, mental health, or disability of the individual; and
  • identifies, or is reasonably linkable to, the individual.

And yet, this term excludes educational information subject to the “Family Educational Rights and Privacy Act of 1974” or information subject to regulations promulgated pursuant to the HIPAA/HITECH Acts.

The “COVID–19 Consumer Data Protection Act of 2020” bars the collection, processing, or transferring of covered data for a covered purpose unless prior notice is provided, a person has provided affirmative consent, and the covered entity has agreed not to collect, process, or transfer such covered data for any other purpose than those detailed in the notice. However, leaving aside the bill’s enumerated allowable purposes for which covered data may be collected with consent, the bill provides a number of exemptions from this general bar. For example, collection, processing, or transfers necessary for a covered entity to comply with another law are permissible apparently in the absence of a person’s consent. Moreover, a covered entity need not obtain consent for operational or administrative tasks not disclosed in the notice provided to people.

The act does spell out “covered purposes” for which covered entities may collect, process, or transfer covered data with consent after notice has been given:

  • Collecting, processing, or transferring the covered data of an individual to track the spread, signs, or symptoms of COVID–19.
  • Collecting, processing, or transferring the covered data of an individual to measure compliance with social distancing guidelines or other requirements related to COVID–19 that are imposed on individuals under a Federal, State, or local government order.
  • Collecting, processing, or transferring the covered data of an individual to conduct contact tracing for COVID–19 cases.

Covered entities would be required to publish a privacy policy detailing for which of the above covered purposes a person’s covered data would be collected, processed, or transferred. This policy must also detail the categories of entities that receive covered data and its data retention and data security policies.

There would be reporting requirements that would affect more covered entities than the Democratic bill’s. Accordingly, any covered entity collecting, processing, or transferring covered data for one of the enumerated covered purposes must issue a public report 30 days after enactment and then every 60 days thereafter.

Among other provisions in the bill, people would be able to revoke consent, a request that covered entities must honor within 14 days. Covered entities must also ensure the covered data are accurate, but this requirement falls a bit short of people being granted the right to correct inaccurate data, as they would, instead, be merely able to report inaccuracies. There is no recourse if a covered entity chooses not to heed these reports. Covered entities would need to “delete or de-identify all covered data collected, processed, or transferred for a [covered purpose] when it is no longer being used for such purpose and is no longer necessary to comply with a Federal, State, or local legal obligation, or the establishment, exercise, or defense of a legal claim.” Even though there is the commandment to delete or de-identify, the timing as to when that happens seems somewhat open-ended, as some covered entities could seek out legal obligations to meet in order to keep the data on hand.

Covered entities must also minimize covered data by limiting collection, processing, and transferring to “what is reasonably necessary, proportionate, and limited to carry out [a covered purpose.]” The FTC must draft and release guidelines “recommending best practices for covered entities to minimize the collection, processing, and transfer of covered data.” However, these guidelines would most likely be advisory in nature and would not carry the force of law or regulation, leaving covered entities free to disregard some of the document if they choose. Covered entities must “establish, implement, and maintain reasonable administrative, technical, and physical data security policies and practices to protect against risks to the confidentiality, security, and integrity of such data,” a mandate broader than the duty imposed by the Democratic bill.

The FTC and state attorneys general would enforce the new regime. The FTC would be able to seek civil fines for first violations in the same way as the Democrats’ privacy bill. However, unlike the other bill, the Republican bill would nullify any Federal Communications Commission (FCC) jurisdiction to the extent it conflicted with the FTC’s in enforcement of the new act. Presumably, this would address jurisdictional issues raised by placing common carriers under FTC jurisdiction when they are usually overseen by the FCC. Even though state laws are preempted, state attorneys general would be able to bring actions to enforce this act at the state level. And, as noted earlier, there is no private right of action.
