Further Reading, Other Developments, and Coming Events (18 February 2021)

Further Reading

  • “Google, Microsoft, Qualcomm Protest Nvidia’s Acquisition of Arm Ltd.” By David McLaughlin, Ian King, and Dina Bass — Bloomberg. Major United States (U.S.) tech multinationals are telling the U.S. government that Nvidia’s proposed purchase of Arm will hurt competition in the semiconductor market, an interesting position for an industry renowned for being acquisition-hungry. The British firm Arm is a key player in the semiconductor business that deals with all companies, and the fear articulated by firms like Qualcomm, Microsoft, and Google is that Nvidia will cut supply and increase prices once it controls Arm. According to one report, Arm has made something like 95% of the chip architecture for the world’s smartphones and 95% of the chips made in the People’s Republic of China (PRC). The deal has to clear U.S., British, EU, and PRC regulators. In the U.S., the Federal Trade Commission (FTC) has reportedly made very large document requests, which indicates its interest in digging into the deal and suggests the possibility it may come out against the acquisition. The FTC may also be waiting to read the mood in Washington, as there is renewed, bipartisan concern about antitrust and competition and about the semiconductor industry. Finally, acting FTC Chair Rebecca Kelly Slaughter has come out against a lax approach to so-called vertical mergers such as the proposed Nvidia-Arm deal, which may well be the ultimate position of a Democratic FTC.
  • “Are Private Messaging Apps the Next Misinformation Hot Spot?” By Brian X. Chen and Kevin Roose — The New York Times. The conclusion these two tech writers reach is that, on balance, private messaging apps like Signal and Telegram are better for society than not. Moreover, they reason it is better to have extremists migrate from platforms like Facebook to ones where it is much harder to spread their views and proselytize.
  • “Amazon Has Transformed the Geography of Wealth and Power” By Vauhini Vara — The Atlantic. A harrowing view of the rise of Amazon cast against the decline of the middle class and the middle of the United States (U.S.). Correlation is not causation, of course, but the company has sped the decline of a number of industries and arguably a number of cities.
  • “Zuckerberg responds to Apple’s privacy policies: ‘We need to inflict pain’” By Samuel Axon — Ars Technica. Relations between the companies have worsened as their CEOs have taken personal shots at each other in public and private, culminating in Apple’s change to its iOS requiring users to agree to being tracked by apps across the internet, which is Facebook’s bread and butter. Expect things to get worse, as both Tim Cook and Mark Zuckerberg think augmented reality or mixed reality is the next major frontier in tech, suggesting the competition may intensify.
  • “Inside the Making of Facebook’s Supreme Court” By Kate Klonick — The New Yorker. A very immersive piece on the genesis and design of the Facebook Oversight Board, originally conceived of as a supreme court for content moderation. However, not all content moderation decisions can be referred to the Board; in fact, only when Facebook decides to take down content does a person have a right to appeal. Otherwise, one must depend on the company’s beneficence. So, for example, if Facebook decided to leave up content that is racist toward Muslims, a Facebook user could not appeal the decision. Additionally, Board decisions are not precedential, which, in plain English, means that if the Board decides a takedown of, say, Nazi propaganda comports with Facebook’s rules, the company would not be obligated to take down similar Nazi content thereafter. This latter wrinkle will ultimately serve to limit the power of the Board. The piece quotes critics, including many involved with the design and establishment of the Board, who see the final form as being little more than a fig leaf for public relations.

Other Developments

  • The Department of Health and Human Services (HHS) was taken to task by a federal appeals court in a blunt opinion decrying the agency’s failure to articulate even the most basic rationale for a multimillion-dollar fine of a major Houston hospital for its data security and data privacy violations. HHS’ Office for Civil Rights (OCR) had levied a $4.348 million fine on the University of Texas M.D. Anderson Cancer Center (M.D. Anderson) for violations of the regulations promulgated pursuant to the “Health Insurance Portability and Accountability Act of 1996” (P.L. 104–191) and “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) (P.L. 111-5) governing the security and privacy of certain classes of health information. M.D. Anderson appealed the decision, losing at each stage, until it reached the United States Court of Appeals for the Fifth Circuit (Fifth Circuit). In its ruling, the Fifth Circuit held that OCR’s “decision was arbitrary, capricious, and contrary to law.” The Fifth Circuit vacated the penalty and sent the matter back to HHS for further consideration.
    • In its opinion, the Fifth Circuit explained the facts:
      • First, back in 2012, an M.D. Anderson faculty member’s laptop was stolen. The laptop was not encrypted or password-protected but contained “electronic protected health information (ePHI) for 29,021 individuals.” Second, also in 2012, an M.D. Anderson trainee lost an unencrypted USB thumb drive during her evening commute. That thumb drive contained ePHI for over 2,000 individuals. Finally, in 2013, a visiting researcher at M.D. Anderson misplaced another unencrypted USB thumb drive, this time containing ePHI for nearly 3,600 individuals.
      • M.D. Anderson disclosed these incidents to HHS. Then HHS determined that M.D. Anderson had violated two federal regulations. HHS promulgated both of those regulations under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act of 2009 (the “HITECH Act”). The first regulation requires entities covered by HIPAA and the HITECH Act to “[i]mplement a mechanism to encrypt” ePHI or adopt some other “reasonable and appropriate” method to limit access to patient data. 45 C.F.R. §§ 164.312(a)(2)(iv), 164.306(d) (the “Encryption Rule”). The second regulation prohibits the unpermitted disclosure of protected health information. Id. § 164.502(a) (the “Disclosure Rule”).
      • HHS also determined that M.D. Anderson had “reasonable cause” to know that it had violated the rules. 42 U.S.C. § 1320d-5(a)(1)(B) (setting out the “reasonable cause” culpability standard). So, in a purported exercise of its power under 42 U.S.C. § 1320d-5 (HIPAA’s enforcement provision), HHS assessed daily penalties of $1,348,000 for the Encryption Rule violations, $1,500,000 for the 2012 Disclosure Rule violations, and $1,500,000 for the 2013 Disclosure Rule violations. In total, HHS imposed a civil monetary penalty (“CMP” or “penalty”) of $4,348,000.
      • M.D. Anderson unsuccessfully worked its way through two levels of administrative appeals. Then it petitioned our court for review. See 42 U.S.C. § 1320a-7a(e) (authorizing judicial review). After M.D. Anderson filed its petition, the Government conceded that it could not defend its penalty and asked us to reduce it by a factor of 10 to $450,000.
  • The Australian Senate Standing Committee for the Scrutiny of Bills has weighed in on both the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 and the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, two major legislative proposals put forth in December 2020. This committee plays a special role in legislating in the Senate, for it must “scrutinise each bill introduced into the Parliament as to whether the bills, by express words or otherwise:
    • (i)  trespass unduly on personal rights and liberties;
    • (ii)  make rights, liberties or obligations unduly dependent upon insufficiently defined administrative powers;
    • (iii)  make rights, liberties or obligations unduly dependent upon non-reviewable decisions;
    • (iv)  inappropriately delegate legislative powers; or
    • (v)  insufficiently subject the exercise of legislative power to parliamentary scrutiny.”
    • Regarding the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 (see here for analysis), the committee explained:
      • The bill seeks to amend the Surveillance Devices Act 2004 (SD Act), the Crimes Act 1914 (Crimes Act) and associated legislation to introduce three new types of warrants available to the Australian Federal Police (AFP) and the Australian Criminal Intelligence Commission (ACIC) for investigating and disrupting online crime. These are:
        • data disruption warrants, which enable the AFP and the ACIC to modify, add, copy or delete data for the purposes of frustrating the commission of serious offences online;
        • network activity warrants, which permit access to devices and networks used by suspected criminal networks; and
        • account takeover warrants, which provide the AFP and the ACIC with the ability to take control of a person’s online account for the purposes of gathering evidence to further a criminal investigation.
    • The committee flagged concerns about the bill in these categories:
      • Authorisation of coercive powers
        • Issuing authority
        • Time period for warrants
        • Mandatory considerations
        • Broad scope of offences
      • Use of coercive powers without a warrant
        • Emergency authorisations
      • Innocent third parties
        • Access to third party computers, communications in transit and account-based data
        • Compelling third parties to provide information
        • Broad definition of ‘criminal network of individuals’
      • Use of information obtained through warrant processes
        • Prohibitions on use
        • Storage and destruction of records
      • Presumption of innocence—certificate constitutes prima facie evidence
      • Reversal of evidential burden of proof
      • Broad delegation of administrative powers
        • Appropriate authorising officers of the ACIC
    • The committee asked for the following feedback from the government on the bill:
      • The committee requests the minister’s detailed advice as to:
        • why it is considered necessary and appropriate to enable law enforcement officers to disrupt or access data or take over an online account without a warrant in certain emergency situations (noting the coercive and intrusive nature of these powers and the ability to seek a warrant via the telephone, fax or email);
        • the appropriateness of retaining information obtained under an emergency authorisation that is subsequently not approved by a judge or AAT member; and
        • the appropriateness of enabling law enforcement agencies to act to conceal any thing done under a warrant after the warrant has ceased to be in force, and whether the bill could be amended to provide a process for obtaining a separate concealment of access warrant if the original warrant has ceased to be in force.
      • The committee requests the minister’s detailed advice as to:
        • the effect of Schedules 1-3 on the privacy rights of third parties and a detailed justification for the intrusion on those rights, in particular:
        • why proposed sections 27KE and 27KP do not specifically require the judge or nominated AAT member to consider the privacy implications for third parties of authorising access to a third party computer or communication in transit;
        • why the requirement that an issuing authority be satisfied that an assistance order is justifiable and proportionate, having regard to the offences to which it would relate, only applies to an assistance order with respect to data disruption warrants, and not to all warrants; and
        • whether the breadth of the definitions of ‘electronically linked group of individuals’ and ‘criminal network of individuals’ can be narrowed to reduce the potential for intrusion on the privacy rights of innocent third parties.
    • The committee requests the minister’s detailed advice as to:
      • whether all of the exceptions to the restrictions on the use, recording or disclosure of protected information obtained under the warrants are appropriate and whether any exceptions are drafted in broader terms than is strictly necessary; and
      • why the bill does not require review of the continued need for the retention of records or reports comprising protected information on a more regular basis than a period of five years.
    • As the explanatory materials do not adequately address these issues, the committee requests the minister’s detailed advice as to:
      • why it is considered necessary and appropriate to provide for evidentiary certificates to be issued in connection with a data disruption warrant or emergency authorisation, a network access warrant, or an account takeover warrant;
      • the circumstances in which it is intended that evidentiary certificates would be issued, including the nature of any relevant proceedings; and
      • the impact that issuing evidentiary certificates may have on individuals’ rights and liberties, including on the ability of individuals to challenge the lawfulness of actions taken by law enforcement agencies.
    • As the explanatory materials do not address this issue, the committee requests the minister’s advice as to why it is proposed to use offence-specific defences (which reverse the evidential burden of proof) in this instance. The committee’s consideration of the appropriateness of a provision which reverses the burden of proof is assisted if it explicitly addresses relevant principles as set out in the Guide to Framing Commonwealth Offences.
    • The committee requests the minister’s advice as to why it is considered necessary to allow for executive level members of staff of the ACIC to be ‘appropriate authorising officers’, in particular with reference to the committee’s scrutiny concerns in relation to the use of coercive powers without judicial authorisation under an emergency authorisation.
    • Regarding the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, the committee asserted the bill “seeks to establish a mandatory code of conduct to support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The committee requested less input on this bill:
      • requests the Treasurer’s advice as to why it is considered necessary and appropriate to leave the determination of which digital platforms must participate in the News Media and Digital Platforms Mandatory Bargaining Code to delegated legislation.
      • If it is considered appropriate to leave this matter to delegated legislation, the committee requests the Treasurer’s advice as to whether the bill can be amended to require the positive approval of each House of the Parliament before determinations made under proposed section 52E come into effect.
  • The European Data Protection Board (EDPB) issued a statement “on new draft provisions of the second additional protocol to the Council of Europe Convention on Cybercrime (Budapest Convention),” the second time it has weighed in on the rewrite of “the first international treaty on crimes committed via the Internet and other computer networks, dealing particularly with infringements of copyright, computer-related fraud, child pornography and violations of network security.” The EDPB took issue with the process of meeting and drafting new provisions:
    • Following up on the publication of new draft provisions of the second additional protocol to the Budapest Convention, the EDPB therefore, once again, wishes to provide an expert and constructive contribution with a view to ensure that data protection considerations are duly taken into account in the overall drafting process of the additional protocol, considering that the meetings dedicated to the preparation of the additional protocol are being held in closed sessions and that the direct involvement of data protection authorities in the drafting process has not been foreseen in the T-CY Terms of Reference.
    • The EDPB offered itself again as a resource and key stakeholder that needs to be involved with the effort:
      • In November 2019, the EDPB also published its latest contribution to the consultation on a draft second additional protocol, indicating that it remained available for further contributions and called for an early and more proactive involvement of data protection authorities in the preparation of these specific provisions, in order to ensure an optimal understanding and consideration of data protection safeguards (emphasis in the original).
    • The EDPB further asserted:
      • The EDPB remains fully aware that situations where judicial and law enforcement authorities are faced with a “cross-border situation” with regards to access to personal data as part of their investigations can be a challenging reality and recognises the legitimate objective of enhancing international cooperation on cybercrime and access to information. In parallel, the EDPB reiterates that the protection of personal data and legal certainty must be guaranteed, thus contributing to the objective of establishing sustainable arrangements for the sharing of personal data with third countries for law enforcement purposes, which are fully compatible with the EU Treaties and the Charter of Fundamental Rights of the EU. The EDPB furthermore considers it essential to frame the preparation of the additional protocol within the framework of the Council of Europe core values and principles, and in particular human rights and the rule of law.
  • The European Commission (EC) published a statement on how artificial intelligence (AI) “can transform Europe’s health sector.” The EC sketched out legislation it hopes to introduce soon on regulating AI in the European Union (EU). The EC asserted:
    • A high-standard health system, rich health data and a strong research and innovation ecosystem are Europe’s key assets that can help transform its health sector and make the EU a global leader in health-related artificial intelligence applications. 
    • The use of artificial intelligence (AI) applications in healthcare is increasing rapidly.
    • Before the COVID-19 pandemic, challenges linked to our ageing populations and shortages of healthcare professionals were already driving up the adoption of AI technologies in healthcare. 
    • The pandemic has all but accelerated this trend. Real-time contact tracing apps are just one example of the many AI applications used to monitor the spread of the virus and to reinforce the public health response to it.
    • AI and robotics are also key for the development and manufacturing of new vaccines against COVID-19.
    • The European Commission is currently preparing a comprehensive package of measures to address issues posed by the introduction of AI, including a European legal framework for AI to address fundamental rights and safety risks specific to the AI systems, as well as rules on liability related to new technologies.
  • The House Energy and Commerce Committee Chair Frank Pallone, Jr. (D-NJ) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) wrote to Apple CEO Tim Cook “urging review and improvement of Apple’s new App Privacy labels in light of recent reports suggesting they are often misleading or inaccurate.” Pallone and Schakowsky are working from a Washington Post article, in which the paper’s tech columnist learned that Apple’s purported ratings system to inform consumers about the privacy practices of apps is largely illusory and possibly illegally deceptive. Pallone and Schakowsky asserted:
    • According to recent reports, App Privacy labels can be highly misleading or blatantly false. Using software that logs data transmitted to trackers, a reporter discovered that approximately one third of evaluated apps that said they did not collect data had inaccurate labels. For example, a travel app labeled as collecting no data was sending identifiers and other data to a massive search engine and social media company, an app-analytics company, and even a Russian Internet company. A ‘slime simulator’ rated for ages 4 and older had a ‘Data Not Collected’ label, even though the app shares identifying information with major tech companies and shared data about the phone’s battery level, storage, general location, and volume level with a video game software development company.
    • Simplifying and enhancing privacy disclosures is a laudable goal, but consumer trust in privacy labeling approaches may be undermined if Apple’s App Privacy labels disseminate false and misleading information. Without meaningful, accurate information, Apple’s tool of illumination and transparency may become a source of consumer confusion and harm. False and misleading privacy labels can dupe privacy-conscious consumers into downloading data-intensive apps, ultimately eroding the credibility and integrity of the labels. A privacy label without credibility and integrity also may dull the competitive forces encouraging app developers to improve their data practices.
    • A privacy label is no protection if it is false. We urge Apple to improve the validity of its App Privacy labels to ensure consumers are provided meaningful information about their apps’ data practices and that consumers are not harmed by these potentially deceptive practices.
    • Pallone and Schakowsky stated “[t]o better understand Apple’s practices with respect to the privacy labels, we request that you provide written response to the following questions by February 23, 2021:
      • 1. Apple has stated that it conducts routine and ongoing audits of the information provided by developers and works with developers to correct any inaccuracies.
        • a. Please detail the process by which Apple audits the privacy information provided by app developers. Please explain how frequently audits are conducted, the criteria by which Apple selects which apps to audit, and the methods for verifying the accuracy of the privacy information provided by apps.
        • b. How many apps have been audited since the implementation of the App Privacy label? Of those, how many were found to have provided inaccurate or misleading information? 
      • 2. Does Apple ensure that App Privacy labels are corrected upon the discovery of inaccuracies or misleading information? If not, why not? For each app that has been found to have provided inaccurate or misleading information, how quickly was that label corrected?
      • 3. Please detail Apple’s enforcement policies when an app fails to provide accurate privacy information for the App Privacy label.
      • 4. Does Apple require more in-depth privacy disclosures and conduct more stringent oversight of apps targeted to children under the age of 13? If not, why not? If so, please describe the additional disclosures required and the oversight actions employed for these apps.
      • 5. Providing clear and easily comprehensible privacy information at the point of sale is certainly valuable, but privacy policies are not static. Does Apple notify users when one of their app’s privacy labels has materially changed? If not, why not? If so, how are users notified of such changes?
  • The United Kingdom’s Department for Digital, Culture, Media & Sport (DCMS) “published its draft rules of the road for governing the future use of digital identities…[and] [i]t is part of plans to make it quicker and easier for people to verify themselves using modern technology and create a process as trusted as using passports or bank statements” according to its press release. The DCMS wants feedback by 11 March 2021 on the draft trust framework. The DCMS stated:
    • Digital identity products allow people to prove who they are, where they live or how old they are. They are set to revolutionise transactions such as buying a house, when people are often required to prove their identity multiple times to a bank, conveyancer or estate agent, and buying age-restricted goods online or in person.
    • The new ‘trust framework’ lays out the draft rules of the road organisations should follow. It includes the principles, policies, procedures and standards governing the use of digital identity to allow for the sharing of information to check people’s identities or personal details, such as a user’s address or age, in a trusted and consistent way. This will enable interoperability and increase public confidence.
    • The framework, once finalised, is expected to be brought into law. It has specific standards and requirements for organisations which provide or use digital identity services including:
      • Having a data management policy which explains how they create, obtain, disclose, protect, and delete data;
      • Following industry standards and best practice for information security and encryption;
      • Telling the user if any changes, for example an update to their address, have been made to their digital identity;
      • Where appropriate, having a detailed account recovery process and notifying users if organisations suspect someone has fraudulently accessed their account or used their digital identity;
      • Following guidance on how to choose secure authenticators for their service.
  • The European Commission (EC) “opened infringement procedures against 24 Member States for failing to enact new EU telecom rules.”
    • The EC asserted:
      • The European Electronic Communications Code modernises the European regulatory framework for electronic communications, to enhance consumers’ choices and rights, for example by ensuring clearer contracts, quality of services, and competitive markets. The Code also ensures higher standards of communication services, including more efficient and accessible emergency communications. Furthermore, it allows operators to benefit from rules incentivising investments in very-high capacity networks, as well as from enhanced regulatory predictability, leading to more innovative digital services and infrastructures.
      • The European Electronic Communications Code that brings the regulatory framework governing the European telecom sector up to date with the new challenges came into force in December 2018, and Member States have had two years to implement its rules. It is a central piece of legislation to achieve Europe’s Gigabit society and ensure full participation of all EU citizens in the digital economy and society.

Coming Events

  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • The U.S.-China Economic and Security Review Commission will hold a hearing titled “Deterring PRC Aggression Toward Taiwan” on 18 February.
  • On 24 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Fanning the Flames: Disinformation and Extremism in the Media.”
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.
Further Reading, Other Developments, and Coming Events (26, 27, and 28 January 2021)

Further Reading

  • “President Biden’s Tech To-Do List” By Shira Ovide — The New York Times. Another survey of the pressing tech issues President Joe Biden and his Administration will grapple with.
  • “Trying to improve remote learning? A refugee camp offers some surprising lessons” By Javeria Salman — The Hechinger Report. An organization that is helping refugee children advises that digital literacy is the necessary first step in helping all children have positive online learning experiences (assuming of course they have devices and internet access). This means more than being adept with Instagram, TikTok, and Snapchat. They also suggest that children work on projects as opposed to busy work.
  • “Silicon Valley Takes the Battlespace” By Jonathan Guyer — The American Prospect. A company funded, in part, by former Google CEO Eric Schmidt, Rebellion Defense, landed two members on then President-elect Joe Biden’s official transition team, causing some to wonder about the group. This startup writes artificial intelligence (AI) software with defense industry applications, among other products. Schmidt chairs the National Security Commission on Artificial Intelligence and is widely seen as a bridge between Washington and Silicon Valley. Some see the rise of this company as the classic inside-the-Beltway tale of blurring interests and capitalizing on connections and know-how.
  • “The fight to make Netflix and Hulu pay cable fees” By Adi Robertson — The Verge. Municipalities are suing platforms like Netflix, Hulu, Dish Network, DirecTV, and others, claiming they are not paying the franchise fees and quarterly fees traditional cable companies have been subject to for using the localities’ rights of way to deliver service. The companies are, of course, arguing they are not subject to these laws because they are not cable companies. A host of such suits has been filed throughout the United States (U.S.), and they bear watching.
  • “Twitter’s misinformation problem is much bigger than Trump. The crowd may help solve it.” By Elizabeth Dwoskin — The Washington Post. Sounds like Twitter is going the route of Wikipedia with a pilot in which volunteers would fact check and provide context to problematic content. Perhaps this helps address the problems posed by social media platforms.
  • “Biden’s clean up of Silicon Valley poses a problem for Scott Morrison” By Harley Dennett — The Canberra Times. The concern down under is that the Biden Administration will press the Morrison government into weakening the “Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020” that “establishes a mandatory code of conduct to help support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses” according to the Explanatory Memorandum. Doing so would please Google, Facebook, and others, supposedly making them more amenable to the coming policy changes Democrats want to unleash on tech companies. It remains to be seen what the Biden Administration would get in return.
  • “China turbocharges bid to discredit Western vaccines, spread virus conspiracy theories” By Gerry Shih — The Washington Post. In light of more effective vaccines developed by United States (U.S.) companies and a World Health Organization (WHO) team investigating in Wuhan, the People’s Republic of China (PRC) has kicked its propaganda campaign into high gear. All sorts of unsubstantiated claims are being made about the safety and effectiveness of the U.S. vaccines and the source of COVID-19 (allegedly the U.S.).
  • “A Chinese hacking group is stealing airline passenger details” By Catalin Cimpanu — ZDNet. Hackers associated with the People’s Republic of China (PRC) apparently hacked into one of the companies that generates Passenger Name Records (PNR) that detail who flies where and when. There are many uses for these data, including identifying likely foreign intelligence operatives such as Central Intelligence Agency (CIA) agents stationed abroad.
  • Biden Has a Peloton Bike. That Raises Issues at the White House.” By Sheryl Gay Stolberg — The New York Times. This is the level of coverage of the new President. His predecessor used an insecure iPhone that other nations’ intelligence agencies were likely tapping and was famously careless with classified information. And yet, President Joe Biden’s Peloton worries cybersecurity experts. Buried inside the story is the more mundane point that, in the Digital Age, every President presents cybersecurity challenges and tailored solutions are found.
  • Ministry of Electronics asks Whatsapp to withdraw changes to privacy policy, disclose data sharing practice” By Bismah Malik — The New Indian Express. India’s Ministry of Electronics and Information Technology (MeitY) is asking WhatsApp to scrap plans to roll out an already delayed change to its privacy policy. India is the company’s largest market and has already flexed its muscle against other foreign apps it claimed posed dangers to its people, like TikTok. WhatsApp would likely be blocked under a proposed Indian law from moving ahead with its plan to make data people share with WhatsApp business accounts available to Facebook and for advertising. The Data Protection Bill is expected to pass the Parliament this year.
  • WhatsApp Fueled A Global Misinformation Crisis. Now, It’s Stuck In One.” By Pranav Dixit — BuzzFeed News. A nice overview of how WhatsApp and Facebook’s missteps and limited credibility with people resulted in a widely believed misrepresentation about the changes to WhatsApp’s Terms of Service announced earlier this year.
  • Amazon, Facebook, other tech giants spent roughly $65 million to lobby Washington last year” By Tony Romm — The Washington Post. While Amazon and Facebook increased their federal lobbying, Google cut back. It bears note these totals are only for the lobbying these entities are doing directly to the federal government and do not include what they spend on firms and lobbyists in Washington (which is plenty) or their contributions to organizations like the Information Technology Industry Council or the Center for Democracy and Technology (which, again, is a lot). Let’s also not forget political contributions or fundraising by the leadership and senior employees of these companies and political action committees (PAC). Finally, these totals exclude funds spent in state capitals, and I expect tech companies dropped a ton of cash in places like Sacramento and Olympia last year as major privacy legislation was under consideration. Moreover, this article does not take into account whatever the companies are spending in Brussels and other capitals around the world.
  • Google won’t donate to members of Congress who voted against election results” By Ashley Gold — Axios. Speaking of using money to influence the political process, Google has joined other tech companies in pausing donations to Members who voted against certifying President Joe Biden’s victory in the Electoral College (i.e., Senators Ted Cruz (R-TX) and Josh Hawley (R-MO), to name two). We’ll see how long this lasts.
  • FCC’s acting chair says agency reviewing reports of U.S. East Coast internet outages” By Staff — Reuters; “Big Internet outages hit the East Coast, causing issues for Verizon, Zoom, Slack, Gmail” By Rachel Lerman — The Washington Post. On 26 January, there were widespread internet outages on the east coast of the United States (U.S.) that the Federal Communications Commission (FCC) is vowing to investigate. Acting FCC Chair Jessica Rosenworcel tweeted:
    • We have seen reports of internet-related outages on the East Coast, making it difficult for people to work remotely and go to school online. The @FCC Public Safety and Homeland Security Bureau is working to get to the bottom of what is going on.
    • It is not clear where and why the roughly hour-long outage occurred, but early fingers are being pointed at Verizon FIOS.
  • Police Say They Can Use Facial Recognition, Despite Bans” By Alfred Ng — The Markup. No one should be surprised that many police departments are reading bans on using facial recognition technology as narrowly as possible. Nevertheless, legislators and advocates are fighting over the interpretations of these recently passed statutes, almost all of which have been put in place by municipalities. Jurisdictions in the United States may also soon choose to address the use of facial recognition technology by businesses.
  • Why Are Moscow and Beijing Happy to Host the U.S. Far-Right Online?” By Fergus Ryan — Foreign Policy. The enemy of my enemy is my friend, supposedly. Hence, extremist right-wingers, white supremacists, and others are making common cause with the companies of the People’s Republic of China and the Russian Federation by moving their websites and materials to those jurisdictions after getting banned by western companies. Given how closely Beijing and Moscow monitor their nations’ internet, this is surely done with the tacit permission of those governments and quite possibly to the same end as their disinformation campaigns: to disrupt the United States and neutralize it as a rival.
  • After Huawei, Europe’s telcos want ‘open’ 5G networks” By Laurens Cerulus — Politico EU. Europe’s major telecommunications companies, Deutsche Telekom, Telefónica, Vodafone, and Orange, have banded together to support and buy Open RAN technology to roll out 5G instead of buying from Ericsson or Nokia, which promise to do it all. Open RAN would allow smaller companies to build interchangeable pieces of 5G networks since everyone would be working from the same standards. Huawei, of course, has been shut out of many European nations and sees the development as more evidence that western nations are ganging up on it.

Other Developments

  • White House Press Secretary Jen Psaki confirmed that President Joe Biden has directed the United States Intelligence Community (IC) to investigate and report to him on the SolarWinds breach perpetrated by the Russian Federation’s foreign intelligence service, Sluzhba vneshney razvedki Rossiyskoy Federatsii (SVR). Thus far, it appears that many United States (U.S.) agencies and private sector entities were quietly breached in early 2020 and then surveilled for months until FireEye, a private sector cybersecurity company, divulged it had been breached. Given former President Donald Trump’s aversion to acknowledging the malicious acts of Russia, it seemed likely the Biden Administration would start the U.S. response. Interestingly, the Biden Administration is extending two nuclear weapons control treaties at the same time it seeks to undertake this assessment of Russian hacking. And, whatever the results of the assessment, experts agree the Biden Administration has few good options to retaliate and deter future action.
    • At a 21 January press briefing, Psaki stated
      • I can confirm that the United States intends to seek a five-year extension of New START, as the treaty permits.  The President has long been clear that the New START Treaty is in the national security interests of the United States.  And this extension makes even more sense when the relationship with Russia is adversarial, as it is at this time.
      • New START is the only remaining treaty constraining Russian nuclear forces and is an anchor of strategic stability between our two countries.
      • And to the other part of your question: Even as we work with Russia to advance U.S. interests, so too we work to hold Russia to account for its reckless and adversarial actions.  And to this end, the President is also issuing a tasking to the intelligence community for its full assessment of the SolarWinds cyber breach, Russian interference in the 2020 election, its use of chemical weapons against opposition leader Alexei Navalny, and the alleged bounties on U.S. soldiers in Afghanistan.
  • A group of 40 organizations urged President Joe Biden “to avoid appointing to key antitrust enforcement positions individuals who have served as lawyers, lobbyists, or consultants for Amazon, Apple, Facebook, and Google” in a letter sent before his inauguration. Instead, they encouraged him “to appoint experienced litigators or public servants who have recognized the dangers of, rather than helped to exacerbate, these corporations’ market power.” They closed the letter with this paragraph:
    • With your historic election, and the groundbreaking mandate Americans have entrusted you with, you face the challenge of not only rebuilding the country, but also rebuilding trust in government. We believe that appointing antitrust enforcers with no ties to dominant corporations in the industries they will be tasked with overseeing – particularly in regard to the technology sector – will help re-establish public trust in government at a critically important moment in our country’s history. We look forward to working with your administration to ensure powerful technology corporations are held accountable for wrongdoing in the months and years ahead.
    • The signatories include:
      • Public Citizen
      • American Economic Liberties Project
      • Open Markets Institute
      • Revolving Door Project
  • The National Security Agency (NSA) issued an advisory “Adopting Encrypted DNS in Enterprise Environments,” “explaining the benefits and risks of adopting the encrypted domain name system (DNS) protocol, DNS over HTTPS (DoH), in enterprise environments.” This advisory is entirely voluntary and does not bind any class of entities. Moreover, it is the latest in a series of public advisories that has seen the heretofore secretive NSA seek to rival the Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) in advising the owners and operators of cyber infrastructure. The NSA explained:
    • Use of the Internet relies on translating domain names (like “nsa.gov”) to Internet Protocol addresses. This is the job of the Domain Name System (DNS). In the past, DNS lookups were generally unencrypted, since they have to be handled by the network to direct traffic to the right locations. DNS over Hypertext Transfer Protocol over Transport Layer Security (HTTPS), often referred to as DNS over HTTPS (DoH), encrypts DNS requests by using HTTPS to provide privacy, integrity, and “last mile” source authentication with a client’s DNS resolver. It is useful to prevent eavesdropping and manipulation of DNS traffic. While DoH can help protect the privacy of DNS requests and the integrity of responses, enterprises that use DoH will lose some of the control needed to govern DNS usage within their networks unless they allow only their chosen DoH resolver to be used. Enterprise DNS controls can prevent numerous threat techniques used by cyber threat actors for initial access, command and control, and exfiltration.
    • Using DoH with external resolvers can be good for home or mobile users and networks that do not use DNS security controls. For enterprise networks, however, NSA recommends using only designated enterprise DNS resolvers in order to properly leverage essential enterprise cybersecurity defenses, facilitate access to local network resources, and protect internal network information. The enterprise DNS resolver may be either an enterprise-operated DNS server or an externally hosted service. Either way, the enterprise resolver should support encrypted DNS requests, such as DoH, for local privacy and integrity protections, but all other encrypted DNS resolvers should be disabled and blocked. However, if the enterprise DNS resolver does not support DoH, the enterprise DNS resolver should still be used and all encrypted DNS should be disabled and blocked until encrypted DNS capabilities can be fully integrated into the enterprise DNS infrastructure.
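    To see why DoH changes so little for resolvers and so much for observers, note that the message DoH carries is the same wire format DNS has always used; only the transport changes, from plaintext UDP to HTTPS. As a minimal sketch (the helper function name is ours, not the NSA’s), the following builds an RFC 1035 query that a DoH client would then send as the body of an HTTPS POST with Content-Type application/dns-message, per RFC 8484:

    ```python
    import struct

    def build_dns_query(name: str, qtype: int = 1, txid: int = 0) -> bytes:
        """Build a minimal RFC 1035 DNS query (qtype 1 = A record).

        Over classic DNS this packet travels as a plaintext UDP datagram;
        under DoH (RFC 8484) the identical bytes are POSTed to the resolver
        over HTTPS, so on-path observers see only TLS traffic.
        """
        # Header: ID, flags (RD=1, recursion desired), QDCOUNT=1, AN/NS/ARCOUNT=0
        header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
        # Question: QNAME as length-prefixed labels terminated by a zero byte
        qname = b"".join(
            bytes([len(label)]) + label.encode("ascii")
            for label in name.rstrip(".").split(".")
        ) + b"\x00"
        # QTYPE, then QCLASS 1 (IN, the Internet class)
        question = qname + struct.pack(">HH", qtype, 1)
        return header + question

    query = build_dns_query("nsa.gov")
    ```

    The enterprise guidance above follows directly from this design: because any application can wrap such a query in ordinary HTTPS to any DoH resolver, network defenders lose visibility unless they designate an approved resolver and block the rest.
    
    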
  • The United States (U.S.) Government Accountability Office (GAO) has, on its own initiative, sent a report to the chair of the House Oversight Committee that “examines: (1) the Department of Defense’s (DOD) efforts to revise the process for identifying and protecting its critical technologies, and (2) opportunities for DOD’s revised process to inform U.S. government protection programs.” The GAO stated:
    • DOD’s critical technologies—including those associated with an acquisition program throughout its lifecycle or those still early in development—are DOD funded efforts that provide new or improved capabilities necessary to maintain the U.S. technological advantage. For the purposes of this report, we refer to these as critical acquisition programs and technologies. Also for the purposes of this report, U.S. government protection programs are those GAO previously identified across the federal government that are designed to protect critical technologies such as the Arms Export Control System, National Industrial Security Program, and the Committee on Foreign Investment in the U.S.
    • Critical technologies are pivotal to maintaining the U.S. military advantage and, as such, are a frequent target for unauthorized access by adversaries such as through theft, espionage, illegal export, and reverse engineering. DOD has long recognized the need to effectively identify and ensure the consistent protection of these technologies from adversaries, but past efforts have not been fully successful. Recent efforts to revise its process for identifying and protecting its critical acquisition programs and technologies—led by DOD’s Protecting Critical Technology Task Force— offer some improvements.
    • However, DOD can further strengthen its revised process by determining the approach for completing key steps. These steps include ensuring its critical acquisition programs and technologies list is formally communicated to all relevant internal entities and other federal agencies, such as the Department of the Treasury as chair of the Committee on Foreign Investment in the United States, to promote a consistent understanding of what DOD deems critical to protect. They also include developing appropriate metrics that DOD program offices as well as organizations—such as the military departments and Under Secretary of Defense level offices—can use to assess the implementation and sufficiency of the assigned protection measures. Finally, DOD has not yet designated an organization to oversee critical technology protection efforts beyond 2020. As DOD works to develop a policy for its revised process, addressing these issues will not only help improve and ensure continuity in DOD’s protection efforts, but also help ensure government-wide protection efforts are better coordinated as called for in the 2020 National Strategy for Critical and Emerging Technologies.
    • The GAO made three recommendations to the DOD:
      • The Secretary of Defense should direct the Deputy Secretary of Defense in conjunction with the Protecting Critical Technology Task Force to determine a process for formally communicating future critical acquisition programs and technologies lists to all relevant DOD organizations and federal agencies. (Recommendation 1)
      • The Secretary of Defense should direct the Deputy Secretary of Defense in conjunction with the Protecting Critical Technology Task Force to identify, develop, and periodically review appropriate metrics to assess the implementation and sufficiency of the assigned protection measures. (Recommendation 2)
      • The Secretary of Defense should direct the Deputy Secretary of Defense in conjunction with the Protecting Critical Technology Task Force to finalize the decision as to which DOD organization will oversee protection efforts beyond 2020. (Recommendation 3)
  • The National Telecommunications and Information Administration (NTIA) “under sponsorship of and in collaboration with the Department of Defense (DOD) 5G Initiative” “issued a Notice of Inquiry (NOI)…to explore a “5G Challenge” aiming to accelerate the development of an open source 5G ecosystem that can support DOD missions.” The NTIA explained:
    • A key innovation in 5G that is becoming more pervasive in the larger 5G ecosystem is the trend toward “open 5G” architectures that emphasize open interfaces in the network stack. NTIA, under sponsorship of and in collaboration with the DOD 5G Initiative, is seeking comments and recommendations from all interested stakeholders to explore the creation of a 5G Challenge that would accelerate the development of the open 5G stack ecosystem in support of DOD missions.
    • For the purposes of this Notice, NTIA has organized these questions into three broad categories: (1) Challenge structure and goals; (2) incentives and scope; and (3) timeframe and infrastructure support. NTIA seeks public input on any and/or all of these three categories.
  • The Court of Justice of the European Union’s (CJEU) Advocate General has released his opinion in a case on whether a data protection authority (DPA) other than the lead agency in a case may also bring actions in its own courts. The General Data Protection Regulation (GDPR) has a mechanism that organizes the regulation of data protection such that one agency, often the first to act, becomes the lead supervisory authority (LSA), and other DPAs must follow its lead. Most famously, Ireland’s Data Protection Commission (DPC) has been the LSA for the actions Maximillian Schrems brought against Facebook that led to the demise of two adequacy agreements between the United States (U.S.) and the European Union (EU). The CJEU is not obligated to follow the Advocate General’s opinions, but they frequently prove persuasive. In any event, the Advocate General found DPAs may, under some circumstances, bring cases for cross-border infringement even if another DPA is the LSA. Advocate General Michal Bobek summarized the facts of the case:
    • In September 2015, the Belgian data protection authority commenced proceedings before the Belgian courts against several companies belonging to the Facebook group (Facebook), namely Facebook INC, Facebook Ireland Ltd, which is the group’s main establishment in the EU, and Facebook Belgium BVBA (Facebook Belgium). In those proceedings, the data protection authority requested that Facebook be ordered to cease, with respect to any internet user established in Belgium, to place, without their consent, certain cookies on the device those individuals use when they browse a web page in the Facebook.com domain or when they end up on a third party’s website, as well as to collect data by means of social plugins and pixels on third party websites in an excessive manner. In addition, it requested the destruction of all personal data obtained by means of cookies and social plugins, about each internet user established in Belgium.
    • The proceedings at issue are at present in progress before the Hof van beroep te Brussel (Court of Appeal, Brussels, Belgium) with however their scope being limited to Facebook Belgium, as that court previously established that it had no jurisdiction with regard to the actions against Facebook INC and Facebook Ireland Ltd. In this context, Facebook Belgium asserts that, as of the date on which the General Data Protection Regulation (GDPR) has become applicable, the Belgian data protection authority has lost competence to continue the judicial proceedings at issue against Facebook. It contends that, under the GDPR, only the data protection authority of the State of Facebook’s main establishment in the EU (the so-called ‘lead’ data protection authority in the EU for Facebook), namely the Irish Data Protection Commission, is empowered to engage in judicial proceedings against Facebook for infringements of the GDPR in relation to cross-border data processing.
    • Bobek summed up the legal questions presented to the CJEU:
      • Does the GDPR permit a supervisory authority of a Member State to bring proceedings before a court of that State for an alleged infringement of that regulation with respect to cross-border data processing, where that authority is not the lead supervisory authority with regard to that processing?
      • Or does the new ‘one-stop-shop’ mechanism, heralded as one of the major innovations brought about by the GDPR, prevent such a situation from happening? If a controller were called upon to defend itself against a legal challenge concerning cross-border data processing brought by a supervisory authority in a court outside the place of the controller’s main establishment, would that be ‘one-stop-too-many’ and therefore incompatible with the new GDPR mechanism?
    • Bobek made the following findings:
      • [F]irst, that it transpires from the wording of the GDPR that the lead data protection authority has a general competence over cross-border data processing, including the commencement of judicial proceedings for the breach of the GDPR, and, by implication, the other data protection authorities concerned enjoy a more limited power to act in that regard.
      • Second, the Advocate General recalls that the very reason for the introduction of the one-stop-shop mechanism enshrined in the GDPR, whereby a significant role has been given to the lead data protection authority and cooperation mechanisms have been set up to involve other data protection authorities, was to address certain shortcomings resulting from the former legislation. Indeed, economic operators used to be required to comply with the various sets of national rules implementing that legislation, and to liaise, at the same time, with all the national data protection authorities, which proved to be costly, burdensome and time-consuming for those operators, and an inevitable source of uncertainty and conflicts for them and their customers.
      • Third, the Advocate General stresses that the lead data protection authority cannot be deemed as the sole enforcer of the GDPR in cross-border situations and must, in compliance with the relevant rules and time limits provided for by the GDPR, closely cooperate with the other data protection authorities concerned, the input of which is crucial in this area.
  • The United States (U.S.) Department of Defense added more companies from the People’s Republic of China (PRC) to the list of those associated with or controlled by the Chinese Communist Party or the People’s Liberation Army (PLA) “in accordance with the statutory requirement of Section 1237 of the National Defense Authorization Act for Fiscal Year 1999.” The previous lists were released last year (here, here and here.) This designation will almost certainly make doing business in the United States (U.S.) and elsewhere more difficult for the listed companies.
    • The first part of Section 1237 grants the President authority to “exercise International Emergency Economic Powers Act (IEEPA) authorities (other than authorities relating to importation) without regard to section 202 of the IEEPA (50 U.S.C. 1701) in the case of any commercial activity in the United States by a person that is on the list.” IEEPA grants the President sweeping powers to prohibit transactions and block property and property interests for nations and other groups subject to an IEEPA national emergency declaration. Consequently, those companies identified by the DOD on a list per Section 1237 could be blocked and prohibited from doing business with U.S. entities and others and those that do business with such Chinese companies could be subject to enforcement actions by the U.S. government.
    • The statute defines a “Communist Chinese military company” as “any person identified in the Defense Intelligence Agency publication numbered VP-1920-271-90, dated September 1990, or PC-1921-57-95, dated October 1995, and any update of those publications for the purposes of this section; and any other person that is owned or controlled by the People’s Liberation Army; and is engaged in providing commercial services, manufacturing, producing, or exporting.” Considering that the terms “owned” and “controlled” are not spelled out in this section, the executive branch may have very wide latitude in deeming a non-Chinese company as owned or controlled and therefore subject to the President’s use of IEEPA powers. Moreover, since the President already has the authority to declare an emergency and then use IEEPA powers, this language would seem to allow the President to bypass any such declaration and immediately use such powers, except those regarding importation, against any Chinese entities identified on this list by the Pentagon.
  • A group of 13 House Democrats wrote Attorney General designate Merrick Garland asking the Biden Administration “to withdraw from the United States (U.S.) federal government’s lawsuit against the State of California over its net neutrality law as one of the first actions after inauguration.” The Trump Administration had sued California after a measure became law in 2018, mandating net neutrality there in the wake of the Federal Communications Commission’s (FCC) rollback of federal net neutrality rules. The Members argued:
    • In September 2018, then-Governor Jerry Brown signed into law SB 822, the strongest net neutrality law in the country. The Trump Department of Justice (DOJ) sued to overturn California’s law hours later, and associations of telecommunications providers sued within days. Parties to the case agreed to put the case on hold until Mozilla v. FCC was resolved. In that case, the Court of Appeals for the D.C. Circuit vacated the part of the Federal Communications Commission (FCC)’s 2018 Restoring Internet Order (RIF) that preempted state net neutrality laws.
    • The arguments of the Trump DOJ and telecommunications associations in U.S. v. California extend further than even the FCC’s RIF and have implications on the ability of California and other states to regulate many communications and technology policy issues.
    • The Eastern District of California has scheduled a hearing in U.S. v. California for a request for an injunction on January 26, 2021. It is for these reasons, we ask that the federal DOJ withdraw from U.S. v. California shortly after President-elect Biden is inaugurated.
  • On its first day in power, the Biden Administration issued its “National Strategy for the COVID-19 Response and Pandemic Preparedness.” In the cover letter, President Joe Biden stated:
    • For the past year, we could not turn to the federal government for a national plan to answer prayers with action — until today. In the following pages, you will find my Administration’s national strategy to beat the COVID-19 pandemic. It is a comprehensive plan that starts with restoring public trust and mounting an aggressive, safe, and effective vaccination campaign. It continues with the steps we know that stop the spread like expanded masking, testing, and social distancing. It’s a plan where the federal government works with states, cities, Tribal communities, and private industry to increase supply and administer testing and the vaccines that will help reopen schools and businesses safely. Equity will also be central to our strategy so that the communities and people being disproportionately infected and killed by the pandemic receive the care they need and deserve.
    • Given the numerous cyber-attacks and intrusions throughout the pandemic and growing risks to the entire vaccine supply chain, the President asked the Director of National Intelligence Avril Haines to “lead an assessment of ongoing cyber threats and foreign interference campaigns targeting COVID-19 vaccines and related public health efforts” in order to “counter any threat to the vaccination program.” The Administration stated “[t]he U.S. Government will take steps to address cyber threats to the fight against COVID-19, including cyber attacks on COVID-19 research, vaccination efforts, the health care systems and the public health infrastructure.”
    • Specifically, the strategy requires the following:
      • To assist in the Federal Government’s efforts to provide warning of pandemics, protect our biotechnology infrastructure from cyber attacks and intellectual property theft, identify and monitor biological threats from states and non-state actors, provide validation of foreign data and response efforts, and assess strategic challenges and opportunities from emerging biotechnologies, the Director of National Intelligence shall:
        • (i) Review the collection and reporting capabilities in the United States Intelligence Community (IC) related to pandemics and the full range of high-consequence biological threats and develop a plan for how the IC may strengthen and prioritize such capabilities, including through organizational changes or the creation of National Intelligence Manager and National Intelligence Officer positions focused on biological threats, global public health, and biotechnology;
        • (ii) Develop and submit to the President, through the Assistant to the President for National Security Affairs (APNSA) and the COVID-19 Response Coordinator, a National Intelligence Estimate on
          • (A) the impact of COVID-19 on national and economic security; and
          • (B) current, emerging, reemerging, potential, and future biological risks to national and economic security; and
        • (iii)  In coordination with the Secretary of State, the Secretary of Defense, the Secretary of Health and Human Services (HHS), the Director of the Centers for Disease Control and Prevention (CDC), the Administrator of United States Agency for International Development (USAID), the Director of the Office of Science and Technology Policy, and the heads of other relevant agencies, promptly develop and submit to the APNSA an analysis of the security implications of biological threats that can be incorporated into modeling, simulation, course of action analysis, and other analyses.
  • Before the end of the Trump Administration, the Departments of State and Treasury imposed sanctions on a group of Russians for taking part in “a Russia-linked foreign influence network associated with Andrii Derkach, who was designated on September 10, 2020, pursuant to Executive Order (E.O.) 13848 for his attempt to influence the 2020 U.S. Presidential election” according to the Trump Administration Department of State press release. These sanctions emanate from a narrative pushed by Derkach, a likely Russian agent, that the Biden family were engaged in corrupt dealings in Ukraine. Allies of the Trump Campaign pushed this narrative, too, until it failed to gain traction in the public sphere. It is little wonder the last administration waited until the tail end of the Trump presidency to levy such sanctions. State went on to explain:
    • Former Ukraine Government officials Konstantin Kulyk, Oleksandr Onyshchenko, Andriy Telizhenko, and current member of the Ukrainian parliament Oleksandr Dubinsky, have publicly appeared with or affiliated themselves with Derkach through the coordinated dissemination and promotion of fraudulent or unsubstantiated allegations involving a U.S. political candidate.  They have made repeated public statements advancing malicious narratives that U.S. Government officials have engaged in corrupt dealings in Ukraine.  These efforts and narratives are consistent with or in support of Derkach’s objectives to influence the 2020 U.S. presidential election.  As such, these individuals have been designated pursuant to E.O. 13848 for having directly or indirectly engaged in, sponsored, concealed, or otherwise been complicit in foreign influence in an attempt to undermine the 2020 U.S. elections.
    • NabuLeaks, Era-Media, Only News, and Skeptik TOV are media front companies in Ukraine that disseminate false narratives at the behest of Derkach and his associates.  They are being designated pursuant to E.O. 13848 for being owned or controlled by Derkach or his media team.  Today’s action also includes the designation of Petro Zhuravel, Dmytro Kovalchuk, and Anton Simonenko for having materially assisted, sponsored, or provided financial, material, or technological support for, or goods or services to or in support of, Derkach.
    • Additionally, the Department of the Treasury’s Office of Foreign Assets Control (OFAC) “took additional action against seven individuals and four entities that are part of a Russia-linked foreign influence network associated with Andrii Derkach” according to the agency’s press release. OFAC stated “[a]s a result of today’s designations, all property and interests in property of these targets that are subject to U.S. jurisdiction are blocked, and U.S. persons are generally prohibited from engaging in transactions with them. Additionally, any entities 50 percent or more owned by one or more designated persons are also blocked.”
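OFAC’s blocking standard quoted above aggregates ownership: an entity is blocked when designated persons together hold 50 percent or more of it, even if no single designated person meets that threshold. A minimal sketch of that aggregation logic, with hypothetical names and a function of my own devising (the press release defines no such schema):

```python
# Illustrative sketch of OFAC's "50 percent or more owned" blocking
# standard described above. The function and example names are
# hypothetical, not drawn from OFAC guidance.

def is_blocked(ownership, designated):
    """Return True if designated persons' aggregate stake is >= 50%.

    ownership maps owner name -> fractional stake (0.0 to 1.0);
    designated is a set of owner names subject to designation.
    """
    designated_stake = sum(
        stake for owner, stake in ownership.items() if owner in designated
    )
    return designated_stake >= 0.5

# Two designated persons each holding 25% trigger blocking even though
# neither alone reaches the threshold.
holdings = {"Designated A": 0.25, "Designated B": 0.25, "Other": 0.50}
print(is_blocked(holdings, {"Designated A", "Designated B"}))  # True
```

The key design point is that the rule sums stakes across all designated owners rather than testing each owner individually.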
  • The United States (U.S.) Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) published “a draft of the Trusted Internet Connections (TIC) 3.0 Remote User Use Case and the draft National Cybersecurity Protection System (NCPS) Cloud Interface Reference Architecture (NCIRA): Volume 2.” The agency remarked in its press release:
    • The TIC initiative was launched under former President George W. Bush to limit the number of access points federal agencies used to reach the wider internet, based on the logic of physical defense: fewer entry and exit points make for a safer compound. Over time, however, this approach proved problematic, especially as new technology came into use. Consequently, in the aforementioned OMB memorandum, the Trump Administration began a revamp from which these documents flow:
      • To continue to promote a consistent baseline of security capabilities, the Department of Homeland Security (DHS) will define TIC initiative requirements in documentation called TIC Use Cases (refer to Appendix A). TIC Use Case documentation will outline which alternative security controls, such as endpoint and user-based protections, must be in place for specific scenarios in which traffic may not be required to flow through a physical TIC access point. To promote flexibility while maintaining a focus on security outcomes, the capabilities used to meet TIC Use Case requirements may be separate from an agency’s existing network boundary solutions provided by a Trusted Internet Connection Access Provider (TICAP) or Managed Trusted Internet Protocol Services (MTIPS). Given the diversity of platforms and implementations across the Federal Government, TIC Use Cases will highlight proven, secure scenarios, where agencies have met requirements for government-wide intrusion detection and prevention efforts, such as the National Cybersecurity Protection System (including the EINSTEIN suite), without being required to route traffic through a TICAP/MTIPS solution.
    • In the Remote User Use Case, it is explained that
      • The TIC 3.0 Remote User Use Case (Remote User Use Case) defines how network and multi-boundary security should be applied when an agency permits remote users on their network. A remote user is an agency user that performs sanctioned business functions outside of a physical agency premises. The remote user scenario has two distinguishing characteristics:
        • 1. Remote user devices are not directly connected to network infrastructure that is managed and maintained by the agency.
        • 2. Remote user devices are intended for individual use (i.e., not a server).
      • In contrast, when remote user devices are directly connected to local area networks and other devices that are managed and maintained by the agency, it would be considered either an agency campus or a branch office scenario. TIC architectures for agency campus and branch office scenarios are enumerated in the TIC 3.0 Traditional TIC Use Case and the TIC 3.0 Branch Office Use Case respectively.
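The two distinguishing characteristics above amount to a simple decision rule separating the remote user scenario from the campus and branch office scenarios. A minimal sketch, with hypothetical parameter names and return strings (the use case documents define no such schema):

```python
# Illustrative classifier for the TIC 3.0 scenario distinction described
# above. Parameter names and labels are hypothetical; CISA's use cases
# are prose documents, not code.

def tic_scenario(agency_managed_network, individual_use_device):
    """Classify a device per the two distinguishing characteristics.

    A remote user device (1) is not directly connected to agency-managed
    network infrastructure and (2) is intended for individual use.
    """
    if agency_managed_network:
        # Directly connected to agency-managed infrastructure: the
        # Traditional TIC or Branch Office use cases apply instead.
        return "agency campus or branch office"
    if individual_use_device:
        return "remote user"
    # An off-network device not intended for individual use (e.g., a
    # server) falls outside the Remote User Use Case.
    return "out of scope (e.g., a server)"

print(tic_scenario(agency_managed_network=False, individual_use_device=True))
```

The order of the checks mirrors the document’s framing: connection to agency-managed infrastructure is the threshold question, and individual use distinguishes remote users from servers.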
    • In NCIRA, it is stated:
      • The NCPS Cloud Interface Reference Architecture is being released as two individual volumes. The first volume provides an overview of changes to NCPS to accommodate the collection of relevant data from agencies’ cloud environments and provides general reporting patterns for sending cloud telemetry to CISA. This second volume builds upon the concepts presented in NCPS Cloud Interface Reference Architecture: Volume One and provides an index of common cloud telemetry reporting patterns and characteristics for how agencies can send cloud-specific data to the NCPS cloud-based architecture. Individual cloud service providers (CSPs) can refer to the reporting patterns in this volume to offer guidance on their solutions that allow agencies to send cloud telemetry to CISA in fulfillment of NCPS requirements.
  • The Congressional-Executive Commission on China (CECC) published its “2020 Annual Report” “on human rights and the rule of law in China.” The CECC found that:
    • the Chinese government and Communist Party have taken unprecedented steps to extend their repressive policies through censorship, intimidation, and the detention of people in China for exercising their fundamental human rights. Nowhere is this more evident than in the Xinjiang Uyghur Autonomous Region (XUAR) where new evidence emerged that crimes against humanity—and possibly genocide—are occurring, and in Hong Kong, where the ‘‘one country, two systems’’ framework has been effectively dismantled.
    • These policies are in direct violation of China’s Constitution, which guarantees ‘‘freedom of speech, of the press, of assembly, of association, of procession and of demonstration,’’ as well as ‘‘freedom of religious belief.’’ The actions of the Chinese government also contravene both the letter and the spirit of the Universal Declaration of Human Rights; violate its obligations under the International Covenant on Civil and Political Rights, which the Chinese government has signed but not ratified; and violate the International Covenant on Economic, Social, and Cultural Rights, ratified in 2001. Further, the Chinese government has abandoned any pretense of adhering to the legally binding commitments it made to the international community when it signed the 1984 Sino-British Joint Declaration on the future of Hong Kong.
    • President and Party General Secretary Xi Jinping has tightened his grip over China’s one-party authoritarian system, and the Party has further absorbed key government functions while also enhancing its control over universities and businesses. Authorities promoted the official ideology of ‘‘Xi Jinping Thought’’ on social media and required Party members, government officials, journalists, and students to study it, making the ideology both pervasive, and for much of the country, mandatory.
    • Regarding freedom of expression, the CECC recommended:
      • Give greater public expression, including at the highest levels of the U.S. Government, to the issue of press freedom in China, condemning: the harassment and detention of both domestic and foreign journalists; the denial, threat of denial, or delay of visas for foreign journalists; and the censorship of foreign media websites. Consistently link press freedom to U.S. interests, noting that censorship and restrictions on journalists and media websites prevent the free flow of information on issues of public concern, including public health and environmental crises, food safety problems, and corruption, and act as trade barriers for foreign companies attempting to access the Chinese market. Assess the extent to which China’s treatment of foreign journalists contravenes its World Trade Organization commitments and other obligations.
      • Sustain, and where appropriate, expand, programs that develop and widely distribute technologies that will assist Chinese human rights advocates and civil society organizations in circumventing internet restrictions, in order to access and share content protected under international human rights standards. Continue to maintain internet freedom programs for China at the U.S. Department of State and the United States Agency for Global Media to provide digital security training and capacity-building efforts for bloggers, journalists, civil society organizations, and human rights and internet freedom advocates in China.
      • Raise with Chinese officials, during all appropriate bilateral discussions, the cost to U.S.-China relations and to the Chinese public’s confidence in government institutions that is incurred when the Chinese government restricts political debate, advocacy for democracy or human rights, and other forms of peaceful political expression. Emphasize that such restrictions violate international standards for free expression, particularly those contained in Article 19 of the International Covenant on Civil and Political Rights and Article 19 of the Universal Declaration of Human Rights.
  • The Center for Democracy and Technology (CDT) issued its “Recommendations to the Biden Administration and 117th Congress to Advance Civil Rights & Civil Liberties in the Digital Age” that called for reform to content moderation, election law, privacy, big data, and other policy areas.
  • A United States (U.S.) federal court denied Parler’s request for a preliminary injunction against Amazon Web Services (AWS) after the latter shut down the former’s website for repeated violations of their contract, including the use of the conservative-leaning platform during the 6 January 2021 insurrection at the United States Capitol. Parler was essentially asking the court to force AWS to once again host its website while its litigation was pending. The court reviewed Parler’s claims and clarified the scope of the case:
    • In its Complaint, Parler asserts three claims: (1) for conspiracy in restraint of trade, in violation of the Sherman Act, 15 U.S.C. § 1; (2) for breach of contract; and (3) for tortious interference with business expectancy. AWS disputes all three claims, asserting that it is Parler, not AWS, that has violated the terms of the parties’ Agreement, and in particular AWS’s Acceptable Use Policy, which prohibits the “illegal, harmful, or offensive” use of AWS services.
    • It is important to note what this case is not about. Parler is not asserting a violation of any First Amendment rights, which exist only against a governmental entity, and not against a private company like AWS. And indeed, Parler has not disputed that at least some of the abusive and violent posts that gave rise to the issues in this case violate AWS’s Acceptable Use Policy. This motion also does not ask the Court to make a final ruling on the merits of Parler’s claims. As a motion for a preliminary injunction, before any discovery has been conducted, Parler seeks only to have the Court determine the likelihood that Parler will ultimately prevail on its claims, and to order AWS to restore service to Parler pending a full and fair litigation of the issues raised in the Complaint.
    • However, the court ruled against Parler:
      • Parler has failed to meet the standard set by Ninth Circuit and U.S. Supreme Court precedent for issuance of a preliminary injunction. To be clear, the Court is not dismissing Parler’s substantive underlying claims at this time. Parler has fallen far short, however, of demonstrating, as it must, that it has raised serious questions going to the merits of its claims, or that the balance of hardships tips sharply in its favor. It has also failed to demonstrate that it is likely to prevail on the merits of any of its three claims; that the balance of equities tips in its favor, let alone strongly so; or that the public interests lie in granting the injunction.
  • The United States (U.S.) Department of Commerce’s National Telecommunications and Information Administration (NTIA) issued a statutorily required “National Strategy to Secure 5G Implementation Plan” and Appendices. The NTIA explained:
    • In accordance with the Secure 5G and Beyond Act of 2020, the Executive Branch has developed a comprehensive implementation plan. This implementation will be managed under the leadership of the National Security Council and the National Economic Council, supported by the National Telecommunications and Information Administration (NTIA), and with contributions from and coordination among a wide range of departments and agencies. The implementation plan took into account the 69 substantive comments in response to NTIA’s Request for Comments received from companies, industry associations, and think tanks representing a range of interests and aspects of the telecommunications ecosystem. Consistent with the National Strategy to Secure 5G, the implementation plan encompasses four lines of effort:
      • Line of Effort One: Facilitate Domestic 5G Rollout: The first line of effort establishes a new research and development initiative to develop advanced communications and networking capabilities to achieve security, resilience, safety, privacy, and coverage of 5G and beyond at an affordable cost. Advancement of United States leadership in Secure 5G and beyond systems and applications will be accomplished by enhancing centers of research and development and manufacturing. These efforts will leverage public-private partnerships spanning government, industry, academia, national laboratories, and international allies. This line of effort also intends to identify incentives and options to leverage trusted international suppliers, both to facilitate secure and competitive 5G buildouts, and to ensure the global competitiveness of United States manufacturers and suppliers.
      • Line of Effort Two: Assess Risks to & Identify Core Security Principles of 5G Infrastructure: The second line of effort is oriented toward identifying and assessing risks and vulnerabilities to 5G infrastructure, building on existing capabilities in assessing and managing supply chain risk. This work will also involve the development of criteria for trusted suppliers and the application of a vendor supply chain risk management template to enable security-conscious acquisition decision-making. Several agencies have responsibilities for assessing threats as the United States manages risks associated with the global and regional adoption of 5G network technology as well as developing mitigation strategies to combat any identified threats. These threat assessments take into account, as appropriate, requirements from entities such as the Committee on Foreign Investment in the United States (CFIUS), the Executive Order (E.O.) on Establishing the Committee for the Assessment of Foreign Participation in the United States Telecommunications Services Sector (Team Telecom), and the Federal Acquisition Security Council (FASC). In addition, this line of effort will identify security gaps in United States and international supply chains and an assessment of the global competitiveness and economic vulnerabilities of United States manufacturers and suppliers. Finally, this set of activities will include working closely with the private sector and other stakeholders to identify, develop, and apply core security principles for 5G infrastructure. These efforts will include leveraging the Enduring Security Framework (ESF), a working group under the Critical Infrastructure Partnership Advisory Council (CIPAC). These emerging security principles will be synchronized with or complementary to other 5G security principles, such as the “Prague Proposals” from the Prague 5G Security Conference held in May 2019.
      • Line of Effort Three: Address Risks to United States Economic and National Security during Development and Deployment of 5G Infrastructure Worldwide: The third line of effort involves addressing the risks to United States economic and national security during the development and deployment of 5G infrastructure worldwide. As a part of this effort, the United States will identify the incentives and policies necessary to close identified security gaps in close coordination with the private sector and through the continuous evaluation of commercial, security, and technological developments in 5G networks. A related activity is the identification of policies that can ensure the economic viability of the United States domestic industrial base, in coordination with the private sector through listening sessions and reviews of best practices. An equally important activity relates to the identification and assessment of “high risk” vendors in United States 5G infrastructure, through efforts such as the implementation of E.O. 13873, on “Securing the Information and Communications Technology and Services Supply Chain.” These efforts will build on the work of the CFIUS, the FASC, and Team Telecom reviews of certain Federal Communications Commission (FCC) licenses involving foreign ownership. This element of the implementation plan will also involve more intense engagement with the owners and operators of private sector communications infrastructure, systems equipment developers, and other critical infrastructure owners and operators. The engagements will involve sharing information on 5G and future generation wireless communications systems and infrastructure equipment. Such work will be conducted through the Network Security Information Exchange, the IT and Communications Sector and Government Coordinating Councils, the National Security Telecommunications Advisory Committee, and NTIA’s Communications Supply Chain Risk Information Partnership (C-SCRIP).
      • Line of Effort Four: Promote Responsible Global Development and Deployment of 5G: The fourth line of effort addresses the responsible global development and deployment of 5G technology. A key component of this line of effort is diplomatic outreach and engagement to advocate for the adoption and implementation of 5G security measures that prohibit the use of untrusted vendors in all parts of 5G networks. A related component involves the provision of technical assistance to mutual defense treaty allies and strategic partners of the United States to maximize the security of their 5G and future generations of wireless communications systems and infrastructure. The goal of providing financing support and technical assistance is to help enable countries and private companies to develop secure and trusted next generation networks that are free of untrusted vendors and that increase global connectivity. A key part of 5G deployment involves international standards development, thus the implementation plan outlines several steps in support of the goal of strengthening and expanding United States leadership in international standards bodies and voluntary consensus-based standards organizations, including strengthening coordination with and among the private sector. This line of effort will also include collaboration with allies and partners with regard to testing programs to ensure secure 5G and future wireless communications systems and infrastructure equipment, including spectrum-related testing. To successfully execute this work, continued close coordination between the United States Government, private sector, academic, and international government partners is required to ensure adoption of policies, standards, guidelines, and procurement strategies that reinforce 5G vendor diversity and foster market competition.
The overarching goals of this line of effort are to promote United States-led or linked technology solutions in the global market; remove and reduce regulatory and trade barriers that harm United States competitiveness; provide support for trusted vendors; and advocate for policies and laws that promote open, competitive markets for United States technology companies. This will also be supported through close collaboration with partners on options to advance the development and deployment of open interfaced, standards-based, and interoperable 5G networks.
  • The Federal Communications Commission (FCC) issued its annual “Broadband Deployment Report,” one of the last reports on FCC policy under the stewardship of former Chair Ajit Pai. In the agency’s press release, Pai claimed “[i]n just three years, the number of American consumers living in areas without access to fixed broadband at 25/3 Mbps has been nearly cut in half.” He added:
    • These successes resulted from forward-thinking policies that removed barriers to infrastructure investment and promoted competition and innovation.  I look forward to seeing the Commission continue its efforts to ensure that all Americans have broadband access.  Especially with the success of last year’s Rural Digital Opportunity Fund Phase I auction, I have no doubt that these figures will continue to improve as auction winners deploy networks in the areas for which they got FCC funding.
    • In relevant part, the FCC claimed:
      • Moreover, more than three-quarters of those in newly served areas, nearly 3.7 million, are located in rural areas, bringing the number of rural Americans in areas served by at least 25/3 Mbps to nearly 83%. Since 2016, the number of Americans living in rural areas lacking access to 25/3 Mbps service has fallen more than 46%.  As a result, the rural–urban divide is rapidly closing; the gap between the percentage of urban Americans and the percentage of rural Americans with access to 25/3 Mbps fixed broadband has been nearly halved, falling from 30 points at the end of 2016 to just 16 points at the end of 2019.
      • With regard to mobile broadband, since 2018, the number of Americans lacking access to 4G LTE mobile broadband with a median speed of 10/3 Mbps was reduced by more than 57%, including a nearly 54% decrease among rural Americans.  As of the end of 2019, the vast majority of Americans, 94%, had access to both 25/3 Mbps fixed broadband service and mobile broadband service with a median speed of 10/3 Mbps. Also as of the end of 2019, mobile providers now provide access to 5G capability to approximately 60% of Americans. These strides in mobile broadband deployment were fueled by more than $29 billion of capital expenditures in 2019 (roughly 18% of global mobile capital spending), the largest mobile broadband investment since 2015.
      • With this Report, the Commission fulfills the Congressional directive to report each year on the progress made in deploying broadband to all Americans. Despite this finding, our work to close the digital divide is not complete.  The Commission will continue its efforts to ensure that all Americans have the ability to access broadband.
  • The chair of the House Oversight and Reform Committee wrote a letter asking Federal Bureau of Investigation (FBI) Director Christopher Wray to conduct “a comprehensive investigation into the role that the social media site Parler played in the assault on the Capitol on January 6.” Chair Carolyn Maloney (D-NY) indicated her committee is also investigating the events of 6 January, suggesting there could be hearings soon on the matter. In the letter, Maloney asserted:
    • It is clear that Parler houses additional evidence critical to investigations of the attack on the Capitol. One commentator has already used geolocation data associated with Parler to track 1,200 videos that were uploaded in Washington, D.C. on January 6.
    • Questions have also been raised about Parler’s financing and its ties to Russia, which the Intelligence Community has warned is continuing to use social media and other measures to sow discord in the United States and interfere with our democracy. For example, posters on Parler have reportedly been traced back to Russian disinformation campaigns. The company was founded by John Matze shortly after he traveled in Russia with his wife, who is Russian and whose family reportedly has ties to the Russian government. Concerns about the company’s connections to Russia have grown since the company re-emerged on a Russian hosting service, DDoS-Guard, after being denied services by Amazon Web Services. DDoS-Guard has ties to the Russian government and hosts the websites of other far-right extremist groups, as well as the terrorist group Hamas. According to another recent report, “DDoS-Guard’s other clients include the Russian ministry of defence, as well as media organisations in Moscow.”
    • Given these concerns, we ask that the FBI undertake a robust review of the role played by Parler in the January 6 attacks, including (1) as a potential facilitator of planning and incitement related to the attacks, (2) as a repository of key evidence posted by users on its site, and (3) as potential conduit for foreign governments who may be financing civil unrest in the United States.
  • Microsoft released further detailed, technical findings from its investigation into the wide-ranging SolarWinds hack. Last month, Microsoft revealed that its source code had been accessed as part of the Russian hack and stressed that source code for its products had not been changed or tampered with. In its update on its SolarWinds investigation, Microsoft explained:
    • As we continue to gain deeper understanding of the Solorigate attack, we get a clearer picture of the skill level of the attackers and the extent of planning they put into pulling off one of the most sophisticated attacks in recent history. The combination of a complex attack chain and a protracted operation means that defensive solutions need to have comprehensive cross-domain visibility into attacker activity and provide months of historical data with powerful hunting tools to investigate as far back as necessary.
    • More than a month into the discovery of Solorigate, investigations continue to unearth new details that prove it is one of the most sophisticated and protracted intrusion attacks of the decade. Our continued analysis of threat data shows that the attackers behind Solorigate are skilled campaign operators who carefully planned and executed the attack, remaining elusive while maintaining persistence. These attackers appear to be knowledgeable about operations security and performing malicious activity with minimal footprint. In this blog, we’ll share new information to help better understand how the attack transpired. Our goal is to continue empowering the defender community by helping to increase their ability to hunt for the earliest artifacts of compromise and protect their networks from this threat.
    • As mentioned, in a 31 December 2020 blog posting, Microsoft revealed:
      • Our investigation has, however, revealed attempted activities beyond just the presence of malicious SolarWinds code in our environment. This activity has not put at risk the security of our services or any customer data, but we want to be transparent and share what we’re learning as we combat what we believe is a very sophisticated nation-state actor.
      • We detected unusual activity with a small number of internal accounts and upon review, we discovered one account had been used to view source code in a number of source code repositories. The account did not have permissions to modify any code or engineering systems and our investigation further confirmed no changes were made. These accounts were investigated and remediated.
  • The Trump Administration’s United States Trade Representative (USTR) weighed in on Australia’s proposed law to make Google, Facebook, and other technology companies pay for using Australian media content. The USTR reiterated the United States (U.S.) position that forcing U.S. firms to pay for content, as proposed, is unacceptable, a view that is not likely to change under a Biden Administration. The Australian Senate committee considering the “Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020” had asked for input. In relevant part, the USTR argued:
    • the U.S. Government is concerned that an attempt, through legislation, to regulate the competitive positions of specific players in a fast-evolving digital market, to the clear detriment of two U.S. firms, may result in harmful outcomes. There may also be long-lasting negative consequences for U.S. and Australian firms, as well as Australian consumers. While the revised draft has partially addressed some U.S. concerns—including an effort to move towards a more balanced evaluation of the value news businesses and platforms offer each other in the context of mandatory arbitration—significant issues remain.
  • Citizen Power Initiatives for China (CPIFC) and six unnamed California residents who use WeChat have filed suit in California state court against WeChat and its parent, Tencent. They argue that the government of the People’s Republic of China (PRC) controls WeChat and forces Tencent to turn over user data to the PRC in violation of California law. They make other allegations of unlawful conduct, including denying users in California the ability to access funds through the app in the PRC. They are seeking class action status in order to bring a larger action against the PRC company. The plaintiffs claimed:
    • This case arises from Tencent’s practices of profiting from politically motivated, pro-Chinese Communist Party (“CCP”) censorship and surveillance of California WeChat users (“challenged practices”), which includes the practice of turning over private user data and communications to the government of the People’s Republic of China (“PRC government,” and, together with the CCP, the “Party-state”), and which inflicts an array of harms. Specifically, the challenged practices include Tencent’s practices of: (i) turning over private California WeChat user data and communications to the Party-state; (ii) profiting by using California WeChat user data and communications to improve Tencent’s censorship and surveillance algorithms; (iii) censoring and surveilling California WeChat user communications for content perceived as critical of the Party-state; (iv) suspending, blocking, or deleting California WeChat user accounts and/or data over such content; and (v) prohibiting California WeChat users from withdrawing funds stored in their WeChat accounts when those users do not possess an account with a PRC financial institution subject to monitoring by the Party-state.
    • This action also challenges provisions in Tencent’s terms of service and privacy policy which, taken together, are oppressive, obfuscatory, and incoherent (“challenged provisions”). The challenged provisions include privacy-related terms that are deliberately vague and ambiguous with respect to whether the challenged practices are permitted or prohibited (“vague and ambiguous privacy provisions”), which in turn benefits Tencent by reserving to it the right to adopt self-interested interpretations. However, California WeChat users are entitled to clear, unambiguous, and testable language with respect to the nature and scope of their privacy on WeChat—in other words, to honesty and transparency.
    • Yet, even if the challenged practices were unambiguously prohibited under the challenged provisions, the challenged provisions include terms that make it practically impossible for California WeChat users to seek meaningful redress for the harms caused by those practices (“remedy-limiting provisions”). 
    • Finally, the challenged provisions include terms that impermissibly discriminate against California WeChat users who happen to be citizens of the PRC (“long-arm provisions”).
  • Representatives Anna Eshoo (D-CA) and Tom Malinowski (D-NJ) wrote the CEOs of Facebook, Twitter, and YouTube “urging the companies to address the fundamental design features of their social networks that facilitate the spread of extreme, radicalizing content to their users” per their press release. Last fall, Eshoo and Malinowski introduced the “Protecting Americans from Dangerous Algorithms Act” (H.R.8636) that would subject platforms like Facebook, Twitter, and YouTube to civil suits on the basis of the algorithms used to amplify content that violates the civil rights of others or results in international terrorism. They asserted:
    • The lawmakers note that the rioters who attacked the Capitol earlier this month were radicalized in part in digital echo chambers that these platforms designed, built, and maintained, and that the platforms are partially responsible for undermining our shared sense of objective reality, for intensifying fringe political beliefs, for facilitating connections between extremists, leading some of them to commit real-world, physical violence.
  • The United States (U.S.) Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced “[u]sing enterprise risk management best practices will be a focus for CISA in 2021, and today the National Risk Management Center (NRMC) is launching a Systemic Cyber Risk Reduction Venture to organize our work to reduce shared risk to the Nation’s security and economic security.” CISA explained that “[w]e anticipate three overarching lines of effort:
    • Build the Underlying Architecture for Cyber Risk Analysis to Critical Infrastructure. The critical infrastructure community is underpinned by a dependent web of hardware, software, services, and other connected componentry.
    • Cyber Risk Metric Development. Supporting efforts to better understand the impact of cyber risk across the critical infrastructure community will require developing usable metrics to quantify cyber risk in terms of functional loss. There’s no need to get bogged down with Greek equations with decimal place-level specificity. Metrics that provide even directional or comparative indicators are enormously helpful.
    • Promoting Tools to Address Concentrated Sources of Cyber Risk. Central to our venture to reduce systemic cyber risk is finding concentrated sources of risk that, if mitigated, provide heightened risk management bang for the buck if addressed.
  • The President’s Council of Advisors on Science and Technology (PCAST) issued its first assessment since 2015 of the government program that funds research and development of advanced information technology. PCAST explained:
    • As required by statute, PCAST is tasked with periodically reviewing the Networking and Information Technology Research and Development (NITRD) Program, the Nation’s primary source of federally funded research and development in advanced information technologies such as computing, networking, and software. This report examines the NITRD Program’s progress since the last review was conducted in 2015, explores emerging areas of interest relevant to the NITRD Program, and presents PCAST’s findings and recommendations.
    • PCAST made the following recommendations:
      • Recommendation 1: The current NITRD Program model and its approach to coordinating foundational research in NIT fields across participating agencies should continue as constituted, with the following modifications:
        • NITRD groups should continue to review the PCAs regularly using a fast track action committee (FTAC) and adjust as needed (with a frequency of perhaps every 3 years rather than every 5–6 years, as had been recommended in the 2015 NITRD Review). It should also continue to review IWGs periodically, as recommended in the 2015 NITRD Review.
        • The NITRD Program should continue to pursue incremental modifications of existing structures (e.g., IWGs, PCAs) rather than engage in wholesale reorganizations at this time.
        • When launching wholly new IWGs and PCAs (e.g., such as the AI IWG and AI PCA), the NITRD Program should consider showing clearly in the annual NITRD Supplement to the President’s Budget which lines of effort derive from previous structures and which are wholly new programmatic areas and funding lines. This will be especially important should NITRD groups increase the frequency with which they review and modify PCAs.
      • Recommendation 2: The NITRD Program should examine current structures and operations to identify opportunities for greater multi-sector engagement in its activities. Opportunities include the following:
        • Amplify multi-sector outreach and engagement efforts. While the NITRD Program notifies the public about its convening activities, it could augment its outreach.
        • Expand the NITRD Program’s efforts to track non-U.S. coordinated NIT efforts and collaborate with international efforts where appropriate. This should be done in coordination with the NSTC International S&T Coordination Subcommittee to avoid duplicating efforts.
      • Recommendation 3: The NITRD Program should examine current structures and operations to identify opportunities for improving coordination in IotF areas related to the program. Opportunities could include:
        • AI—continue coordination efforts within the NITRD Program and between NITRD IWGs and the NSTC Select Committee on AI and the Machine Learning and Artificial Intelligence (MLAI) Subcommittee.
        • Advanced communications networks—continue coordination efforts within the NITRD Program through the Subcommittee and the LSN and WSRD IWGs.
        • QIS—increase coordination with the NQCO and the NSTC QIS Subcommittee, particularly on topics such as post-quantum cryptography R&D and other implications of the development of quantum technologies on the NIT landscape with advances in QIS.
        • Biotechnology—coordinate with NSTC bodies working in biosciences-related areas such as the Biodefense R&D (BDRD) Subcommittee and the Biological Sciences Subcommittee (BSSC).
        • Advanced manufacturing—coordinate with the NSTC Subcommittee on Advanced Manufacturing and large-scale manufacturing R&D efforts such as the Manufacturing USA Institutes.
      • Recommendation 4: The NITRD Program should incorporate microelectronics R&D explicitly into its programmatic activities.
        • This could take the form of a separate IWG or of incorporating hardware/components R&D into existing IWGs.
        • There should be stronger NNI-NITRD coordination to ensure alignment of R&D strategies and programmatic activities.
      • Recommendation 5: The NITRD Program should further examine ways it can coordinate its participating agencies—such as through an IWG or other multiagency bodies—to ensure they support and emphasize the following:
        • STEM education, including PhD fellowships, in NIT.
        • Programs at the intersection and convergence of computational science and other fields (CS + X) at 2-year and 4-year educational institutions.
        • Retraining and upskilling the non-technical workforce to participate in the cyber-ready workforce.
        • A diverse and inclusive NIT workforce across all levels of technical staff, engineers, and scientists.
        • Strengthen efforts to attract and retain international students, scientists, and engineers who wish to contribute to NIT R&D in the United States. These efforts should be informed by conducting studies of the role that international talent plays in the U.S. NIT workforce and any factors affecting recent changes in recruitment and retention.

Coming Events

  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on the nomination of Gina Raimondo to be the Secretary of Commerce on 26 January.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Photoholgic on Unsplash

“Censorship, Suppression, and the 2020 Election” Hearing

A second committee gets its shot at social media platform CEOs, and much of the hearing ran like the one at the end of last month.

It was with some reluctance that I watched the Senate Judiciary Committee’s hearing with Facebook’s and Twitter’s CEOs given the other Senate hearing at which they appeared a few weeks ago. This hearing was prompted by the two platforms’ “censorship” of a dubious New York Post article on Hunter Biden’s business practices that seems to have been planted by Trump campaign associates. At first, both Facebook and Twitter restricted posting or sharing the article in different ways but ultimately relented. Their motivation and whether this was appropriate strike me as legitimate policy questions to ask. However, to criticize social media platforms for doing what is entirely within their rights under the liability shield provided by 47 U.S.C. 230 (Section 230) seems a bit much. Nonetheless, both Mark Zuckerberg and Jack Dorsey faced pointed questions from both Republicans and Democrats who profess to want to see change in social media. And yet, it remains unlikely the two parties in Congress can coalesce around broad policy changes. Perhaps targeted legislation has a chance, but it seems far too late in this Congress for that to happen.

Chair Lindsey Graham (R-SC) took an interesting approach and largely eschewed the typical Republican tactic of railing against the anti-conservative bias social media platforms allegedly have, despite little in the way of evidence to support such claims. Graham cited a handful of studies showing that social media engagement might be linked to harm to children and teenagers. This was an interesting approach given the hearing was ostensibly about censorship, content moderation, and Section 230. Perhaps Graham was using a modified version of the rationale undergirding his bill, the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020” (EARN IT Act of 2020) (S.3398) (i.e., children are at risk and are being harmed, hence Section 230 must be changed). Graham did, of course, reference the New York Post article but was equivocal as to its veracity and instead framed Twitter and Facebook’s decisions as essentially overriding the editorial choices of the newspaper. He also discussed a tweet of former United Nations Ambassador Nikki Haley that cast doubt on the legality of mail-in voting and raised the specter of fraud, to which Twitter appended a label. Graham contrasted Haley’s tweet with one from Iran’s Ayatollah that questioned why many European nations outlaw Holocaust denial but allow Mohammed to be insulted. This tweet was never fact checked or labeled. Graham suggested the Ayatollah was calling for the destruction of Israel.

Graham argued Section 230 must be changed, and he expressed hope that Republicans and Democrats could work together to do so. He wondered if social media platforms were akin to media organizations given their immense influence and, if so, perhaps they should be regulated accordingly and open to the same liability for publishing defamatory material. Graham called for changes to Section 230 that would establish incentives for social media platforms to make changes such as a more open and transparent system of content moderation, including the biases of the fact checkers. He conceded social media platforms have the almost impossible task of telling people what is reliable and what is not. Finally, he framed social media issues as health issues and compared their addictive effect and harm to cigarettes.

Senator Richard Blumenthal (D-CT) made an opening statement in place of Ranking Member Dianne Feinstein (D-CA), suggesting the possibility that the latter did not want to be associated with a hearing the former called not serious and a political sideshow. In any event, Blumenthal repeated many of his previously articulated positions on social media companies and how they are currently harming the United States (U.S.) in a number of ways. Blumenthal claimed President Donald Trump is using the megaphone of social media in ways that are harming the U.S. and detrimental to democracy. He called social media terrifying tools of persuasion with power far exceeding the Robber Barons of the last Gilded Age. Blumenthal further claimed social media companies are strip mining the personal data of people to their great profit while also promoting hate speech and voter suppression. Blumenthal acknowledged the baby steps Twitter and Facebook made in trying to address these problems but remarked parenthetically that Google was not required to appear at the hearing, an apparent reward for doing less than the other two companies to combat lies and misinformation.

Blumenthal said the hearing was not serious and was a political sideshow. Blumenthal remarked that “his colleagues” (by which he almost certainly meant Republicans) did not seem interested in foreign interference in U.S. elections and the calls for the murder of Federal Bureau of Investigation Director Christopher Wray and National Institute of Allergy and Infectious Diseases (NIAID) Director Anthony Fauci. Blumenthal said the purpose of the hearing was to bully Facebook, Twitter, and other platforms. He called for serious hearings into “Big Tech,” specifically on antitrust issues as the companies have become dominant and are abusing their power. He specifically suggested that Instagram and WhatsApp be spun off from Facebook and other companies broken up, too. Blumenthal called for strong privacy legislation to be enacted. He said “meaningful” Section 230 reform is needed, including a possible repeal of most of the liability protection, for the immunity shield is way too broad and the victims of harm deserve their day in court. Blumenthal vowed to keep working with Graham in the next Congress on the EARN IT Act, a sign perhaps that the bill is not going to get enacted before the end of the year. Graham noted, however, that next year, should the Republicans hold the Senate, Senator Chuck Grassley (R-IA), the Senate’s President Pro Tempore, would become chair. Graham expressed his hope Grassley would work on Section 230.

Facebook CEO Mark Zuckerberg again portrayed Facebook as the platform that gives everyone a voice and then pivoted to the reforms implemented to ensure the company was not a vessel for election misinformation and mischief. Zuckerberg touted Facebook’s voter registration efforts (more than 4.5 million people registered), its role in helping people volunteer at polls, and its efforts to disseminate factual information about when, where, and how Americans could vote. He turned to Facebook’s efforts to combat misinformation and voter suppression and the steps it took on election day and thereafter. Zuckerberg touted the lessons Facebook learned from the 2016 election in the form of changed policies and greater awareness of efforts by other nations to spread disinformation, lies, and chaos. Incidentally (or perhaps not so incidentally), Zuckerberg did not discuss the platform’s efforts to take on domestic efforts to undermine U.S. democracy. He did, however, reveal that Facebook is funding a “partnership with a team of independent external academics to conduct objective and empirically grounded research on social media’s impact on democracy.” Beyond remarking that Facebook hopes to learn about its role in this dynamic, he did not pledge any particular action on the basis of this study.

Zuckerberg reiterated Facebook’s positions on Section 230 reform:

I’ve also called for Congress to update Section 230 of the Communications Decency Act to make sure it’s working as intended. Section 230 allows us to provide our products and services to users by doing two things:

  • First, it encourages free expression. Without Section 230, platforms could potentially be held liable for everything people say. Platforms would likely censor more content to avoid legal risk and would be less likely to invest in technologies that enable people to express themselves in new ways.
  • Second, it allows platforms to moderate content. Without Section 230, platforms could face liability for doing even basic moderation, such as removing hate speech and harassment that impacts the safety and security of their communities.

Thanks to Section 230, people have the freedom to use the internet to express themselves, and platforms are able to more effectively address risks. Updating Section 230 is a significant decision, but we support the ideas around transparency and industry collaboration that are being discussed in some of the current bipartisan proposals, and I look forward to a meaningful dialogue about how we might update the law to deal with the problems we face today.

It’s important that any changes to the law don’t prevent new companies or businesses from being built, because innovation in the internet sector brings real benefits to billions of people around the world. We stand ready to work with Congress on what regulation could look like, whether that means Section 230 reform or providing guidance to platforms on other issues such as harmful content, privacy, elections, and data portability. By updating the rules for the internet, we can preserve what’s best about it—the freedom for people to express themselves and for entrepreneurs to build new things—while also protecting society from broader harms.

Twitter CEO Jack Dorsey explained Twitter’s content moderation policies, especially those related to the election. He stressed that Congress should build upon the foundation laid in Section 230 either through additional legislation or by helping to create private codes of conduct that social media companies would help craft and then abide by. He asserted that removing Section 230 protection or radically reducing the liability shield would not solve the problem of problematic speech on social media and would indeed cause most platforms to retrench and more severely restrict speech, an outcome at odds with what Members desire. Dorsey then trotted out the idea that carving out Section 230, as many of the bills introduced in this Congress propose to do, would create a complicated competitive landscape that would favor large incumbents with the resources to comply while all but shutting out smaller competitors. Regardless of whether this is likely to happen, it is shrewd testimony given the antitrust sentiment on Capitol Hill and in the executive branch towards large technology firms.

In terms of any concrete recommendations for Congress, Dorsey noted:

Three weeks ago, I told the Senate Committee on Commerce, Science and Transportation that I believe the best way to address our mutually-held concerns is to require the publication of moderation processes and practices, a straightforward process to appeal decisions, and best efforts around algorithmic choice, while protecting the privacy of the people who use our service. These are achievable in short order.


Photo by Prateek Katyal from Pexels

Further Reading, Other Developments, and Coming Events (16 November)

Further Reading

  • “Trump’s refusal to begin the transition could damage cybersecurity” By Joseph Marks — The Washington Post. Former executive branch officials, some of whom served at the Department of Homeland Security (DHS), are warning that the Trump Administration’s refusal to start the transition to the Biden Administration may harm the United States’ (U.S.) ability to manage cyber risks if it stretches on too long.
  • “Biden will get tougher on Russia and boost election security. Here’s what to expect.” By Joseph Marks — The Washington Post. Expect a Biden Administration to restore cybersecurity policy to the prominence it had in the Obama Administration with renewed diplomatic efforts to foster international consensus against nations like the Russian Federation or People’s Republic of China. A Biden Presidency will likely continue to pursue the Trump Administration’s larger objectives on the People’s Republic of China but without the capriciousness of the current President introducing an element of uncertainty. And, election security and funding will naturally be a focus, too.
  • “Taking Back Our Privacy” By Anna Wiener — The New Yorker. This fascinating profile of Moxie Marlinspike (yes, that’s really his name), the prime mover behind end-to-end encryption in WhatsApp and his application, Signal, (hands down the best messaging app, in my opinion), is worth your time.
  • “Biden’s Transition Team Is Stuffed With Amazon, Uber, Lyft, and Airbnb Personnel” By Edward Ongweso Jr — Vice’s Motherboard. This piece casts a critical eye on a number of members of the Biden-Harris transition team that have been instrumental in policy changes desired by their employers seemingly at odds with the President-elect’s policies. It remains to be seen how such personnel may affect policies for the new Administration.
  • “Officials say firing DHS cyber chief could make U.S. less safe as election process continues” By Joseph Marks — The Washington Post. The head of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) may well be among those purged by the Trump Administration regardless of the costs to national security. CISA Director Christopher Krebs has deftly navigated some of the most fraught, partisan territory in the Trump Administration in leading efforts on election security, but his webpage, Rumor Control, may have been too much for the White House. Consequently, Krebs is saying he expects to be fired like CISA Assistant Director Bryan Ware was this past week.

Other Developments

  • The Democratic leadership on a key committee wrote the chairs of both the Federal Trade Commission (FTC) and the Federal Communications Commission (FCC), “demanding that the two commissions stop work on all partisan or controversial items currently under consideration in light of the results of last week’s presidential election” per the press release. House Energy and Commerce Committee Chair Frank Pallone Jr. (D-NJ), Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL), and Communications and Technology Subcommittee Chair Mike Doyle (D-PA) argued that FTC Chair Joseph Simons and FCC Chair Ajit Pai should “only pursue consensus and administrative matters that are non-partisan for the remainder of your tenure.” The agencies are, of course, free to dismiss the letters and the request and may well do so, especially in the case of the FCC and its rulemaking on 47 U.S.C. 230. Additionally, as rumored, the FTC may soon file an antitrust case against Facebook for its dominance of the social messaging market when Democrats on the FTC and elsewhere might prefer a broader case.
  • The Office of Personnel Management’s (OPM) Office of the Inspector General (OIG) released a pair of audits on the agency’s information security practices and procedures and found continued weaknesses in the agency’s systems. The OPM was breached by People’s Republic of China (PRC) hackers during the Obama Administration and massive amounts of information about government employees was exfiltrated. Since that time, the OPM has struggled to mend its information security and systems.
    • In “Audit of the Information Technology Security Controls of the U.S. Office of Personnel Management’s Agency Common Controls,” the OIG explained that its “audit of the agency common controls listed in the Common Security Control Collection (CSCC) determined that:
      • Documentation assigning roles and responsibilities for the governance of the CSCC does not exist.
      • Inconsistencies in the risk assessment and reporting of deficient controls were identified in the most recent assessment results documentation of the CSCC.
      • Weaknesses identified in an assessment of the CSCC were not tracked through a plan of actions and milestones.
      • Weaknesses identified in an assessment of the CSCC were not communicated to the Information System Security Officers, System Owners or Authorizing Officials of the systems that inherit the controls.
      • We tested 56 of the 94 controls in the CSCC. Of the 56 controls tested, 29 were either partially satisfied or not satisfied. Satisfied controls are fully implemented controls according to the National Institute of Standards and Technology.”
    • And, in the annual Federal Information Security Modernization Act (FISMA) audit, the OIG found middling progress. Specifically, with respect to the FISMA IG Reporting Metrics, the OIG found:
      • Risk Management – OPM has defined an enterprise-wide risk management strategy through its risk management council. OPM is working to implement a comprehensive inventory management process for its system interconnections, hardware assets, and software.
      • Configuration Management – OPM continues to develop baseline configurations and approve standard configuration settings for its information systems. The agency is also working to establish routine audit processes to ensure that its systems maintain compliance with established configurations.
      • Identity, Credential, and Access Management (ICAM) – OPM is continuing to develop its agency ICAM strategy, and acknowledges a need to implement an ICAM program. However, OPM still does not have sufficient processes in place to manage contractors in its environment.
      • Data Protection and Privacy – OPM has implemented some controls related to data protection and privacy. However, there are still resource constraints within OPM’s Office of Privacy and Information Management that limit its effectiveness.
      • Security Training – OPM has implemented a security training strategy and program, and has performed a workforce assessment, but is still working to address gaps identified in its security training needs.
      • Information Security Continuous Monitoring – OPM has established many of the policies and procedures surrounding continuous monitoring, but the agency has not completed the implementation and enforcement of the policies. OPM also continues to struggle to conduct security controls assessments on all of its information systems.
      • Incident Response – OPM has implemented many of the required controls for incident response. Based upon our audit work, OPM has successfully implemented all of the FISMA metrics at the level of “consistently implemented” or higher.
      • Contingency Planning – OPM has not implemented several of the FISMA requirements related to contingency planning, and continues to struggle to maintain its contingency plans as well as conducting contingency plan tests on a routine basis.
  • The Australian Competition and Consumer Commission (ACCC) announced “amendments to the Consumer Data Right Rules…[that] permit the use of accredited intermediaries to collect data, through an expansion of the rules relating to outsourced service providers” per the press release. The ACCC stated “The amendments expand the Consumer Data Right system by allowing for accredited businesses to rely on other accredited businesses to collect Consumer Data Right data on their behalf, so they can provide goods and services to consumers.” The ACCC stated “[t]he Competition and Consumer (Consumer Data Right) Amendment Rules (No. 2) 2020 (Accredited Intermediary Rules) commenced on 2 October 2020 and are available on the Federal Register of Legislation.”
  • Singapore’s central bank called on financial institutions to ramp up cybersecurity because of increased threats during the COVID-19 pandemic. The Monetary Authority of Singapore (MAS)’s Cyber Security Advisory Panel (CSAP) held “its fourth annual meeting with MAS management…[and] shared its insights on cyber risks in the new operating environment and made several recommendations:”
    • Reviewing risk profiles and adequacy of risk mitigating measures. The Panel discussed the risks and vulnerabilities arising from the rapid adoption of remote access technologies and work processes that could affect FIs’ cyber risk profiles. The meeting highlighted the need for FIs to assess if their existing risk profiles have changed and remain acceptable. This is to ensure that in the long run appropriate controls are implemented to mitigate any new risks.  
    • Maintaining oversight of third-party vendors and their controls. With the increased reliance on third-party vendors, the Panel emphasised the need for FIs to step up their oversight of these counterparts and to monitor and secure remote access by third-parties to FIs’ systems. This is even more important during the COVID-19 pandemic where remote working has become pervasive.
    • Strengthening governance over the use of open-source software (OSS). Vulnerabilities in OSS are typically targeted and exploited by threat actors. The Panel recommended that FIs establish policies and procedures on the use of OSS and to ensure these codes are robustly reviewed and tested before they are deployed in the FIs’ IT environment.
  • Washington State Attorney General Bob Ferguson issued his fifth annual Data Breach Report, which “showed that the number of Washingtonians affected by breaches nearly doubled in the last year and ransomware attacks tripled,” according to his press release. Ferguson asserted:
    • The total number of Washingtonians affected by a data breach increased significantly, from 351,000 in 2019 to 651,000 in 2020. Overall, there were fewer breaches reported to the Attorney General’s Office in 2020, decreasing from 60 reported breaches last year to 51 this year.
    • Ferguson made the following recommendations:
      • 1. Bring RCW 19.255.005 and RCW 42.56.590 into alignment by making sure that private entities also have to provide notice to consumers for breaches of a consumer’s name and the last-four digits of their Social Security number.
      • SB 6187, which was signed by Governor Inslee on March 18, 2020, and went into effect on June 11, 2020 modified the definition of personal information for breaches that occur at local and state agencies. Specifically, the bill modified the definition of personal information in RCW 42.56.590 to include the last four digits of a SSN in combination with a consumer’s name as a stand alone element that will trigger the requirement for consumer notice. This change should be extended to RCW 19.255.005 as well, to bring both laws into alignment, and provide consumers with the most robust protections possible, regardless of the type of entity that was breached.
      • 2. Expand the definition of “personal information” in RCW 19.255.005 and RCW 42.56.590 to include Individual Tax Identification numbers (ITINs).
      • ITINs are assigned by the IRS to foreign-born individuals who are unable to acquire a Social Security number for the purposes of processing various tax-related documents. In other words, they are a unique identifier equivalent in sensitivity to a Social Security number. At present, ten states include ITINs in their definition of “personal information.” In 2018, Washington State was home to just over 1.1 million foreign-born individuals, representing approximately 15% of the state’s population.
      • 3. Establish a legal requirement for persons or businesses that store personal information to maintain a risk-based information security program, and to ensure that information is not retained for a period longer than is reasonably required.
      • As this report discussed last year, it is imperative that entities that handle the private information of Washingtonians take the steps necessary to keep it safe, and be prepared to act if they cannot. Such precautions are beneficial for both consumers and the organizations collecting their data. In 2019, a Ponemon Institute report indicated that 48% of the companies surveyed lacked any form of security automation – security technologies used to detect breaches more efficiently than humans can. In 2020, that number dropped by only 7%.
      • In 2019, the average cost of a data breach for companies without automation was nearly twice that for those who implemented security automation. That cost has only grown since, with data breaches in 2020 costing companies without security automation nearly triple what they cost businesses with automation. Similarly, the formation of a dedicated Incident Response Team and testing of an Incident Response Plan reduced the average total cost of breaches in 2020 by more than $2 million.
      • Requiring data collectors to maintain an appropriately sized security program and incident response team and to dispose of consumer information that is no longer needed is a critical next step in mitigating the size and cost of breaches in our state.
  • Four former Secretaries of Homeland Security and two acting Secretaries wrote to the leadership of Congress regarding “the need to consolidate and strengthen Congressional oversight of the Department of Homeland Security (DHS) in order to make possible the fundamental changes that DHS urgently needs to protect the American people from the threats we face in 2021.” They noted that “more than 90 different committees or subcommittees today have jurisdiction over DHS—far more than any other cabinet department.” They asserted:
    • DHS urgently needs to make major reforms, improvements, and enhancements to ensure the Department can protect the nation in the way Congress envisioned nearly two decades ago. DHS’s leadership, whether Democratic or Republican, needs to work with a single authorizing committee with broad subject matter authority to enact the changes and authorize the programs that DHS needs to address the threats of 2021.
  • Privacy International (PI) and 13 other groups from the European Union (EU) and Africa wrote the European Commission (EC), arguing the EU’s policies are supporting “the funding and development of projects and initiatives which threaten the right to privacy and other fundamental rights, such as freedom of expression and freedom of assembly.” These groups contended:
    • that by sponsoring such activities, the EU drives the adoption and use of surveillance technologies that, if abused by local actors, can potentially violate the fundamental rights of people residing in those countries. In the absence of rule of law and human rights safeguards enshrined in law, which seek to limit the state’s powers and protect people’s rights, these technologies can be exploited by authorities and other actors with access and result in onerous implications not just for the rights of privacy and data protection but also for other rights, such as freedom of expression and freedom of assembly.
    • In their press release, these groups stated the letter “comes following the public release of hundreds of documents obtained by PI after a year of negotiating with EU bodies under access to documents laws, which show:
      • How police and security agencies in Africa and the Balkans are trained with the EU’s support in spying on internet and social media users and using controversial surveillance techniques and tools; Read PI’s report here.
      • How EU bodies are training and equipping border and migration authorities in non-member countries with surveillance tools, including wiretapping systems and other phone surveillance tools, in a bid to ‘outsource’ the EU’s border controls; Read PI’s report here.
      • How Civipol, a well-connected French security company, is developing mass biometric systems with EU aid funds in Western Africa in order to stop migration and facilitate deportations without adequate risk assessments. Read PI’s report here.
    • They stated “we call on the European Commission, in coordination with the European Parliament and EU member states to:
      • Ensure no support is provided for surveillance or identity systems across external assistance funds and instruments to third countries that lack a clear and effective legal framework governing the use of the surveillance equipment or techniques.
      • Only provide support for surveillance or identity systems after an adequate risk assessment and due diligence are carried out.
      • Provide Parliament greater capabilities of scrutiny and ensuring accountability over funds.
      • All future projects aimed at addressing “the root causes of instability, forced displacement, and irregular migration” should be mainstreamed into the NDICI. In turn, discontinue the EUTF for Africa when the current fund comes to its end in 2020.
      • Ensure that EC Directorate-General for International Cooperation and Development (DEVCO), the EU body in charge of development aid, establishes a new Fund aimed at improving governance and legal frameworks in non-EU countries to promote the right to privacy and data protection. Priorities of the Fund should include:
        • Revising existing privacy and data protection legal frameworks, or where there is none developing new ones, that regulate surveillance by police and intelligence agencies, aimed at ensuring they are robust, effectively implemented, and provide adequate redress for individuals;
        • Strengthening laws or introducing new ones that set out clear guidelines within which the government authorities may conduct surveillance activities;
        • Focusing on promotion and strengthening of democratisation and human rights protections;
        • Strengthening the independence of key monitoring institutions, such as the judiciary, to ensure compliance with human rights standards.

Coming Events

  • On 17 November, the Senate Judiciary Committee will hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • The Senate Homeland Security and Governmental Affairs Committee’s Regulatory Affairs and Federal Management Subcommittee will hold a hearing on how to modernize telework in light of what was learned during the COVID-19 pandemic on 18 November.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by cottonbro from Pexels

Further Reading, Other Developments, and Coming Events (11 November)

Further Reading

  • “ICE, IRS Explored Using Hacking Tools, New Documents Show” By Joseph Cox — Vice. Federal agencies other than the Federal Bureau of Investigation (FBI) and the Intelligence Community (IC) appear to be interested in utilizing some of the capabilities offered by the private sector to access devices or networks in the name of investigating cases.
  • “China’s tech industry relieved by Biden win – but not relaxed” By Josh Horwitz and Yingzhi Yang — Reuters. While a Biden Administration will almost certainly lower the temperature between Beijing and Washington, the People’s Republic of China is intent on addressing the pressure points used by the Trump Administration to inflict pain on its technology industry.
  • “Trump Broke the Internet. Can Joe Biden Fix It?” By Gilad Edelman — WIRED. This piece provides a view of the waterfront in technology policy under a Biden Administration.
  • “YouTube is awash with election misinformation — and it isn’t taking it down” By Rebecca Heilweil — Recode. YouTube seems to have avoided the scrutiny facing Facebook and Twitter over their content moderation policies. Why it has escaped that scrutiny is not clear, but the Google-owned platform hosted much more election-related misinformation than the other social media platforms.
  • “Frustrated by internet service providers, cities and schools push for more data” By Cyrus Farivar — NBC News. Internet service providers are not helping cities and states identify families eligible for low-cost internet to help children attend school virtually. They have claimed these data are proprietary, so jurisdictions have gotten creative about identifying such families.

Other Developments

  • The Consumer Product Safety Commission’s (CPSC) Office of the Inspector General (OIG) released its annual Federal Information Security Modernization Act (FISMA) audit and found “that although management continues to make progress in implementing the FISMA requirements much work remains to be done.” More particularly, the audit “determined that the CPSC has not implemented an effective information security program and practices in accordance with FISMA requirements.” The OIG asserted:
    • The CPSC information security program was not effective because the CPSC has not developed a holistic formal approach to manage information security risks or to effectively utilize information security resources to address previously identified information security deficiencies. Although the CPSC has begun to develop an Enterprise Risk Management (ERM) program to guide risk management practices at the CPSC, explicit guidance and processes to address information security risks and integrate those risks into the broader agency-wide ERM program has not been developed.
    • In addition, the CPSC has not leveraged the relevant information security risk management guidance prescribed by NIST to develop an approach to manage information security risk.
    • Further, as asserted by CPSC personnel, the CPSC has limited resources to operate the information security program and to address the extensive FISMA requirements and related complex cybersecurity challenges.
    • Therefore, the CPSC has not dedicated the resources necessary to fully address these challenges and requirements. The CPSC began addressing previously identified information security deficiencies but was not able to address all deficiencies in FY 2020.
  • The United States (U.S.) Department of Justice (DOJ) announced the seizure of 27 websites allegedly used by Iran’s Islamic Revolutionary Guard Corps (IRGC) “to further a global covert influence campaign…in violation of U.S. sanctions targeting both the Government of Iran and the IRGC.” The DOJ contended:
    • Four of the domains purported to be genuine news outlets but were actually controlled by the IRGC and targeted audiences in the United States, to covertly influence United States policy and public opinion, in violation of the Foreign Agents Registration Act (FARA). The remainder targeted audiences in other parts of the world.  This seizure warrant follows an earlier seizure of 92 domains used by the IRGC for similar purposes.
  • The United Nations (UN) Special Rapporteur on the right to privacy Joseph Cannataci issued his annual report that “constitutes a preliminary assessment as the evidence base required to reach definitive conclusions on whether privacy-intrusive, anti-COVID-19 measures are necessary and proportionate in a democratic society is not yet available.” Cannataci added “[a] more definitive report is planned for mid-2021, when 16 months of evidence will be available to allow a more accurate assessment.” He “addresse[d] two particular aspects of the impact of COVID-19 on the right to privacy: data protection and surveillance.” The Special Rapporteur noted:
    • While the COVID-19 pandemic has generated much debate about the value of contact tracing and reliance upon technologies that track citizens and those they encounter, the use of information and technology is not new in managing public health emergencies. What is concerning in some States are reports of how technology is being used and the degree of intrusion and control being exerted over citizens – possibly to little public health effect.
    • The Special Rapporteur concluded:
      • It is far too early to assess definitively whether some COVID-19-related measures might be unnecessary or disproportionate. The Special Rapporteur will continue to monitor the impact of surveillance in epidemiology on the right to privacy and report to the General Assembly in 2021. The main privacy risk lies in the use of non-consensual methods, such as those outlined in the section on hybrid systems of surveillance, which could result in function creep and be used for other purposes that may be privacy intrusive.
      • Intensive and omnipresent technological surveillance is not the panacea for pandemic situations such as COVID-19. This has been especially driven home by those countries in which the use of conventional contact-tracing methods, without recourse to smartphone applications, geolocation or other technologies, has proven to be most effective in countering the spread of COVID-19.
      • If a State decides that technological surveillance is necessary as a response to the global COVID-19 pandemic, it must make sure that, after proving both the necessity and proportionality of the specific measure, it has a law that explicitly provides for such surveillance measures (as in the example of Israel).
      • A State wishing to introduce a surveillance measure for COVID-19 purposes, should not be able to rely on a generic provision in law, such as one stating that the head of the public health authority may “order such other action be taken as he [or she] may consider appropriate”. That does not provide explicit and specific safeguards which are made mandatory both under the provisions of Convention 108 and Convention 108+, and based on the jurisprudence of the European Court of Human Rights. Indeed, if the safeguard is not spelled out in sufficient detail, it cannot be considered an adequate safeguard.
  • The University of Toronto’s Citizen Lab issued its submission to the Government of Canada’s “public consultation on the renewal of its Responsible Business Conduct (RBC) strategy, which is intended to provide guidance to the Government of Canada and Canadian companies active abroad with respect to their business activities.” Citizen Lab addressed “Canadian technology companies and the threat they pose to human rights abroad” and noted two of its reports on Canadian companies whose technologies were used to violate human rights:
    • In 2018, the Citizen Lab released a report documenting Netsweeper installations on public IP networks in ten countries that each presented widespread human rights concerns. This research revealed that Netsweeper technology was used to block: (1) political content sites, including websites linked to political groups, opposition groups, local and foreign news, and regional human rights issues in Bahrain, Kuwait, Yemen, and UAE; (2) LGBTQ content as a result of Netsweeper’s pre-defined ‘Alternative Lifestyles’ content category, as well as Google searches for keywords relating to LGBTQ content (e.g., the words “gay” or “lesbian”) in the UAE, Bahrain, and Yemen; (3) non-pornographic websites under the mis-categorization of sites like the World Health Organization and the Center for Health and Gender Equity as “pornography”; (4) access to news reporting on the Rohingya refugee crisis and violence against Muslims from multiple news outlets for users in India; (5) Blogspot-hosted websites in Kuwait by categorizing them as “viruses” as well as a range of political content from local and foreign news and a website that monitors human rights issues in the region; and (6) websites like Date.com, Gay.com (the Los Angeles LGBT Center), Feminist.org, and others through categorizing them as “web proxies.” 
    • In 2018, the Citizen Lab released a report documenting the use of Sandvine/Procera devices to redirect users in Turkey and Syria to spyware, as well as the use of such devices to hijack the Internet users’ connections in Egypt, redirecting them to revenue-generating content. These examples highlight some of the ways in which this technology can be used for malicious purposes. The report revealed how Citizen Lab researchers identified a series of devices on the networks of Türk Telekom—a large and previously state-owned ISP in Turkey—being used to redirect requests from users in Turkey and Syria who attempted to download certain common Windows applications like antivirus software and web browsers. Through the use of Sandvine/Procera technology, these users were instead redirected to versions of those applications that contained hidden malware. 
    • Citizen Lab made a number of recommendations:
      • Reform Canadian export law:  
        • Clarify that all Canadian exports are subject to the mandatory analysis set out in section 7.3(1) and section 7.4 of the Export and Import Permits Act (EIPA). 
        • Amend section 3(1) the EIPA such that the human rights risks of an exported good or technology provide an explicit basis for export control.
        • Amend the EIPA to include a ‘catch-all’ provision that subjects cyber-surveillance technology to export control, even if not listed on the Export Control List, when there is evidence that the end-use may be connected with internal repression and/or the commission of serious violations of international human rights or international humanitarian law. 
      • Implement mandatory human rights due diligence legislation:
        • Similar to the French duty of vigilance law, impose a human rights due diligence requirement on businesses such that they are required to perform human rights risk assessments, develop mitigation strategies, implement an alert system, and develop a monitoring and public reporting scheme. 
        • Ensure that the mandatory human rights due diligence legislation provides a statutory mechanism for liability where a company fails to conform with the requirements under the law. 
      • Expand and strengthen the Canadian Ombudsperson for Responsible Enterprise (CORE): 
        • Expand the CORE’s mandate to cover technology sector businesses operating abroad.
        • Expand the CORE’s investigatory mandate to include the power to compel companies and executives to produce testimony, documents, and other information for the purposes of joint and independent fact-finding.
        • Strengthen the CORE’s powers to hold companies to account for human rights violations abroad, including the power to impose fines and penalties and to impose mandatory orders.
        • Expand the CORE’s mandate to assist victims to obtain legal redress for human rights abuses. This could include the CORE helping enforce mandatory human rights due diligence requirements, imposing penalties and/or additional statutory mechanisms for redress when requirements are violated.
        • Increase the CORE’s budgetary allocations to ensure that it can carry out its mandate.
  • A week before the United States’ (U.S.) election, the White House’s Office of Science and Technology Policy (OSTP) issued a report titled “Advancing America’s Global Leadership in Science and Technology: Highlights from the Trump Administration’s First Term: 2017-2020,” which touts the Administration’s purported achievements. OSTP claimed:
    • Over the past four years, President Trump and the entire Administration have taken decisive action to help the Federal Government do its part in advancing America’s global science and technology (S&T) preeminence. The policies enacted and investments made by the Administration have equipped researchers, health professionals, and many others with the tools to tackle today’s challenges, such as the COVID-19 pandemic, and have prepared the Nation for whatever the future holds.

Coming Events

  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.


Photo by Brett Sayles from Pexels

Further Reading, Other Developments, and Coming Events (4 November)

Further Reading

  • “U.S. Cyber Command Expands Operations to Hunt Hackers From Russia, Iran and China” By Julian Barnes — The New York Times. The United States (U.S.) agency charged with offensive cyber operations sent teams around the world to undisclosed locations to work with partner nations to foil Russian, Chinese, and Iranian efforts to disrupt the U.S. election. It appears this exercise is more about building relations with partners in key regions and having personnel see first-hand the effect of constant cyber attacks, especially in regions targeted by the Russian Federation, than about the rationale offered by Cyber Command that “hunting forward” puts its people closer to the action. Considering this is cyberspace, does it really matter where personnel are?
  • “U.S. undertook cyber operation against Iran as part of effort to secure the 2020 election” By Ellen Nakashima — The Washington Post. United States (U.S.) Cyber Command is out setting a narrative about how effective its operations against nations like Iran have been in protecting the election. Of course, one cannot prove this easily, so it is perhaps an open question as to the effectiveness of U.S. efforts. Nonetheless, this uncharacteristic openness may be on account of successful operations to foil and fend off efforts to disrupt the election, and it certainly reflects the U.S. security services’ desire to avoid 2016’s mistake of not going public with information so Americans would understand what is happening.
  • “Europe and the US are drifting apart on tech. Joe Biden wouldn’t fix that.” By Nicholas Vincour — Politico EU. This rundown of the significant policy differences suggests the United States (U.S.) and the European Union (EU) will be at odds on major tech issues even under a Biden Administration that one can safely assume will return the U.S. to closer relations with the EU. Most of these differences transcend personality, however, suggesting structural and systemic reasons, which foretell continued friction.
  • What Big Tech has to gain—and lose—from a Biden presidency” By Mark Sullivan — Fast Company. This piece lays out how a Biden Administration might continue and discontinue Trump Administration policy if Joe Biden prevails in the election. One aspect this piece glosses over, however, is how the composition of Congress would inform a Biden Administration’s capability to achieve its policy goals on tech.
  • “Robocalls Told at Least 800,000 Swing State Residents to “Stay Home” on Election Day. The FBI Is Investigating.” By Jack Gillum and Jeremy B. Merrill — ProPublica. Robocalls to more than 3 million people were made yesterday, urging them to stay home and stay safe. This is akin to voter suppression tactics that have been used for decades in the United States, but it is unlikely the culprit or true motive (if it was not intended as suppression) will ever be discovered given the ease of use, scale, and anonymity spoofing provides.

Other Developments

  • Australia’s Department of Home Affairs (Department) released for comment “Critical Technology Supply Chain Principles (the Principles)” that “are intended to assist organisations – including governments and businesses of all sizes – in making decisions about their suppliers.” The Department stated that “[t]he Principles also complement the Protecting Critical Infrastructure and Systems of National Significance reforms…[and] [t]ogether, these measures will help protect the supply of essential services that all Australians rely on.”
    • The Department stated:
      • Supply chains for critical technologies in Australia must be more resilient. Australia’s COVID-19 experience highlights the vulnerabilities of supply chains for products essential to the country. At the same time, the global technological landscape is evolving at an unprecedented pace and geostrategic competition is affecting how critical technologies are being developed and used.
      • The more dependent society becomes on technology, the less governments and organisations can rely on traditional habits and decision-making frameworks when it comes to their supply chains. Improving the management of critical technology supply chains specifically, across the economy, will help build Australia’s resilience to future shocks, as well as address the inherent risks to our nation’s national security, economic prosperity and social cohesion. Advances in technology underpin our future prosperity; however, they also expose our nation to more risks. Malicious actors can use critical technologies to harm our national security and undermine our democracy. One way to address these risks is to consider the supply chains of critical technologies, and how these could be made more secure. Understanding the risks is the first step towards organisations of all sizes taking action to create diverse, trusted and secure supply chains.
      • That’s why the Australian Government is developing the Critical Technology Supply Chain Principles. These Principles will be non-binding and voluntary, and are intended to act as a tool to assist governments and businesses in making decisions about their suppliers and transparency of their own products. The Principles will help Australian business consider the unforeseen risks when developing critical technologies, building business resilience. The suggested Principles will be grouped under three pillars: security-by-design, transparency, and autonomy and integrity. The suggested Principles below align with guidance provided by the Australian Signals Directorate’s Australian Cyber Security Centre on supply chain risk management.
    • The Department provided an overview of the conceptual framework of the document:
      • Security should be a core component of critical technologies. Organisations should ensure they are making decisions that build in security from the ground-up.
        • 1. Understand what needs to be protected and why.
        • 2. Understand the security risks posed by your supply chain.
        • 3. Build security considerations into contracting processes that are proportionate to the level of risk (and encourage suppliers to do the same).
        • 4. Raise awareness of security within your supply chain.
      • Transparency of technology supply chains is critical, both from a business perspective and a national security perspective.
        • 5. Know who suppliers are and build an understanding of security measures.
        • 6. Set and communicate minimum transparency requirements consistent with existing standards and international benchmarks for your suppliers and encourage continuous improvement.
        • 7. Encourage suppliers to understand their supply chains, and be able to provide this information to consumers.
      • Knowing that your suppliers demonstrate integrity and are acting autonomously is fundamental to securing your supply chain.
        • 8. Consider the influence of foreign governments on suppliers and seek to ensure they operate with appropriate levels of autonomy.
        • 9. Consider if suppliers operate ethically, with integrity, and consistently with their human rights responsibilities.
        • 10. Build trusted, strategic relationships with suppliers.
  • The United States’ (U.S.) Department of Justice (DOJ) announced that a member of a $100 million botnet conspiracy was sentenced to eight years in prison “for his role in operating a sophisticated scheme to steal and traffic sensitive personal and financial information in the online criminal underground.” The DOJ stated:
    • Aleksandr Brovko, 36, formerly of the Czech Republic, pleaded guilty in February to conspiracy to commit bank and wire fraud. According to court documents, Brovko was an active member of several elite, online forums designed for Russian-speaking cybercriminals to gather and exchange their criminal tools and services. 
    • As reflected in court documents, from 2007 through 2019, Brovko worked closely with other cybercriminals to monetize vast troves of data that had been stolen by “botnets,” or networks of infected computers.  Brovko, in particular, wrote software scripts to parse botnet logs and performed extensive manual searches of the data in order to extract easily monetized information, such as personally identifiable information and online banking credentials.  Brovko also verified the validity of stolen account credentials, and even assessed whether compromised financial accounts had enough funds to make it worthwhile to attempt to use the accounts to conduct fraudulent transactions. 
    • According to court documents, Brovko possessed and trafficked over 200,000 unauthorized access devices during the course of the conspiracy. These access devices consisted of either personally identifying information or financial account details. Under the U.S. Sentencing Guidelines, the estimated intended loss in this case has been calculated as exceeding $100 million.
  • The Office of the Privacy Commissioner of Canada (OPC), Office of the Information and Privacy Commissioner of Alberta (OIPC AB) and the Office of the Information and Privacy Commissioner for British Columbia (OIPC BC) found that “Cadillac Fairview – one of North America’s largest commercial real estate companies – embedded cameras inside their digital information kiosks at 12 shopping malls across Canada and used facial recognition technology without their customers’ knowledge or consent.”  The Commissioners asserted:
    • The goal, the company said, was to analyze the age and gender of shoppers and not to identify individuals. Cadillac Fairview also asserted that shoppers were made aware of the activity via decals it had placed on shopping mall entry doors that referred to its privacy policy – a measure the Commissioners determined was insufficient.
    • Cadillac Fairview also asserted that it was not collecting personal information, since the images taken by camera were briefly analyzed then deleted. However, the Commissioners found that Cadillac Fairview did collect personal information, and contravened privacy laws by failing to obtain meaningful consent as they collected the 5 million images with small, inconspicuous cameras. Cadillac Fairview also used video analytics to collect and analyze sensitive biometric information of customers.
    • The investigation also found that:
      • Facial recognition software was used to generate additional personal information about individual shoppers, including estimated age and gender.
      • While the images were deleted, investigators found that the sensitive biometric information generated from the images was being stored in a centralized database by a third party.
      • Cadillac Fairview stated that it was unaware that the database of biometric information existed, which compounded the risk of potential use by unauthorized parties or, in the case of a data breach, by malicious actors.
  • The United States (U.S.) Department of Defense (DOD) published its “DOD Electromagnetic Spectrum Superiority Strategy” the purpose of which “is to align DOD electromagnetic spectrum (EMS) activities with the objectives of the 2017 National Security Strategy, the 2018 National Defense Strategy, and national economic and technology policy goals.” The DOD stated:
    • This Strategy embraces the enterprise approach required to ensure EMS superiority by integrating efforts to enhance near-term and long-term EMS capabilities, activities, and operations. The Strategy informs the Department’s domestic EMS access policies and reinforces the need to develop cooperative frameworks with other EMS stakeholders in order to advance shared national policy goals. The traditional functions of Electromagnetic Spectrum Management (EMSM) and Electromagnetic Warfare (EW)—integrated as Electromagnetic Spectrum Operations (EMSO)—are addressed within the document’s strategic goals. This 2020 Strategy builds upon the successes of and supersedes both the DOD’s 2013 EMS Strategy and 2017 EW Strategy.
    • The DOD concluded:
      • DOD faces rapidly increasing challenges to its historical EMS dominance due in part to increasingly complex electromagnetic operating environments (EMOEs). Threats to DOD capabilities due to EMS vulnerabilities have become increasingly sophisticated and easily attainable. Commercial technology advancements are proliferating wireless devices and services that are eroding DOD’s freedom of action in the EMS. At the same time, the U.S. military has increasing spectrum requirements for the operations, testing, and training of advanced warfighting capabilities. Finally, DOD must exploit near-peer adversaries’ EMS vulnerabilities through advanced EW to offset their capacity overmatch.
      • To cope with these challenges and achieve the vision of Freedom of Action in the Electromagnetic Spectrum, the DOD will actively pursue the areas outlined herein. DOD will enhance the ability to plan, sense, manage, and control military operations with advanced EMS technologies to ensure EMS superiority. The Department will also proactively engage with spectrum policymakers and partners to ensure spectrum policies support U.S. capability requirements. DOD will perform the governance functions needed to ensure our efforts are aligned and coordinated to maximize the results of our efforts.
      • The NDS directs the Department to “determine an approach to enhancing the lethality of the joint force against high end competitors and the effectiveness of our military against a broad spectrum of potential threats.” Realization of the NDS requires DOD to actualize the vision of this DOD EMS Superiority Strategy by implementing its goals and objectives through an empowered EMS enterprise. Advancing how DOD conducts operations in the EMS, and generates EMS superiority, will be critical to the success of all future missions for the United States, its allies, and partners.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that these tech companies, as measured by resource allocation, are more worried about problematic domestic content.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of removed or suspended Russian accounts and now claim that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters toward protestors in Lafayette Park in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little-known facial recognition platform, never publicly disclosed but available to many federal, state, and local law enforcement agencies in the capital area, matched the footage to a suspect. Now a man stands accused of crimes during a Black Lives Matter march, and civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds: the demonstrated weakness of these systems in accurately identifying people of color, the fact that it has been used but not disclosed to a number of defendants, and the potential chilling effect it will have on people attending protests. Law enforcement officials claim there are strict privacy and process safeguards and that an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and many also praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President antithetical to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle, as it will be in uncharted territory trying to tamp down fear about unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even when it breaks the platform’s rules. Current and former employees and an analysis support this finding. The Trump family has been among those benefitting from the kid gloves the company uses for posts that would likely have brought consequences for other users. Smaller conservative outlets and less prominent conservative figures, however, have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager made a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues that the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
  • California Governor Gavin Newsom (D) last week vetoed one of the bills sent to him to amend the “California Consumer Privacy Act” (CCPA) (AB 375). In mid-October, he signed two bills that amend the CCPA, one of which will take effect only if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. Moreover, if the CPRA is enacted via ballot, the two signed statutes would likely become dead law, as the CCPA and its amendments would become moot once the CPRA takes effect in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281 that would extend the carveout for employers to comply with the CCPA from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report, requested by the chair of a House committee, on a Federal Communications Commission (FCC) program that subsidizes broadband providers serving high-cost areas, typically rural or hard-to-reach places. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks the data to determine whether the goals of the program are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC allocates the funds and sets conditions on their use for broadband.
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO recommended that
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximillian Schrems for litigating and winning the case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). It has been estimated that Schrems’ legal costs for pursuing his complaint against Facebook could total between €2 and €5 million, the low end of which would represent 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to allow half of the Safety Band (i.e. 5.9 GHz spectrum) to be used for wireless communications even though the United States (U.S.) Department of Transportation weighed in against this decision. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following:
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the case before the Supreme Court of the United States that will determine the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who took money as part of a sting operation to illegally use his access to Georgia’s database of license plates to obtain information about a person. The Eleventh Circuit Court of Appeals denied his appeal of his conviction under the CFAA per a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Section 230 Hearing Almost Devoid Of Discussion About Section 230

The Section 230 hearing was largely political theater.

The Senate Commerce, Science, and Transportation Committee held its long-awaited hearing ostensibly on 47 U.S.C. 230 (Section 230) with the CEOs of Facebook, Google, and Twitter. I suppose the title of the hearing should have told us all we need to know about the approach of the Republican majority: “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” And, oddly enough, there are likely areas where Republicans can agree with Democrats in terms of less desirable outcomes flowing perhaps from Section 230 immunity. For example, The New York Times and other outlets have highlighted how poorly technology platforms do at identifying and taking down child pornography or non-consensual pornography, and I would think tech’s staunchest supporters would concede there is room for improvement. However, this hearing seemed conceived and executed to perpetuate the Republican narrative that technology companies are biased against them and their content. And, to amplify this message, Republican Senators crafted novel arguments (e.g. Senator Mike Lee (R-UT) claiming that a platform labeling a false or misleading statement is censorship) or all but yelled at the CEOs (e.g. Senator Ted Cruz (R-TX) positively shouting at Twitter head Jack Dorsey).

Chair Roger Wicker (R-MS) again propounded the position that technology companies should not be able to moderate, correct, label, or block political content, especially conservative material. In essence, Republicans seem to be making the case that Twitter, Facebook, Google, and others have become the de facto public square for 21st Century America, and just as a person marching with a sign in an actual town cannot be stopped, so, too, should it be online. This argument conveniently ignores the long-established fact that the First Amendment applies to government regulation or suppression of speech and largely leaves the moderation decisions of private companies untouched. Also, Republicans are taking the paradoxical position that the government should be able to dictate or bully private companies into complying with their desired policy outcome even as they purport to favor free market economics. It is also telling that Wicker only wants to change Section 230 and not do away with it entirely. A cynic might observe that so long as the social media platforms are giving conservatives the treatment they want, the extensively documented abuse and harassment that women and people of color face online does not seem important enough to address. Moreover, Wicker had little to say about the tide of lies, misinformation, and disinformation flooding the online world. Finally, Wicker relied only on anecdotal evidence that conservatives and Republicans are somehow being muted or silenced at a greater rate than liberals and Democrats for the very good reason that no evidence from reputable research supports this argument. The data we have show conservative material flourishing online.

In his opening statement, Wicker claimed:

  • We have convened this morning to continue the work of this Committee to ensure that the internet remains a free and open space, and that the laws that govern it are sufficiently up to date. The internet is a great American success story, thanks in large part to the regulatory and legal structure our government put in place. But we cannot take that success for granted. The openness and freedom of the internet are under attack.
  • For almost 25 years, the preservation of internet freedom has been the hallmark of a thriving digital economy in the United States. This success has largely been attributed to a light-touch regulatory framework and to Section 230 of the Communications Decency Act – often referred to as the “26 words that created the internet.”
  • There is little dispute that Section 230 played a critical role in the early development and growth of online platforms.  Section 230 gave content providers protection from liability to remove and moderate content that they or their users consider to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” This liability shield has been pivotal in protecting online platforms from endless and potentially ruinous lawsuits. But it has also given these internet platforms the ability to control, stifle, and even censor content in whatever manner meets their respective “standards.”   The time has come for that free pass to end.
  • After 24 years of Section 230 being the law of the land, much has changed. The internet is no longer an emerging technology. The companies before us today are no longer scrappy startups operating out of a garage or a dorm room. They are now among the world’s largest corporations, wielding immense power in our economy, culture, and public discourse – immense power. The applications they have created are connecting the world in unprecedented ways, far beyond what lawmakers could have imagined three decades ago. These companies are controlling the overwhelming flow of news and information that the public can share and access. 
  • One noteworthy example occurred just two weeks ago after our subpoenas were unanimously approved; the New York Post – the country’s fourth largest newspaper – ran a story revealing communications between Hunter Biden and a Ukrainian official. The report alleged that Hunter Biden facilitated a meeting with his father, Joe Biden, who was then the Vice President of the United States. Almost immediately, both Twitter and Facebook took steps to block or limit access to the story. Facebook, according to its Policy Communications Manager, began “reducing its distribution on [the] platform” pending a third-party fact check.  Twitter went beyond that, blocking all users — including the House Judiciary Committee — from sharing the article on feeds and through direct messages. Twitter even locked the New York Post’s account entirely, claiming the story included “hacked materials” and was “potentially harmful.”
  • It is worth noting that both Twitter and Facebook’s aversion to hacked materials has not always been so stringent. For example, when the President’s tax returns were illegally leaked, neither company acted to restrict access to that information. Similarly, the now-discredited Steele dossier was widely shared without fact checking or disclaimers. This apparent double standard would be appalling under normal circumstances. But the fact that selective censorship is occurring in the midst of the 2020 election cycle dramatically amplifies the power wielded by Facebook and Twitter.
  • Google recently generated its own controversy when it was revealed that the company threatened to cut off several conservative websites, including the Federalist, from their ad platform. Make no mistake, for sites that rely heavily on advertising revenue for their bottom line, being blocked from Google’s services – or “demonetized” – can be a death sentence.
  • According to Google, the offense of these websites was hosting user-submitted comment sections that included objectionable content. But Google’s own platform, YouTube, hosts user-submitted comment sections for every video uploaded. It seems that Google is far more zealous in policing conservative sites than its own YouTube platform for the same types of offensive and outrageous language.
  • It is ironic that, when the subject is net neutrality, technology companies, including Facebook, Google, and Twitter, have warned about the grave threat of blocking or throttling the flow of information on the internet. Meanwhile, these same companies are actively blocking and throttling the distribution of content on their own platforms and are using protections under Section 230 to do it. Is it any surprise that voices on the right are complaining about hypocrisy or, even worse, anti-democratic election interference?
  • These recent incidents are only the latest in a long trail of censorship and suppression of conservative voices on the internet. Reasonable observers are left to wonder whether big tech firms are obstructing the flow of information to benefit one political ideology or agenda.
  • My concern is that these platforms have become powerful arbiters of what is true and what content users can access. The American public gets little insight into the decision-making process when content is moderated, and users have little recourse when they are censored or restricted. I hope we can all agree that the issues the Committee will discuss today are ripe for thorough examination and action. 
  • I have introduced legislation to clarify the intent of Section 230’s liability protections and increase the accountability of companies who engage in content moderation. The “Online Freedom and Viewpoint Diversity Act” would make important changes to “right-size” the liability shield and make clear what type of content moderation is protected. This legislation would address the challenges we have discussed while still leaving fundamentals of Section 230 in place.
  • Although some of my colleagues on the other side of the aisle have characterized this as a purely partisan exercise, there is strong bipartisan support for reviewing Section 230. In fact, both presidential candidates Trump and Biden have proposed repealing Section 230 in its entirety – a position I have not yet embraced. I hope we can focus today’s discussion on the issues that affect all Americans. Protecting a true diversity of viewpoints and free discourse is central to our way of life. I look forward to hearing from today’s witnesses about what they are doing to promote transparency, accountability, and fairness in their content moderation processes. And I thank each of them for cooperating with us in the scheduling of this testimony.

Ranking Member Maria Cantwell (D-WA) stayed largely in the mainstream of Democratic thought and policy on Section 230. She opened the aperture on technology issues and spotlighted the problems she sees, including the effect that declining advertising revenue has had on the U.S. media and the growing dominance Facebook and Google have in online advertising. This is not surprising since she released a report on this very subject the day before. Cantwell discussed at some length Russian election interference, a subject tangentially related to Section 230. Perhaps she was hinting that technology companies should be charged with finding and removing the types of misinformation foreign governments and malign actors are spreading to wreak havoc in the U.S. If so, she did not hit this point too hard. Rather, her recitation of election interference was intended to put Republicans on their back foot, for if the subject of the hearing turned to Russian disinformation and related efforts, they might have to break ranks with the White House and President Donald Trump on the threat posed by Russia. Cantwell also went off topic a bit by obliquely discussing statements made by Trump and others about the security and legality of mail-in voting. She suggested without being specific that there may be means of bolstering Section 230 to drive platforms to take down disinformation and harmful material more expeditiously. Cantwell also poked Wicker by noting that the print media were not being subpoenaed to testify on why they largely ignored the New York Post’s questionable Hunter Biden article.

Cantwell asserted:

  • So these issues about how we harness the information age to work for us, and not against us, is something that we deal with every day of the week, and we want to have discussion and discourse. I believe that discussion and discourse today should be broader than just 230. There are issues of privacy that our committee has addressed and issues of how to make sure there is a free and competitive news market.
  • I noticed today we’re not calling in the NAB or the Publishers Association asking them why they haven’t printed or reprinted information that you alluded to in your testimony that you wish was more broadly distributed. To have the competition in the news market is to have a diversity of voices and diversity of opinion, and in my report, just recently released, we show that true competition really does help perfect information, both for our economy, and for the health of our democracy. So I do look forward to discussing these issues today. What I do not want today’s hearing to be is a chilling effect on the very important aspects of making sure that hate speech or misinformation related to health and public safety, are allowed to remain on the internet.
  • We all know what happened in 2016, and we had reports from the FBI, our intelligence agencies, and a bipartisan Senate committee that concluded in 2016, that Russian operatives did, masquerading as Americans, use targeted advertisements, intentionally falsified news articles, self generated content and social media platform tools to interact and attempt to deceive tens of millions of social media users in the United States. Director of National Intelligence, then Republican Senator–former Senator–Dan Coats said in July 2018, “The warning lights are blinking red that the digital infrastructure that serves our country is literally under attack.”
  • So I take this issue very seriously and have had for many years, that is, making sure, as the Mueller–Special Counsel Mueller indicated, 12 Russian intelligence officers hacked the DNC, and various information detailing phishing attacks into our state election boards, online personas, and stealing documents. So, when we had a subcommittee hearing and former Bush Homeland Security Director Michael Chertoff testified, I asked him point blank, because there were some of our colleagues who were saying, “you know what? Everybody does election interference.” So I asked him if election interference was something that we did, or should be encouraging? He responded that he agreed:  “Interfering with infrastructure or elections is completely off limits and unacceptable.”
  • That is why I believe that we should be working aggressively internationally to sanction anybody that interferes in our elections. So I hope today that we will get a report from the witnesses on exactly what they have been doing to clamp down on election interference. I hope that they will tell us what kind of hate speech and misinformation that they have taken off the books. It is no secret that there are various state actors who are doing all they can to take a whack at democracy, to try to say that our way of government, that our way of life, that our way of freedom of speech and information, is somehow not as good as we have made it, being the beacon of democracy around the globe.
  • I am not going to let or tolerate people to continue to whack at our election process, our vote by mail system, or the ability of tech platforms, security companies, our law enforcement entities, and the collective community to speak against misinformation and hate speech. We have to show that the United States of America stands behind our principles and that our principles do also transfer to the responsibility of communication online. As my colleagues will note, we’ve all been through this in the past. That is why you, Mr. Chairman, and I, and Senators Rosen and Thune, sponsored the Hack Act that is to help increase the security and cyber security of our nation and create a workforce that can fight against that. That is why I joined with Van Hollen and Rubio on the Deter Act, especially in establishing sanctions against Russian election interference, and to continue to make sure that we build the infrastructure of tomorrow.
  • So, I know that some people think that these issues are out of sight and out of mind. I guarantee you, they’re not. There are actors who have been at this for a long time. They wanted to destabilize Eastern Europe, and we became the second act when they tried to destabilize our democracy here by sowing disinformation. I want to show them that we in the United States do have fair elections. We do have a fair process. We are going to be that beacon of democracy.
  • So, I hope that as we talk about 230 today and we hear from the witnesses on the progress that they have made in making sure that disinformation is not allowed online, that we will also consider ways to help build and strengthen that. That is to say, as some of those who are testifying today, what can we do on transparency, on reporting, on analysis, and yes, I think you’re going to hear a lot about algorithms today, and the kinds of oversight that we all want to make sure that we can continue to have the diversity of voices in the United States of America, both online and offline.
  • I do want to say though, Mr. Chairman, I am concerned about the vertical nature of news and information. Today I expect to ask the witnesses about the fact that I believe they create a choke point for local news. The local news media have lost 70% of their revenue over the last decade, and we have lost thousands, thousands of journalistic jobs that are important. It was even amazing to me that the sequence of events yesterday had me being interviewed by someone at a newspaper who was funded by a joint group of the Knight Foundation, and probably Facebook funds, to interview me about the fact that the news media and broadcast has fallen on such a decline because of loss of revenue as they’ve made the transition to the digital age.
  • Somehow, somehow, we have to come together to show that the diversity of voices that local news represent need to be dealt with fairly when it comes to the advertising market. And that too much control in the advertising market puts a foot on their ability to continue to move forward and grow in the digital age. Just as other forms of media have made the transition, and yes still making the transition, we want to have a very healthy and dynamic news media across the United States of America. So, I plan to ask the witnesses today about that.
  • I wish we had time to go into depth on privacy and privacy issues but Mr. Chairman, you know, and so does Senator Thune and other colleagues of the Committee on my side, how important it is that we protect American consumers on privacy issues. That we’re not done with this work, that there is much to do to bring consensus in the United States on this important issue. And I hope that as we do have time or in the follow up to these questions, that we can ask the witnesses about that today.
  • But make no mistake, gentlemen, thank you for joining us, but this is probably one of many, many, many conversations that we will have about all of these issues. But again, let’s harness the information age, as you are doing, but let’s also make sure that consumers are fairly treated and that we are making it work for all of us to guarantee our privacy, our diversity of voices, and upholding our democratic principles and the fact that we, the United States of America, stand for freedom of information and freedom of the press.

Twitter CEO Jack Dorsey’s written testimony seeks to distinguish his platform’s good practices (e.g. transparency and no kowtowing to the political powers that be) from Facebook’s bad practices. Regarding algorithms, the secret sauce of how users see what they see and why some content gets amplified, Dorsey seems to make the case that a platform should make multiple algorithms available to users and let them choose among them. A couple of troubling implications follow from such an approach. First, if a user is seeing content that is objectionable, well, he bears some of the blame because he chose it. Secondly, allowing people to pick their own algorithms seems very similar to a platform assigning different algorithms to people in that the net effect will still be filter bubbles. The difference is that with choice, there will be the illusion of control. Finally, on privacy, Dorsey sidesteps the issue of whether people should be allowed to stop platforms from collecting personal data by pledging his loyalty to giving people choice and control over its collection, use, and distribution.

In terms of Section 230, here are Dorsey’s thoughts:

  • As you consider next steps, we urge your thoughtfulness and restraint when it comes to broad regulatory solutions to address content moderation issues. We must optimize for new startups and independent developers. In some circumstances, sweeping regulations can further entrench companies that have large market shares and can easily afford to scale up additional resources to comply. We are sensitive to these types of competition concerns because Twitter does not have the same breadth of interwoven products or market size as compared to our industry peers. We want to ensure that new and small companies, like we were in 2006, can still succeed today. Doing so ensures a level playing field that increases the probability of competing ideas to help solve problems going forward. We must not entrench the largest companies further.
  • I believe the best way to address our mutually-held concerns is to require the publication of moderation processes and practices, a straightforward process to appeal decisions, and best efforts around algorithmic choice. These are achievable in short order. We also encourage Congress to enact a robust​ federal privacy framework that protects consumers while fostering competition and innovation.

Facebook CEO Mark Zuckerberg framed Section 230 as allowing free speech to thrive online because, without its liability shield, platforms would simply refuse to host any material that could result in a lawsuit. He also praised the provisions that allow for content moderation, such as “basic moderation” for “removing hate speech and harassment that impacts the safety and security of their communities.” Zuckerberg avoids moderation of political content where the leaders of nations post material that is patently untrue or inflammatory. He then claimed Facebook supports giving people a voice, but this is contrary to media accounts of the company doing the bidding of authoritarian regimes by taking down posts and shutting down the accounts of dissidents and critics. Moreover, Zuckerberg argued that Section 230’s liability shield permits the company to police and remove material that creates risk through “harm by trying to organize violence, undermine elections, or otherwise hurt people.” Some have argued the opposite is the case, and that if Facebook faced regulatory or legal jeopardy for hosting such material or not taking it down in a timely fashion, it would act much more quickly and expend more resources to do so.

Zuckerberg then detailed his company’s efforts to ensure the social media giant is providing Americans with accurate information about voting, much of which would please Democrats and displease Republicans, the latter of which have inveighed against the appending of fact checking to assertions made by Trump and others about the election.

Zuckerberg also pushed back on Cantwell’s assertions regarding the effect his platform and Google have had on journalism. He claimed Facebook is another venue by which media outlets can make money and touted the Facebook Journalism Project, in which Facebook has “invested more than $425 million in this effort, including developing news products;  providing grants, training, and tools for journalists; and working with publishers and educators to increase media literacy.”

As for Zuckerberg’s position on Section 230 legislation, he argued:

  • However, the debate about Section 230 shows that people of all political persuasions are unhappy with the status quo. People want to know that companies are taking responsibility for combatting harmful content—especially illegal activity—on their platforms. They want to know that when platforms remove content, they are doing so fairly and transparently. And they want to make sure that platforms are held accountable.
  • Section 230 made it possible for every major internet service to be built and ensured important values like free expression and openness were part of how platforms operate. Changing it is a significant decision. However, I believe Congress should update the law to make sure it’s working as intended. We support the ideas around transparency and industry collaboration that are being discussed in some of the current bipartisan proposals, and I look forward to a meaningful dialogue about how we might update the law to deal with the problems we face today.
  • At Facebook, we don’t think tech companies should be making so many decisions about these important issues alone. I believe we need a more active role for governments and regulators, which is why in March last year I called for regulation on harmful content, privacy, elections, and data portability. We stand ready to work with Congress on what regulation could look like in these areas. By updating the rules for the internet, we can preserve what’s best about it—the freedom for people to express themselves and for entrepreneurs to build new things—while also protecting society from broader harms. I would encourage this Committee and other stakeholders to make sure that any changes do not have unintended consequences that stifle expression or impede innovation.

Alphabet CEO Sundar Pichai framed Google’s many products as bringing the world information for free. He voiced support for amorphous privacy legislation and highlighted Google’s $1 billion commitment to supporting some journalism outlets. He asserted Google, YouTube, and related properties exercise their content moderation without political bias. Pichai offered these sentiments on Section 230:

As you think about how to shape policy in this important area, I would urge the Committee to be very thoughtful about any changes to Section 230 and to be very aware of the consequences those changes might have on businesses and consumers. At the end of the day, we all share the same goal: free access to information for everyone and responsible protections for people and their data. We support legal frameworks that achieve these goals…

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Setting The Plate For Section 230 Hearing

The top Republican and Democrat on the Senate Commerce Committee seek to frame the 28 October hearing on Section 230 in the light they favor.

Before the Senate Commerce, Science, and Transportation Committee held its hearing today on 47 U.S.C. 230 (Section 230), both Chair Roger Wicker (R-MS) and Ranking Member Maria Cantwell (D-WA) sought to provide their slant on the proceedings. Wicker continued with the Republican narrative by suggesting social media platforms may be cooperating with the Biden Campaign, and Cantwell released a report on how these platforms have adversely affected local journalism to the detriment of American democracy.

Wicker sent letters to Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey that run along roughly the same lines as Senator Josh Hawley’s (R-MO) letter to the Federal Election Commission (FEC) claiming that the two platforms’ restriction on spreading the dubious New York Post story on Hunter Biden was an in-kind campaign contribution.

Wicker wrote to Zuckerberg and Dorsey:

In the interest of fully disclosing any interactions with the candidates and their campaigns, I request that you provide the Committee with specific information regarding whether and how [Facebook/Twitter have] provided access to any data, analytics, or other information to either major political party, candidate, or affiliates thereof. This includes information related to advertising, post or page performance, engagement, or other data that might shape or influence decision-making by the candidate or campaign. In addition, please indicate whether this information is provided equitably to all candidates, and how decisions are made regarding what information is provided and to whom.

Clearly Wicker is after any indication that the Biden Campaign has received undue or extra help or information the Trump Campaign has not. Facebook has taken millions of dollars in advertising from the two campaigns and from other parties. Twitter stopped accepting political advertising in late 2019. Consequently, it is likely there will be mountains of material to provide the committee. This inquiry may have been made in the interest of ensuring a fairly contested election. Or perhaps Wicker and his staff have some inside information about the two platforms’ relations with the Biden Campaign. Perhaps the letter is meant as a fishing expedition in the hope that any such evidence will turn up.

Nonetheless, these letters may have the prophylactic effect of chilling any actions Facebook and Twitter might take in the last week of the election, lest they be hauled before Congress again to answer for their moderation and takedown decisions regarding political material and misinformation. If it turns out the Trump Campaign has gotten advantageous treatment, it is hard to see how Wicker and other Republicans would weave the fact of greater assistance to President Donald Trump into their perpetual campaign of decrying alleged but never proven anti-conservative bias.

But, as mentioned before, Wicker could attempt to portray any assistance provided to the Biden Campaign as an in-kind contribution, as Hawley did after sharing of the dubious New York Post article was limited on the platforms, even though federal campaign finance laws and regulations clearly exempt the media.

Hawley claimed in a letter to the FEC that Twitter and Facebook gave the Biden Campaign an in-kind contribution by blocking the article in violation of federal campaign finance law. Hawley, however, was careful to couch his claims in language suggesting that Twitter and Facebook’s actions (which he terms suppression) were in-kind contributions instead of outright asserting they are.

While Hawley quite accurately quotes the law on what constitutes a contribution (a “contribution” includes “anything of value . . . for the purpose of influencing any election for Federal office”), he is apparently unaware of the regulations promulgated by the FEC to explicate gaps and unaddressed issues in the statute. FEC regulations shed further light on the issue at hand. Notably, in 11 CFR 100.71, the FEC’s regulations provide extensive exceptions to what is a contribution and provide “[t]he term contribution does not include payments, services or other things of value described in this subpart.” One such exception is found in 11 CFR 100.73, “News story, commentary, or editorial by the media,” which makes clear:

Any cost incurred in covering or carrying a news story, commentary, or editorial by any broadcasting station (including a cable television operator, programmer or producer), Web site, newspaper, magazine, or other periodical publication, including any Internet or electronic publication, is not a contribution unless the facility is owned or controlled by any political party, political committee, or candidate, in which case the costs for a news story…

One of the essential elements for such an action to be a contribution is control or ownership. I am fairly certain the Biden Campaign neither owns nor controls Twitter or Facebook. For if it does, it has been colossally inept in allowing President Donald Trump and his partisans to spread misinformation and lies about mail-in voting far and wide, to name one such subject.

Moreover, the FEC and federal courts have long recognized the “press exemption” to what might otherwise be considered in-kind contributions or expenditures in violation of the law. This exemption includes websites and the internet. Facebook and Twitter were acting in ways much more similar to how the traditional print media has acted. It is telling that Hawley and others have not pilloried the so-called liberal media for looking askance at the New York Post’s story and not taking it at face value, to the extent they have covered it at all. Therefore, it seems any value the Biden Campaign may have derived from social media platforms using 47 USC 230 in moderating content on their platforms is not an in-kind contribution.

Cantwell released a report she had mentioned during her opening statement at the 23 September hearing aimed at trying to revive data privacy legislation. She and her staff investigated the decline and financial troubles of local media outlets, which are facing a cumulative loss in advertising revenue of up to 70% since 2000. And since advertising revenue has long been the lifeblood of print journalism, this has devastated local media, with many outlets shutting their doors or radically cutting their staff. This trend has been exacerbated by consolidation in the industry, often in concert with private equity or hedge funds looking to wring the last dollars of value from bargain-basement-priced newspapers. Cantwell also claimed that the overwhelming online advertising dominance of Google and Facebook has further diminished advertising revenue and other possible sources of funding through a variety of means. She intimated that much of this conduct may be illegal under U.S. law and that the Federal Trade Commission (FTC) may well be able to use its Section 5 powers against unfair and deceptive acts and its antitrust authority to take action.

Cantwell detailed “Current and Suggested Congressional Considerations to Save Local News:”

  • Providing COVID-19 Emergency Financial Relief
    • As discussed in this report, the COVID-19 pandemic has had a devastating impact on local media outlets around the country. Congress should provide immediate support to stabilize these critical community institutions because it is very difficult to recreate a functioning local newsroom once its unique blend of knowledgeable local reporters, editorial controls, and regional subscribers is lost.
    • Congress should renew the Paycheck Protection Program (PPP), created by the Coronavirus Aid, Relief, and Economic Security (CARES) Act, to continue to support jobs at local news outlets. It should also expand the PPP to make thousands more local newspapers, radio, and television broadcasters eligible for emergency federal support.
    • Congress should also consider targeted tax incentives and grants as at least a short-term bridge to enable local news entities to survive the current economic turmoil.
  • Ensure Fair Return for Local News Content
    • Local news outlets create unmatched trusted content for local communities but, as discussed in this report, they are not being fairly compensated for their intellectual property by news aggregators, who are abusing their dominant positions in the marketplace.
    • Congress should consider requiring that news aggregation platforms enter into good faith negotiations with local news organizations and pay them fair market value for their content. Congress should also consider allowing local news organizations for a limited duration to collectively bargain for reuse of their content, provided there are strong controls in place to ensure that smaller publishers are not left behind.
  • Level the Playing Field for Local News
    • As detailed in this report, news aggregation platforms are using their market power and data aggregation practices to disadvantage local news.
    • Congress has a long history of addressing market abuses that stifle innovation and harm consumers. Rules preventing unfair, deceptive, and abusive practices can stop platforms from taking local news content without financial payment and retaliating against local news by hiding or removing their content from search engines or social media feeds. Similarly, statutes that prohibit market manipulation in other industries can serve as models to ensure online advertising markets are transparent and not contrived to benefit a dominant firm. Federal privacy protections can also serve to empower consumers to provide more support to local news organizations that provide them with more trusted and relevant information. Each of these changes should be crafted in a way to promote competition and consumer welfare and spur growth and innovation in the digital economy.

Cantwell’s report follows the House Judiciary Committee’s Antitrust, Commercial and Administrative Law Subcommittee’s “Investigation into Competition in Online Markets,” which also examined, in part, the effect of the digital dominance of Facebook and Google on the U.S. journalism industry. The Subcommittee asserted:

[The Subcommittee] received testimony and submissions showing that the dominance of some online platforms has contributed to the decline of trustworthy sources of news, which is essential to our democracy. In several submissions, news publishers raised concerns about the “significant and growing asymmetry of power” between dominant platforms and news organizations, as well as the effect of this dominance on the production and availability of trustworthy sources of news. Other publishers said that they are “increasingly beholden” to these firms, and in particular, to Google and Facebook. Google and Facebook have an outsized influence over the distribution and monetization of trustworthy sources of news online, undermining the quality and availability of high-quality sources of journalism. This concern is underscored by the COVID-19 pandemic, which has laid bare the importance of preserving a vibrant free press in both local and national markets.

The Subcommittee recommended:

To address this imbalance of bargaining power, we recommend that the Subcommittee consider legislation to provide news publishers and broadcasters with a narrowly tailored and temporary safe harbor to collectively negotiate with dominant online platforms.

The Subcommittee noted:

In April 2019, Subcommittee Chairman [David] Cicilline (D-RI) and Doug Collins (R-GA), the former Ranking Member of the Committee on the Judiciary, introduced H.R. 2054, the “Journalism Competition and Preservation Act of 2019.” H.R. 2054 would allow coordination by news publishers under the antitrust laws if it (1) directly relates to the quality, accuracy, attribution or branding, or interoperability of news; (2) benefits the entire industry, rather than just a few publishers, and is non-discriminatory to other news publishers; and (3) directly relates to and is reasonably necessary for these negotiations, instead of being used for other purposes.

Cantwell noted in her report that “regulators across Europe and in Australia are taking steps to ensure that local publishers can continue to monetize their content and reach consumers.” She claimed “[p]artly in response to these regulatory actions, Google and Facebook have announced plans to provide limited compensation to a small slice of the news sector…” and “[w]hether this compensation will be sufficient, or negotiated on fair terms, remains to be seen.”

In late July, the Australian Competition and Consumer Commission (ACCC) issued for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” The government in Canberra had asked the ACCC to draft this code earlier this year after talks broke down between the Australian Treasury and the companies. The ACCC explained:

The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.

This is not the ACCC’s first interaction with the companies. Late last year, the ACCC announced a legal action against Google, “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses,” according to the agency’s press release. In its initial filing, the ACCC claimed that Google misled and deceived the public in contravention of the Australian Consumer Law and that Android users were harmed, because those who switched off Location Services were unaware that their location information was still being collected and used by Google, as it was not readily apparent that Web & App Activity also needed to be switched off. A year ago, the ACCC released the final report in its “Digital Platforms Inquiry,” which “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”

In mid-August, Google and the ACCC exchanged public letters, fighting over the latter’s proposal to ensure that media companies are compensated for articles and content the former uses.

  • In an Open Letter to Australians, Google claimed:
    • A proposed law, the News Media Bargaining Code, would force us to provide you with a dramatically worse Google Search and YouTube, could lead to your data being handed over to big news businesses, and would put the free services you use at risk in Australia.
    • You’ve always relied on Google Search and YouTube to show you what’s most relevant and helpful to you. We could no longer guarantee that under this law. The law would force us to give an unfair advantage to one group of businesses – news media businesses – over everyone else who has a website, YouTube channel or small business. News media businesses alone would be given information that would help them artificially inflate their ranking over everyone else, even when someone else provides a better result. We’ve always treated all website owners fairly when it comes to information we share about ranking. The proposed changes are not fair and they mean that Google Search results and YouTube will be worse for you.
    • You trust us with your data and our job is to keep it safe. Under this law, Google has to tell news media businesses “how they can gain access” to data about your use of our products. There’s no way of knowing if any data handed over would be protected, or how it might be used by news media businesses.
    • We deeply believe in the importance of news to society. We partner closely with Australian news media businesses — we already pay them millions of dollars and send them billions of free clicks every year. We’ve offered to pay more to license content. But rather than encouraging these types of partnerships, the law is set up to give big media companies special treatment and to encourage them to make enormous and unreasonable demands that would put our free services at risk.

In its response, the ACCC asserted:

  • The open letter published by Google today contains misinformation about the draft news media bargaining code which the ACCC would like to address. 
  • Google will not be required to charge Australians for the use of its free services such as Google Search and YouTube, unless it chooses to do so.
  • Google will not be required to share any additional user data with Australian news businesses unless it chooses to do so.
  • The draft code will allow Australian news businesses to negotiate for fair payment for their journalists’ work that is included on Google services.
  • This will address a significant bargaining power imbalance between Australian news media businesses and Google and Facebook.

Google has since published a follow-up letter, claiming it does not oppose the draft code but rather wants a few changes. Google also dodged blame for the decline of media revenue, asserting “the fall in newspaper revenue over recent years was mainly the result of the loss of classified ads to online classifieds businesses.” Google trumpeted its 1 October decision “to pay a number of publishers to license their content for a new product, including some in Australia, as well as helping train thousands of Australian journalists.” As announced by Google and Alphabet CEO Sundar Pichai, Google will pay some media outlets up to $1 billion over the next three years “to create and curate high-quality content for a different kind of online news experience” for its new product, Google News Showcase. Pichai claimed:

This approach is distinct from our other news products because it leans on the editorial choices individual publishers make about which stories to show readers and how to present them. It will start rolling out today to readers in Brazil and Germany, and will expand to other countries in the coming months where local frameworks support these partnerships.

This decision was not well-received everywhere, especially in the European Union (EU), which is in the process of implementing an EU measure requiring Google and Facebook to pay the media for content. The European Publishers Council (EPC) noted:

The French Competition Authority decision from April considered that Google’s practices were likely to constitute an abuse of a dominant position and brought serious and immediate damage to the press sector. It calls on Google, within three months, to conduct negotiations in good faith with publishers and press agencies on the remuneration for their protected content. Google’s appeal in July seeks to get some legal clarity on parts of the decision.

Moreover, the EU’s Directive on Copyright in the Digital Single Market is being implemented in EU member states and would allow them to require compensation from platforms like Facebook and Google. The EPC claimed:

Many are quite cynical about Google’s perceived strategy. By launching their own product, they can dictate terms and conditions, undermine legislation designed to create conditions for a fair negotiation, while claiming they are helping to fund news production.

Incidentally, earlier this month, a French appeals court ruled against Google in its fight to stop France’s competition authority from requiring it to negotiate licensing fees for the use of French media content. And, earlier today, Italy’s competition authority announced an investigation “against Google for an alleged abuse of dominant position in the Italian market for display advertising.” The agency asserted:

  • In the key market for online advertising, which Google controls also thanks to its dominant position on a large part of the digital value chain, the Authority questions the undertaking’s discriminatory use of the huge amount of data collected through its various applications, preventing rivals in the online advertising markets from competing effectively. More specifically, Google appears to have engaged in an internal/external discriminatory conduct, refusing to provide its competitors with Google ID decryption keys and excluding third-party tracking pixels. At the same time, Google has allegedly used tracking elements enabling its advertising intermediation services to achieve a targeting capability that some equally efficient competitors are unable to replicate.
  • The conducts investigated by the Authority may have a significant impact on competition in the various markets of the digital advertising value chain, with wide repercussions on competitors and consumers. The absence of competition in the intermediation of digital advertising, in fact, might reduce the resources allocated to website producers and publishers, thus impoverishing the quality of content directed to end customers. Moreover, the absence of effective competition based on merits could discourage technological innovation for the development of advertising technologies and techniques less intrusive for consumers.



Further Reading, Other Developments, and Coming Events (26 October)

Further Reading

  • “Google is giving data to police based on search keywords, court docs show” By Alfred Ng — c|net. Google is responding to keyword warrants, in which prosecutors ask the company to provide IP addresses for all people who made a certain search within a geographical area during a certain time. In the case discussed in the piece (bizarrely, witness intimidation of someone testifying against R. Kelly), a keyword warrant allowed investigators to locate a person who may have burned down someone’s house. It is likely this warrant will be challenged on Fourth Amendment grounds.
  • “Google AI Tech Will Be Used for Virtual Border Wall, CBP Contract Shows” By Lee Fang and Sam Biddle — The Intercept. Google may again be wading into territory its labor force may find objectionable. The United States (U.S.) Customs and Border Protection (CBP) will use Google Cloud in its artificial intelligence-driven virtual fence on the U.S.-Mexico border. This may result in employee pushback as it did in 2018, when this sort of internal pressure caused Google to walk away from a Department of Defense program, Project Maven. A whistleblower group ferreted out the fact that Google is contracting with CBP, which took some effort considering Google appears to be a subcontractor to a prime contractor.
  • Facebook Manipulated the News You See to Appease Republicans, Insiders Say” By Monika Bauerlein and Clara Jeffery — Mother Jones. In January 2018 Facebook changed its algorithm to try to address the growing toxicity during and after the 2016 election. The supposed solution was to remove untrustworthy information. However, the original test of this new algorithm led to deprioritizing many conservative sources that traffic in misinformation and slanted stories. This was deemed unacceptable from a political point of view, and the opposite was done. A number of liberal media organizations saw their traffic drop off a cliff.
  • “Why A Gamer Started A Web Of Disinformation Sites Aimed At Latino Americans” By Kaleigh Rogers and Jaime Longoria — FiveThirtyEight. The reason why a gamer and YouTuber started fake sites aimed at Latinos was profit, nothing else.
  • “Twitter and White House deny claims that researcher hacked Trump’s account” By Adi Robertson — The Verge. A Dutch researcher claims the password maga2020 got him into President Donald Trump’s Twitter account even though the White House and Twitter both deny the claim. There is a bizarre tweet Trump sent earlier this month that may, in fact, be the work of this researcher. In any event, he is being coy about whether he sent it or not.

Other Developments

  • The United Kingdom’s Information Commissioner’s Office (ICO) reduced its fine on British Airways (BA) to about a ninth of the preliminary total for violations of the General Data Protection Regulation (GDPR). The ICO has levied a £20 million fine on the airline “for failing to protect the personal and financial details of more than 400,000 of its customers.” In July 2019, the ICO issued a notice of its intention to fine British Airways £183.39 million because the “[p]ersonal data of approximately 500,000 customers were compromised.” After BA made its case, the ICO reduced the fine to £30 million before knocking off another £10 million because of mitigating factors and a British government policy to ease up on businesses during the pandemic. Conceivably, the fine could have been much higher, for the GDPR allows fines of up to 4% of worldwide annual revenue, and for the period in question BA had £12.26 billion in revenue, implying a theoretical maximum of roughly £490 million. The ICO explained:
    • The attacker is believed to have potentially accessed the personal data of approximately 429,612 customers and staff. This included names, addresses, payment card numbers and CVV numbers of 244,000 BA customers.
    • Other details thought to have been accessed include the combined card and CVV numbers of 77,000 customers and card numbers only for 108,000 customers.
    • Usernames and passwords of BA employee and administrator accounts as well as usernames and PINs of up to 612 BA Executive Club accounts were also potentially accessed.
    • The ICO found:
      • There were numerous measures BA could have used to mitigate or prevent the risk of an attacker being able to access the BA network. These include:
        • limiting access to applications, data, and tools to only those required to fulfil a user’s role;
        • undertaking rigorous testing, in the form of simulating a cyber-attack, on the business’ systems;
        • protecting employee and third party accounts with multi-factor authentication.
      • Additional mitigating measures BA could have used are listed in the penalty notice.
      • None of these measures would have entailed excessive cost or technical barriers, with some available through the Microsoft Operating System used by BA.
      • Since the attack, BA has made considerable improvements to its IT security.
      • ICO investigators found that BA did not itself detect the attack of 22 June 2018 but was alerted by a third party more than two months later, on 5 September. Once it became aware, BA acted promptly and notified the ICO.
      • It is not clear whether or when BA would have identified the attack itself. This was considered a severe failing because of the number of people affected and because any potential financial harm could have been more significant.
  • The Congressionally created Cyberspace Solarium Commission (CSC) issued a white paper, “Building a Trusted ICT Supply Chain,” with its assessment of why the United States (U.S.) no longer has a thriving technological industrial base and how it might build one again, a matter of signal importance considering the growing dominance of the People’s Republic of China (PRC) in those fields. With this white paper, the CSC has become another player in U.S. government policy circles proposing how the U.S. may protect its information and communications technology (ICT) supply chain against sabotage, malice, or control by an adversarial power.
    • The CSC claimed:
      • United States lacks key industrial capacities crucial to the production of essential technologies, including fifth-generation (5G) telecommunications equipment. Among other factors, the willingness of countries such as China to subsidize and support their domestic industries has created the uneven playing field that hinders the competitiveness and, ultimately, the viability of U.S. companies in global markets. The resulting lack of industrial capacity has forced critical dependencies on companies that manufacture in adversary countries, such as China, where companies are beholden to Chinese national intelligence, national cybersecurity, and national security laws. While dependency on foreign production and foreign goods is not inherently bad—indeed, the United States relies on manufacturing and companies headquartered in partner countries such as Finland, Sweden, South Korea, and Taiwan—the U.S. government must emphasize the importance of trusted suppliers, and these dependencies pose three concrete risks to the security of the United States.
    • The CSC explained why fostering a supply chain for ICT in the U.S. will not be easy:
      • Three main challenges confront attempts to rebuild U.S. high-tech manufacturing capacity: (1) lack of patient funding capital, (2) high investment barriers to entry, and (3) standards and intellectual property barriers to entry. These challenges arise from the simple fact that the economics of the hardware industry are not as attractive as those of many other technology sectors. One of the major shortcomings of U.S. efforts to date to secure ICT supply chains is their failure to address how the United States got to this point, where ICT equipment manufacturing and production is a critical economic weakness. In order to craft an effective strategy to rebuild high-tech manufacturing and gain greater industrial independence, policymakers must first understand the challenges to reinvigorating the United States’ high-tech manufacturing industry. Only then can they comprehend why market forces have pushed U.S. high-tech industrial capacity to atrophy over the past two decades and recognize the issues that they must tackle in developing an industrial base strategy.
      • None of these barriers are insurmountable, but the reality is that the United States has lost much of its market share for the manufacture of electronics components and nearly all of its market share for the manufacture and assembly of finished electronics products. Nonetheless, a U.S. strategy to secure its ICT supply chain from all threats must include a plan to identify the key technologies and materials, and then attract more patient investment in hardware manufacturing, devise a method to retrain the atrophied muscles of production, and set the conditions to overcome barriers to entry posed by the constraints of standards and intellectual property.
    • The CSC “specifies a strategy to build trusted supply chains for critical ICT by:
      • Identifying key technologies and equipment through government reviews and public-private partnerships to identify risk.
      • Ensuring minimum viable manufacturing capacity through both strategic investment and the creation of economic clusters.
      • Protecting supply chains from compromise through better intelligence, information sharing, and product testing.
      • Stimulating a domestic market through targeted infrastructure investment and ensuring the ability of firms to offer products in the United States similar to those offered in foreign markets.
      • Ensuring global competitiveness of trusted supply chains, including American and partner companies, in the face of Chinese anti-competitive behavior in global markets.”
    • The CSC also highlighted “five key and eight supporting recommendations to build trusted supply chains for critical ICT technologies:
      • Supply Chain 1: Congress should direct the executive branch to develop and implement an information and communication technologies industrial base strategy.
      • Supply Chain 2: Congress should direct the Department of Homeland Security, in coordination with the Department of Commerce, Department of Defense, Department of State, and other departments and agencies, to identify key information and communication technologies and materials through industry consultation and government review.
      • Supply Chain 3: Congress should direct the Department of Commerce, in consultation with the Department of Homeland Security, the Department of State, and the Department of Defense, to conduct a viability study of localities fit for economic clustering. It should fund the Department of Commerce, in consultation with the Department of Homeland Security, Department of State, and Department of Defense, to solicit competitive bids and applications from candidate states, municipalities, and localities for the designation of no fewer than three and no more than five critical technology manufacturing clusters.
        • Supply Chain 3.1: The federal government should commit significant and consistent funding toward research and development in emerging technologies.
        • Supply Chain 3.2: The federal government should, in partnership with partner and ally governments, develop programs to incentivize the movement of critical chip and technology manufacturing out of China.
        • Supply Chain 3.3: Congress should direct the President to conduct a study on the viability of a public-private national security investment corporation to attract private capital for investment in strategically important areas.
      • Supply Chain 4: The President should designate a lead agency to integrate and coordinate government ICT supply chain risk management efforts into an ongoing national strategy and to serve as the nexus for public-private partnerships on supply chain risk management.
        • Supply Chain 4.1: Congress should direct the President to construct or designate a National Supply Chain Intelligence Center.
        • Supply Chain 4.2: Congress should fund three Critical Technology Security Centers, selected and designated by DHS, in collaboration with the Department of Commerce, Department of Energy, Office of the Director of National Intelligence (ODNI), and Department of Defense.
      • Supply Chain 5: The Federal Communications Commission (FCC) should tie 5G infrastructure investment to open and interoperable standards and work with the Department of Defense and the National Telecommunications and Information Administration to facilitate the release of more mid-band spectrum in order to ensure a strong domestic market for telecommunications equipment.
        • Supply Chain 5.1: The U.S. Agency for International Development (USAID) should work with international partners to develop a digital risk impact assessment that highlights the risks associated with the use of untrusted technologies in implementing digitization and telecommunications infrastructure projects.
        • Supply Chain 5.2: Congress should ensure that the Export-Import Bank (EXIM), U.S. International Development Finance Corporation (DFC), and United States Trade Development Agency (USTDA) all operate in legal, regulatory, and funding environments conducive to successfully competing with Chinese state-owned and state-backed enterprises, including their ability to support investments from companies headquartered in partner and ally countries.
        • Supply Chain 5.3: USAID, DFC, and USTDA should develop and maintain a list of prohibited contractors and clients, including companies subject to the Chinese national security and national intelligence laws, that may not be used to implement USAID-, DFC-, and USTDA-funded projects.”
  • The Federal Trade Commission (FTC) has reportedly met to review its anti-trust case against Facebook, which could be filed as soon as next month. The FTC started looking into Facebook’s dominance in the social messaging market at about the same time it handed down a $5 billion fine for the tech giant’s involvement with Cambridge Analytica, conduct that violated the 2012 consent decree. The anti-trust investigation is reportedly focused on the effects of Facebook’s acquisitions of two potential competitors, WhatsApp and Instagram, two of the world’s largest messaging platforms, and if the FTC succeeds in a suit against Facebook, the company may be forced to spin off those two entities. Moreover, New York Attorney General Tish James is leading a state investigation of Facebook that “focuses on Facebook’s dominance in the industry and the potential anticompetitive conduct stemming from that dominance.” This inquiry started over a year ago, and the timing of any possible action is not clear. The European Commission is also reportedly looking at Facebook for anti-trust violations, as media accounts indicated in late 2019.
    • The House Judiciary Committee argued in its recent report on competition in digital markets that “the strong network effects associated with Facebook has tipped the market toward monopoly such that Facebook competes more vigorously among its own products—Facebook, Instagram, WhatsApp, and Messenger—than with actual competitors.” In response to the House Judiciary Committee’s view on these deals, a Facebook spokesperson claimed “[a] strongly competitive landscape existed at the time of both acquisitions and exists today…[and] [r]egulators thoroughly reviewed each deal and rightly did not see any reason to stop them at the time.”
    • In February 2019, the German agency with jurisdiction over competition issued a decision that potentially could block Facebook from combining the personal data of Germans from other Facebook-owned entities such as Instagram and WhatsApp or from unrelated third-party sources. According to the Bundeskartellamt’s press release, the agency “has imposed on Facebook far-reaching restrictions in the processing of user data.”
  • A group of nations is proposing a third way to bridge the dual efforts of two United Nations (U.N.) groups to develop cyber norms. In “The future of discussions on ICTs and cyberspace at the UN,” this group proposes to “explore establishment of a Programme of Action for advancing responsible State behaviour in cyberspace with a view to ending the dual track discussions (GGE/OEWG) and establishing a permanent UN forum to consider the use of ICTs by States in the context of international security.” They stressed “the urgent need for the international community to address the use of ICTs in the context of international peace and security.” France, Egypt, Argentina, Colombia, Ecuador, Gabon, Georgia, Japan, Morocco, Norway, El Salvador, Singapore, the Republic of Korea, the Republic of Moldova, the Republic of North Macedonia, the United Kingdom, and the EU and its member States submitted the proposal.
    • These nations argued:
      • Since 2018, two working groups and many initiatives have started under the auspices of the UN. We welcome the willingness of the international community to engage, and recognize that each of those initiatives has its own merits and specificities. Yet, they aim at tackling the same issues: advancing norms of responsible behaviour, understanding how international law concretely applies to cyberspace, developing CBMs and fostering capacity building. We consider that this situation, although evidencing the growing commitment of the international community to dedicating time and resources to the matters at hand, creates redundancies and, at times, can be counter-productive. It is therefore a cause for concern.
      • In the fall of 2019, the U.N. Group of Governmental Experts (GGE) and the U.N. Open-ended Working Group (OEWG) began meeting pursuant to U.N. resolutions to continue consultative discussions on an international agreement, or set of agreements, on what constitutes acceptable and unacceptable cyber practices. Previous efforts largely stalled over disagreements between a bloc led by the U.S. and its allies and nations like the People’s Republic of China (PRC), Russia, and others with a different view of acceptable practices. Notably, unlike in 2010, 2013, and 2015, the 2017 U.N. GGE could not reach agreement on additional voluntary, non-binding norms on how nations should operate in cyberspace. The OEWG was advocated for by countries like Russia, the PRC, and others seen as being in opposition to some of the views propagated by the U.S. and its allies, notably on the question of what measures a nation may use inside its borders to limit its citizens’ internet usage.
      • As explained in a 2018 U.N. press release, competing resolutions were offered to create groups “aimed at shaping norm-setting guidelines for States to ensure responsible conduct in cyberspace:”
        • the draft resolution “Developments in the field of information and telecommunications in the context of international security” (document A/C.1/73/L.27.Rev.1), tabled by the Russian Federation.  By the text, the Assembly would decide to convene in 2019 an open-ended working group acting on a consensus basis to further develop the rules, norms and principles of responsible behaviour of States.
        • the draft resolution “Advancing Responsible State Behaviour in Cyberspace in the Context of International Security” (document A/C.1/73/L.37), tabled by the United States…[that] would request the Secretary-General, with the assistance of a group of governmental experts to be established in 2019, to continue to study possible cooperative measures to address existing and potential threats in the sphere of information security, including norms, rules and principles of responsible behaviour of States.
  • The United Kingdom’s Information Commissioner’s Office (ICO) published a compulsory audit of the Department for Education (DfE) and found:
    • The audit found that data protection was not being prioritised and this had severely impacted the DfE’s ability to comply with the UK’s data protection laws. A total of 139 recommendations for improvement were found, with over 60% classified as urgent or high priority.
    • The ICO explained:
      • The Commissioner’s Enforcement team ran a broad range investigation in 2019 following complaints from DefendDigitalMe and Liberty and their concerns around the National Pupil Database (NPD). The ICO met with key senior level data protection professionals at the DfE’s offices in London in November 2019 where the possibilities of a consensual audit were discussed. However, due to the risks associated with the volume and types of personal data processed within the NPD as well as the ages of the data subjects involved, the Commissioner decided, in line with her own Regulatory Action Policy, to undertake a compulsory audit using her powers under section 146 of the DPA18. The Commissioner determined this approach would provide a comprehensive review of DfE data protection practices, governance and other key control measures supporting the NPD and internally held databases, using the framework of scope areas of audit as listed below. This would allow the Commissioner to identify any risk associated with the data processed and implications to the individual rights of over 21 million data subjects.
  • The European Commission (EC) announced it “made commitments offered by [United States firm] Broadcom legally binding under EU antitrust rules.” The EC began looking into the company in mid-2019 for allegedly abusive conduct that was harming competitors and consumers in the European Union’s TV and modem chipset markets.
    • The EC explained:
      • In June 2019, the Commission initiated proceedings into alleged abuse of dominance by Broadcom and at the same time issued a Statement of Objections seeking the imposition of interim measures. In October 2019, the Commission took a decision concluding that interim measures were necessary to prevent serious and irreparable damage to competition from occurring in the worldwide markets for SoCs for (i) TV set-top boxes, (ii) xDSL modems, (iii) fibre modems, as well as (iv) cable modems.
      • The Commission took issue with certain exclusivity or quasi-exclusivity and leveraging arrangements imposed by Broadcom in relation to SoCs for TV set top boxes, xDSL and fibre modems. The decision ordered Broadcom to stop applying these provisions contained in agreements with six of its main customers and ordered the implementation of interim measures applicable for a period of three years.
    • The EC asserted Broadcom has agreed to the following:
      • At European Economic Area (EEA) level, Broadcom will:
        • a) Not require or induce by means of price or non-price advantages an OEM to obtain any minimum percentage of its EEA requirements for SoCs for TV set-top boxes, xDSL modems and fibre modems from Broadcom; and
        • b) Not condition the supply of, or the granting of advantages for, SoCs for TV set-top boxes, xDSL modems and fibre modems on an OEM obtaining from Broadcom another of these products or any other product within the scope of the commitments (i.e. SoCs for cable modems, Front End Chips for set-top boxes and modems and/or Wi-Fi Chips for set-top boxes and modems).
      • At worldwide level (excluding China), Broadcom will:
        • a) Not require or induce an OEM by means of certain types of advantages to obtain more than 50% of its requirements for SoCs for TV set-top boxes, xDSL modems and fibre modems from Broadcom; and
        • b) Not condition the supply of, or the granting of advantages for, SoCs for TV set-top boxes, xDSL modems and fibre modems on an OEM obtaining from Broadcom more than 50% of its requirements for any other of these products, or for other products within the scope of the commitments.
      • The commitments also include specific provisions regarding incentives to bid equipment based on Broadcom products as well as certain additional clauses with regard to service providers in the EEA.

Coming Events

  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released its agenda:
    • Restoring Internet Freedom Order Remand. The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America. The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces. The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications. The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option. The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets. The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements. The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.
  • The Senate Commerce, Science, and Transportation Committee will hold a hearing on 28 October regarding 47 U.S.C. 230 titled “Does Section 230’s Sweeping Immunity Enable Big Tech Bad Behavior?” with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.
  • On 29 October, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Isaac Struna on Unsplash