Preview of Senate Democratic Chairs

It’s not clear who will end up where, but new Senate chairs will change the focus and agenda of committees and of debate over the next two years.

With the victories of Senators-elect Raphael Warnock (D-GA) and Jon Ossoff (D-GA), control of the United States Senate will tip to the Democrats once Vice President-elect Kamala Harris (D) is sworn in and can break 50-50 ties in the chamber in her party’s favor. With the shift in control, new chairs will take over committees key to setting the agenda in the Senate over the next two years. However, given the filibuster, and the fact that Senate Republicans will exert maximum leverage through its continued use, Democrats will be hamstrung and forced to work with Republicans on matters such as federal privacy legislation, artificial intelligence (AI), the Internet of Things (IoT), cybersecurity, data flows, surveillance, and others, just as Republicans have had to work with Democrats over the six years they controlled the chamber. Having said that, Democrats will be in a stronger position than they have been and will have the power to set the agenda in committee hearings, being empowered to call the lion’s share of witnesses and to control the floor agenda. What’s more, Democrats will be poised to confirm President-elect Joe Biden’s nominees at agencies like the Federal Communications Commission (FCC), the Federal Trade Commission (FTC), the Department of Justice (DOJ), and others, giving the Biden Administration a free hand in many areas of technology policy.

All of that being said, this is not meant to be an exhaustive look at all the committees of jurisdiction and possible chairs. Rather, it seeks to survey likely chairs on selected committees and some of their priorities for the next two years. Subcommittee chairs will also be important, but until the cards get shuffled among the chairs, it will not be possible to see where they land at the subcommittee level.

When considering the possible Democratic chairs of committees, one must keep in mind it is often a matter of musical chairs with the most senior members getting first choice. And so, with Senator Patrick Leahy (D-VT) as the senior-most Democratic Senator, he may well choose to leave the Appropriations Committee and move back to assume the gavel of the Judiciary Committee. Leahy has long been a stakeholder on antitrust, data security, privacy, and surveillance legislation and would be in a position to influence what bills on those and other matters before the Senate look like. If Leahy does not move to the chair on Judiciary, he may still be entitled to chair a subcommittee and exert influence.

If Leahy stays put, then current Senate Minority Whip Dick Durbin (D-IL) would be poised to leapfrog Senator Dianne Feinstein (D-CA) to chair Judiciary after Feinstein was persuaded to step aside on account of her lackluster performance in a number of high-profile hearings in 2020. Durbin has also been active on privacy, data security, and surveillance issues. The Judiciary Committee will be central to a number of technology policies, including Foreign Intelligence Surveillance Act reauthorization, privacy legislation, Section 230 reform, antitrust, and others. On the Republican side of the dais, Senator Lindsey Graham (R-SC) is leaving the top post because of term limit restrictions imposed by Republicans, and Senator Charles Grassley (R-IA) is set to replace him. How this changes the 47 USC 230 (Section 230) debate is not immediately clear. And yet, Grassley and three colleagues recently urged the Trump Administration in a letter to omit language in a trade agreement with the United Kingdom (UK) that mirrors the liability protection of Section 230. Senators Rob Portman (R-OH), Mark R. Warner (D-VA), Richard Blumenthal (D-CT), and Grassley argued to U.S. Trade Representative Ambassador Robert Lighthizer that a “safe harbor” like the one provided to technology companies for hosting or moderating third-party content is outdated, not needed in a free trade agreement, contrary to the will of both Congress and the UK Parliament, and likely to be changed legislatively in the near future. It is likely, however, that Grassley will fall in with other Republicans propagating the narrative that social media is unfairly biased against conservatives, particularly in light of the recent purge of President Donald Trump for his many, repeated violations of policy.

The Senate Judiciary Committee will be central in any policy discussions of antitrust and anticompetition in the technology realm. But it bears note the filibuster (and the very low chances Senate Democrats would “go nuclear” and remove all vestiges of the functional supermajority requirement to pass legislation) will give Republicans leverage to block some of the more ambitious reforms Democrats might like to enact (e.g. the House Judiciary Committee’s October 2020 final report that calls for nothing less than a complete remaking of United States (U.S.) antitrust policy and law; see here for more analysis.)

It seems Senator Sherrod Brown (D-OH) will be the next chair of the Senate Banking, Housing, and Urban Affairs Committee, which has jurisdiction over cybersecurity, data security, privacy, and other issues in the financial services sector, making it a player on any legislation designed to encompass the whole of the United States economy. Having said that, it may again be the case that sponsors of, say, privacy legislation decide to cut the Gordian knot of jurisdictional turf battles by cutting out certain committees. For example, many of the privacy bills had provisions making clear they would deem financial services entities in compliance with the Financial Services Modernization Act of 1999 (P.L. 106-102) (aka Gramm-Leach-Bliley) to be in compliance with the new privacy regime. I suppose these provisions may have been included on the basis of the very high privacy and data security standards Gramm-Leach-Bliley has brought about (e.g. the Experian hack), or sponsors of federal privacy legislation made the strategic calculation to circumvent the Senate Banking Committee as much as they could. Nonetheless, this committee sought to insert itself into the policymaking process on privacy when Brown and outgoing Chair Mike Crapo (R-ID) requested “feedback” in February 2019 “from interested stakeholders on the collection, use and protection of sensitive information by financial regulators and private companies.” Additionally, Brown released what may be the most expansive privacy bill from the perspective of privacy and civil liberties advocates, the “Data Accountability and Transparency Act of 2020,” in June 2020 (see here for my analysis.) Therefore, Brown may continue to push for a role in federal privacy legislation with a gavel in his hands.

In a similar vein, Senator Patty Murray (D-WA) will likely take over the Senate Health, Education, Labor, and Pensions (HELP) Committee, which has jurisdiction over health information privacy and data security through the Health Insurance Portability and Accountability Act of 1996 (HIPAA) and the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH Act). Again, as with the Senate Banking Committee and Gramm-Leach-Bliley, most of the privacy bills exempt HIPAA-compliant entities. And yet, even if her committee is cut out of a direct role in privacy legislation, Murray will still likely exert influence through oversight of, and possible legislation changing, HIPAA regulations and the Department of Health and Human Services’ (HHS) enforcement and rewriting of these standards for most of the healthcare industry. For example, HHS is rushing a rewrite of the HIPAA regulations at the tail end of the Trump Administration, and Murray could be in a position to inform how the Biden Administration and Secretary of Health and Human Services-designate Xavier Becerra handle this rulemaking. Additionally, Murray may push the Office for Civil Rights (OCR), the arm of HHS that writes and enforces these regulations, to prioritize matters differently.

Senator Maria Cantwell (D-WA) appears to be the next chair of the Senate Commerce, Science, and Transportation Committee, which has arguably the largest technology portfolio in the Senate. It is the primary committee of jurisdiction for the FCC, FTC, National Telecommunications and Information Administration (NTIA), the National Institute of Standards and Technology (NIST), and the Department of Commerce. Cantwell may exert influence on who is nominated to head and staff those agencies and others. Her committee is also the primary committee of jurisdiction for domestic and international privacy and data protection matters. And so, federal privacy legislation will likely be drafted by this committee, and legislative changes so the U.S. can enter into a new personal data sharing agreement with the European Union (EU) would also likely involve her and her committee.

Cantwell and likely next Ranking Member Roger Wicker (R-MS) agree on many elements of federal privacy law but were at odds last year on federal preemption and whether people could sue companies for privacy violations. Between them, they circulated three privacy bills. In September 2020, Wicker and three Republican colleagues introduced the “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) (see here for more analysis). Wicker had put out for comment a discussion draft, the “Consumer Data Privacy Act of 2019” (CDPA) (see here for analysis), in November 2019, shortly after Cantwell, then the committee’s Ranking Member, and other Democrats had introduced their privacy bill, the “Consumer Online Privacy Rights Act” (COPRA) (S.2968) (see here for more analysis).

Cantwell could also take a leading role on Section 230, but her focus, of late, seems to be on how technology companies are wreaking havoc on traditional media. She released a report that she mentioned during her opening statement at the 23 September hearing aimed at trying to revive data privacy legislation. She and her staff investigated the decline and financial troubles of local media outlets, which are facing a cumulative loss in advertising revenue of up to 70% since 2000. And since advertising revenue has long been the lifeblood of print journalism, this has devastated local media, with many outlets shutting their doors or radically cutting their staff. This trend has been exacerbated by consolidation in the industry, often in concert with private equity or hedge funds looking to wring the last dollars of value from bargain-basement-priced newspapers. Cantwell also claimed that the overwhelming online advertising dominance of Google and Facebook has further diminished advertising revenue and other possible sources of funding through a variety of means. She intimates that much of this conduct may be illegal under U.S. law, and the FTC may well be able to use its Section 5 powers against unfair and deceptive acts and its antitrust authority to take action (see here for more analysis and context.) In this vein, Cantwell will want her committee to play a role in any antitrust policy changes, likely knowing massive changes in U.S. law are not possible in a split Senate with entrenched party positions and discipline.

Senator Jack Reed (D-RI) will take over the Senate Armed Services Committee and its portfolio over national security technology policy, which includes the cybersecurity, data protection, and supply chains of national security agencies and their contractors, AI, offensive and defensive U.S. cyber operations, and other realms. Many of the changes Reed and his committee will seek to make will come through the annual National Defense Authorization Act (NDAA) (see here and here for the many technology provisions in the FY 2021 NDAA.) Reed may also prod the Department of Defense (DOD) to implement or enforce the Cybersecurity Maturity Model Certification (CMMC) Framework differently than envisioned and designed by the Trump Administration. In December 2020, a new rule took effect designed to drive better cybersecurity among U.S. defense contractors. This rule brings together two different lines of effort to require the Defense Industrial Base (DIB) to employ better cybersecurity given the risks they face by holding and using classified information, Federal Contract Information (FCI), and Controlled Unclassified Information (CUI). The Executive Branch has long wrestled with how best to push contractors to secure their systems, and Congress and the White House have opted for using federal contract requirements in that contractors must certify compliance. However, the most recent initiative, the CMMC Framework, will require contractors to be certified by third-party assessors. And yet, it is not clear the DOD has wrestled with the often misaligned incentives present in third-party certification schemes.

Reed’s committee will undoubtedly delve deep into the recent SolarWinds hack and implement policy changes to avoid a recurrence. Doing so may lead the Senate Armed Services Committee back to reconsidering the Cyberspace Solarium Commission’s (CSC) March 2020 final report and follow-up white papers, especially the views embodied in “Building a Trusted ICT Supply Chain.”

Senator Mark Warner (D-VA) will likely take over the Senate Intelligence Committee. Warner has long been a stakeholder on a number of technology issues and would be able to exert influence on the national security components of such issues. He and his committee will almost certainly play a role in the Congressional oversight of and response to the SolarWinds hack. Likewise, his committee shares jurisdiction over FISA with the Senate Judiciary Committee and over national security technology policy with the Armed Services Committee.

Senator Amy Klobuchar (D-MN) would be the Senate Democratic point person on election security from her perch at the Senate Rules and Administration Committee, which may enable her to push more forcefully for the legislative changes she has long advocated. In May 2019, Klobuchar and other Senate Democrats introduced the “Election Security Act” (S. 1540), the Senate version of a stand-alone House measure taken from the larger package, the “For the People Act” (H.R. 1), which the House passed.

In August 2018, the Senate Rules and Administration Committee indefinitely postponed a markup of a compromise bill to provide states additional assistance in securing elections from interference, the “Secure Elections Act” (S.2593). Reportedly, there was concern among state officials that a provision requiring audits of election results would be in effect an unfunded mandate, even though this provision was softened at the insistence of Senate Republican leadership. However, a Trump White House spokesperson indicated in a statement that the Administration opposed the bill, which may have posed an additional obstacle to Committee action. In any event, even if the Senate had passed its bill, it was unlikely that the Republican-controlled House would have considered companion legislation (H.R. 6663).

Senator Gary Peters (D-MI) may be the next chair of the Senate Homeland Security and Governmental Affairs Committee, and if so, he will continue to face the rock on which many a bark of cybersecurity legislation has been dashed: Senator Ron Johnson (R-WI). So significant has Johnson’s opposition to bipartisan cybersecurity legislation from the House been that some House Republican stakeholders have said as much in media accounts without bothering to hide behind anonymity. And so, whatever Peters’ ambitions may be to shore up the cybersecurity of the federal government (his committee will play a role in investigating and responding to the Russian hack of SolarWinds and many federal agencies), he will be limited by whatever Johnson and other Republicans will allow to move through the committee and through the Senate. Of course, Peters’ purview would include the Department of Homeland Security and the Cybersecurity and Infrastructure Security Agency (CISA) and its remit to police the cybersecurity practices of the federal government. Peters would also have in his portfolio the information technology (IT) practices of the federal government, some $90 billion annually across all agencies.

Finally, whether it is Leahy or Durbin atop the Senate Appropriations Committee, the post allows for immense influence over funding and programmatic changes in all federal programs through the power of the purse Congress holds.

Privacy Shield Hearing

The focus was on how the U.S. and EU can reach agreement on an arrangement that will not be struck down by the EU’s highest court.

Last week, the Senate Commerce, Science, and Transportation Committee held a hearing on the now invalidated European Union (EU)-United States (U.S.) Privacy Shield, a mechanism that allowed companies to transfer the personal data of EU residents to the U.S. The EU’s highest court struck down the adequacy decision that underpinned the system on the basis of U.S. surveillance activities and a lack of redress that violated EU law. This is the second time in a decade the EU’s top court has invalidated a transfer arrangement, the first being the Safe Harbor system. Given the estimated billions, or even trillions, of dollars in value realized from data flows between the EU and U.S., there is keen interest on both sides of the Atlantic in finding a legal path forward. However, absent significant curtailment of U.S. surveillance and/or a significant expansion of the means by which EU nationals could have violations of their rights rectified, it would appear a third agreement may not withstand the inevitable legal challenges. Moreover, there are questions as to the legality of other transfer tools in light of the Court of Justice of the European Union’s (CJEU) decision in the case known as Schrems II, and some Standard Contractual Clauses (SCC) and Binding Corporate Rules (BCR) may soon be found in violation of EU law, too.

Consequently, a legislative fix, or some portion thereof, could be attached to federal privacy legislation. Hence, the striking down of Privacy Shield may provide additional impetus to Congress and the next Administration to reach a deal on privacy. Moreover, the lapsed reauthorization of some Foreign Intelligence Surveillance Act authorities may be another legislative opportunity for the U.S. to craft an approach amenable to the EU in order to either obtain an adequacy decision or a successor agreement to the Privacy Shield.

Chair Roger Wicker (R-MS) approached the issue from the perspective of international trade and the economic benefit accruing to businesses on both sides of the Atlantic. His opening remarks pertained less to the privacy and surveillance aspects of the CJEU’s ruling. Wicker appears to be making the case that the EU seems to misunderstand that redress rights in the U.S. are more than adequate, and the U.S.’ surveillance regime is similar to those of some EU nations. One wonders if the CJEU is inclined to agree with this position. Nonetheless, Wicker expressed hope that the EU and U.S. can reach “a durable and lasting data transfer framework…that provides meaningful data protections to consumers, sustains the free flow of information across the Atlantic, and encourages continued economic and strategic partnership with our European allies – a tall order but an essential order.” He worried about the effect of the CJEU’s ruling on SCCs. Wicker made the case that the EU and U.S. share democratic values and hinted that the ongoing talks in the committee to reach a federal data privacy law might include augmented redress rights that might satisfy the CJEU.

Ranking Member Maria Cantwell (D-WA) spoke very broadly about a range of issues related to data transfers and privacy. She stressed the importance of data flows in the context of larger trade relations. Cantwell also stressed the shared values between the U.S. and the EU and her hope that the two entities work “together on these very important national concerns, trade and technology, so that we can continue to improve economic opportunities and avoid moves towards protectionism.” She also called for federal privacy legislation but hinted that states should still be able to regulate privacy, suggesting her commitment to having a federal law be a floor for state laws. Cantwell also asserted that bulk surveillance, the likes of which the National Security Agency has engaged in, may simply not be legal under EU law.

Deputy Assistant Secretary of Commerce for Services James Sullivan blurred the issues presented by Schrems II much like Cantwell did. The CJEU’s decision that focused on U.S. surveillance practices and the lack of meaningful recourse in the U.S. if an EU resident’s rights were violated was merged into a call for like-minded nations to unite against authoritarian nations. Sullivan distinguished between U.S. surveillance and the surveillance conducted by the People’s Republic of China (PRC), without naming the nation, and other regimes as if this should satisfy the EU as to the legality and propriety of U.S. treatment of EU personal data. Sullivan stated:

  • The Schrems II decision has created enormous uncertainties for U.S. companies and the transatlantic economy at a particularly precarious time. Immediately upon issuance of the ruling, the 5,400 Privacy Shield participants and their business partners in the EU could no longer rely on the Framework as a lawful basis for transferring personal data from Europe to the United States. Because neither the Court nor European data protection authorities provided for any enforcement grace period, Privacy Shield companies were left with three choices: (1) risk facing potentially huge fines (of up to 4 percent of total global turnover in the preceding year) for violating GDPR, (2) withdraw from the European market, or (3) switch right away to another more expensive data transfer mechanism.
  • Unfortunately, because of the Court’s ruling in the Privacy Shield context that U.S. laws relating to government access to data do not confer adequate protections for EU personal data, the use of other mechanisms like SCCs and BCRs to transfer EU personal data to the United States is now in question as well.
  • The objective of any potential agreement between the United States and the European Commission to address Schrems II is to restore the continuity of transatlantic data flows and the Framework’s privacy protections by negotiating targeted enhancements to Privacy Shield that address the Court’s concerns in Schrems II. Any such enhancements must respect the U.S. Government’s security responsibilities to our citizens and allies.
  • To be clear, we expect that any enhancements to the Privacy Shield Framework would also cover transfers under all other EU-approved data transfer mechanisms like SCCs and BCRs as well.
  • The Schrems II decision has underscored the need for a broader discussion among likeminded democracies on the issue of government access to data. Especially as a result of the extensive U.S. surveillance reforms since 2015, the United States affords privacy protections relating to national security data access that are equivalent to or greater than those provided by many other democracies in Europe and elsewhere.
  • To minimize future disruptions to data transfers, we have engaged with the European Union and other democratic nations in a multilateral discussion to develop principles based on common practices for addressing how best to reconcile law enforcement and national security needs for data with protection of individual rights.
  • It is our view that democracies should come together to articulate shared principles regarding government access to personal data—to help make clear the distinction between democratic societies that respect civil liberties and the rule of law and authoritarian governments that engage in the unbridled collection of personal data to surveil, manipulate, and control their citizens and other individuals without regard to personal privacy and human rights. Such principles would allow us to work with like-minded partners in preserving and promoting a free and open Internet enabled by the seamless flow of data.

Federal Trade Commission (FTC) Commissioner Noah Joshua Phillips stressed he was speaking in a personal capacity and not for the FTC. He extolled the virtues of the “free and open” internet model in the U.S., with the double implication that it is superior both to the models of nations like the PRC and Russia and to the EU model. Phillips seemed to be advocating for talking the EU into accepting that the U.S.’s privacy regime and civil liberties protections are stronger than any other nation’s. He also made the case, like other witnesses, that U.S. data privacy and protection regulation is more similar to the EU’s than to that of the PRC, Russia, and others. Phillips also sought to blur the issues and recast Privacy Shield in the context of the global struggle between democracies and authoritarian regimes. Phillips asserted:

  • First, we need to find a path forward after Schrems II, to permit transfers between the U.S. and EU. I want to recognize the efforts of U.S. and EU negotiators to find a replacement for Privacy Shield. While no doubt challenging, I have confidence in the good faith and commitment of public servants like Jim Sullivan, with whom I have the honor of appearing today, and our partners across the Atlantic. I have every hope and expectation that protecting cross-border data flows will be a priority for the incoming Administration, and I ask for your help in ensuring it is.
  • Second, we must actively engage with nations evaluating their approach to digital governance, something we at the FTC have done, to share and promote the benefits of a free and open Internet. There is an active conversation ongoing internationally, and at every opportunity—whether in public forums or via private assistance—we must ensure our voice and view is heard.
  • Third, we should be vocal in our defense of American values and policies. While we as Americans always look to improve our laws—and I commend the members of this committee on their important work on privacy legislation and other critical matters—we do not need to apologize to the world. When it comes to civil liberties or the enforcement of privacy laws, we are second to none. Indeed, in my view, the overall U.S. privacy framework—especially with the additional protections built into Privacy Shield—should certainly qualify as adequate under EU standards.
  • Fourth, as European leaders call to strengthen ties with the U.S., we should prioritize making our regimes compatible for the free flow of data. This extends to the data governance regimes of like-minded countries outside of Europe as well. Different nations will have different rules, but relatively minor differences need not impede mutually-beneficial commerce. We need not and should not purport to aim for a single, identical system of data governance. And we should remind our allies, and remind ourselves, that far more unites liberal democracies than divides us.
  • Fifth and finally, if we must draw lines, those lines should be drawn between allies with shared values—the U.S., Europe, Japan, Australia, and others—and those, like China and Russia, that offer a starkly different vision. I am certainly encouraged when I hear recognition of this distinction from Europe. European Data Protection Supervisor Wojciech Wiewiórowski recently noted that the U.S. is much closer to Europe than is China and that he has a preference for data being processed by countries that share values with Europe. Some here in the U.S. are even proposing agreements to solidify the relationships among technologically advanced democracies, an idea worth exploring in more detail

Washington University Professor of Law Neil Richards stressed that the Schrems II decision spells out how the U.S. would achieve adequacy: reforming surveillance and providing meaningful redress for alleged privacy violations. Consequently, FISA would need to be rewritten and narrowed and a means for EU residents to seek relief beyond the current Ombudsman system is needed, possibly a statutory right to sue. Moreover, he asserted strong data protection and privacy laws are needed and some of the bills introduced in this Congress could fit the bill. Richards asserted:

In sum, the Schrems litigation is a creature of distrust, and while it has created problems for American law and commerce, it has also created a great opportunity. That opportunity lies before this Committee –the chance to regain American leadership in global privacy and data protection by passing a comprehensive law that provides appropriate safeguards, enforceable rights, and effective legal remedies for consumers. I believe that the way forward can not only safeguard the ability to share personal data across the Atlantic, but it can do so in a way that builds trust between the United States and our European trading partners and between American companies and their American and European customers. I believe that there is a way forward, but it requires us to recognize that strong, clear, trust-building rules are not hostile to business interest, that we need to push past the failed system of “notice and choice,” that we need to preserve effective consumer remedies and state-level regulatory innovation, and seriously consider a duty of loyalty. In that direction, I believe, lies not just consumer protection, but international cooperation and economic prosperity.

Georgia Tech University Professor Peter Swire explained that the current circumstances make the next Congress the best possibility in memory to enact privacy legislation because of the need for a Privacy Shield replacement, passage of the new California Privacy Rights Act (Proposition 24), and the Biden Administration’s likely support for such legislation. Swire made the following points:

  1. The European Data Protection Board in November issued draft guidance with an extremely strict interpretation of how to implement the Schrems II case.
  2. The decision in Schrems II is based on EU constitutional law. There are varying current interpretations in Europe of what is required by Schrems II, but constitutional requirements may restrict the range of options available to EU and U.S. policymakers.
  3. Strict EU rules about data transfers, such as the draft EDPB guidance, would appear to result in strict data localization, creating numerous major issues for EU- and U.S.-based businesses, as well as affecting many online activities of EU individuals.
  4. Along with concerns about lack of individual redress, the CJEU found that the EU Commission had not established that U.S. surveillance was “proportionate” in its scope and operation. Appendix 2 to this testimony seeks to contribute to an informed judgment on proportionality, by cataloguing developments in U.S. surveillance safeguards since the Commission’s issuance of its Privacy Shield decision in 2016.
  5. Negotiating an EU/U.S. adequacy agreement is important in the short term.
  6. A short-run agreement would assist in creating a better overall long-run agreement or agreements.
  7. As the U.S. considers its own possible legal reforms in the aftermath of Schrems II, it is prudent and a normal part of negotiations to seek to understand where the other party – the EU – may have flexibility to reform its own laws.
  8. Issues related to Schrems II have largely been bipartisan in the U.S., with substantial continuity across the Obama and Trump administrations, and expected as well for a Biden administration.
  9. Passing comprehensive privacy legislation would help considerably in EU/U.S. negotiations.
  10. This Congress may have a unique opportunity to enact comprehensive commercial privacy legislation for the United States.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Dooffy Design from Pixabay

Further Reading, Other Developments, and Coming Events (10 December)

Further Reading

  • “Social media superspreaders: Why Instagram, not Facebook, will be the real battleground for COVID-19 vaccine misinformation” By Isobel Asher Hamilton — Business Insider. According to one group, COVID-19 anti-vaccination lies and misinformation are proliferating on Instagram despite the efforts of its parent company, Facebook, to find and remove such content. There has been dramatic growth in such content on Instagram, and Facebook seems to be applying its COVID-19 standards more loosely there. In fact, some people kicked off of Facebook for violating that platform’s standards on COVID-19 are still on Instagram spreading the same lies, misinformation, and disinformation. For example, British anti-vaccination figure David Icke was removed from Facebook for making claims that COVID-19 was caused by or related to 5G, but he has a significant following on Instagram.
  • “‘Grey area’: China’s trolling drives home reality of social media war” By Chris Zappone — The Sydney Morning Herald. The same concept that is fueling aggressive cyber activity at a level below outright war has spread to diplomacy. The People’s Republic of China (PRC) has been waging “gray” social media campaigns against a number of Western nations, including Australia, mainly by propagating lies and misinformation. The most recent example is the spreading of a fake photo of an Australian soldier appearing to kill an Afghan child. This false material seems designed to distract from the real issues between the two nations arising from clashing policies on trade and human rights. The PRC’s activities do not appear to violate Australia’s foreign interference laws and seem to have left Canberra at a loss as to how to respond effectively.
  • “Facebook to start policing anti-Black hate speech more aggressively than anti-White comments, documents show” By Elizabeth Dwoskin, Nitasha Tiku and Heather Kelly — The Washington Post. Facebook will apparently seek to revamp its algorithms to target the types of hate speech that have traditionally targeted women and minority groups. Up until now all attacks were treated equally so that something like “white people suck” would be treated the same way as anti-Semitic content. Facebook has resisted changes for years even though experts and civil rights groups made the case that people of color, women, and LGBTI people endure far more abuse online. There is probably no connection between Facebook’s more aggressive content moderation policies and the advent of a new administration in Washington more receptive to claims that social media platforms allow the abuse of these people.
  • “How Joe Biden’s Digital Team Tamed the MAGA Internet” By Kevin Roose — The New York Times. Take this piece with a block of salt. The “why they won” articles are almost always rife with fallacies, including the rationale that if a candidate won, his or her strategy must have worked. It is not clear that the Biden Campaign’s online messaging strategy of being nice and emphasizing positive values actually beat the Trump Campaign’s “Death Star” so much as the President’s mishandling of the pandemic response and cratering of the economy did him in.
  • “Coronavirus Apps Show Promise but Prove a Tough Sell” By Jennifer Valentino-DeVries — The New York Times. It appears the intersection of concerns about private and public sector surveillance from two very different groups has worked to keep down rates of adopting smartphone COVID tracking apps in the United States. There are people wary of private sector practices of hoovering up as much data as possible, and others concerned about the government’s surveillance activities. Consequently, many are shunning Google and Apple’s COVID contact tracing apps to the surprise of government, industry, and academia. A pair of studies show resistance to downloading or using such apps even if there are very strong privacy safeguards. This result may well be a foreseeable outcome of U.S. policies that have allowed companies and the security services to collect and use vast quantities of personal information.
  • “UAE target of cyber attacks after Israel deal, official says” — Reuters. A top cybersecurity official in the United Arab Emirates claimed his nation’s financial services industries were targeted for cyber attack and implied Iran and affiliated hackers were responsible.

Other Developments

  • President-elect Joe Biden announced his intention to nominate California Attorney General Xavier Becerra to serve as the next Secretary of Health and Human Services (HHS). If Becerra is confirmed by the Senate, California Governor Gavin Newsom would name his successor, who would need to continue enforcement of the “California Consumer Privacy Act” (CCPA) (AB 375) while also working towards the transition to the “California Privacy Rights Act” (Proposition 24) approved by California voters last month. The new statute establishes the California Privacy Protection Agency that will assume the Attorney General’s responsibilities regarding the enforcement of California’s privacy laws. However, Becerra’s successor may play a pivotal role in the transition between the two regulators and the creation of the new regulations needed to implement Proposition 24.
  • The Senate approved the nomination of Nathan Simington to be a Commissioner of the Federal Communications Commission (FCC) by a 49-46 vote. Once FCC Chair Ajit Pai steps down, the agency will be left with two Democratic and two Republican Commissioners, pending the Biden Administration’s nominee to fill Pai’s spot. If the Senate stays Republican, it is possible the calculation could be made that a deadlocked FCC is better than a Democratic agency that could revive net neutrality rules among other Democratic and progressive policies. Consequently, Simington’s confirmation may be the first step toward an FCC unable to develop substantive policy.
  • Another federal court has broadened the injunction against the Trump Administration’s ban on TikTok to encompass the entirety of the Department of Commerce’s September order meant to stop the usage of the application in the United States (U.S.). It is unclear whether the Trump Administration will appeal, and if it should, whether a court would decide the case before the Biden Administration begins in mid-January. The United States District Court for the District of Columbia found that TikTok “established that the government likely exceeded IEEPA’s express limitations as part of an agency action that was arbitrary and capricious” and would likely suffer irreparable harm, making an injunction an appropriate remedy.
  • The United States’ National Security Agency (NSA) “released a Cybersecurity Advisory on Russian state-sponsored actors exploiting CVE-2020-4006, a command-injection vulnerability in VMware Workspace One Access, Access Connector, Identity Manager, and Identity Manager Connector” and provided “mitigation and detection guidance.”
  • The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint alert, warning that U.S. think tanks are being targeted by “persistent continued cyber intrusions by advanced persistent threat (APT) actors.” The agencies stated “[t]his malicious activity is often, but not exclusively, directed at individuals and organizations that focus on international affairs or national security policy.” CISA and the FBI stated their “guidance may assist U.S. think tanks in developing network defense procedures to prevent or rapidly detect these attacks.” The agencies added:
    • APT actors have relied on multiple avenues for initial access. These have included low-effort capabilities such as spearphishing emails and third-party message services directed at both corporate and personal accounts, as well as exploiting vulnerable web-facing devices and remote connection capabilities. Increased telework during the COVID-19 pandemic has expanded workforce reliance on remote connectivity, affording malicious actors more opportunities to exploit those connections and to blend in with increased traffic. Attackers may leverage virtual private networks (VPNs) and other remote work tools to gain initial access or persistence on a victim’s network. When successful, these low-effort, high-reward approaches allow threat actors to steal sensitive information, acquire user credentials, and gain persistent access to victim networks.
    • Given the importance that think tanks can have in shaping U.S. policy, CISA and FBI urge individuals and organizations in the international affairs and national security sectors to immediately adopt a heightened state of awareness and implement the critical steps listed in the Mitigations section of this Advisory.
  • A group of Democratic United States Senators have written the CEO of Alphabet and Google about its advertising policies and how its platforms may have been used to spread misinformation and contribute to voter suppression. Thus far, most of the scrutiny about the 2020 election and content moderation policy has fallen on Facebook and Twitter even though Google-owned YouTube has been flagged as containing the same amount of misinformation. Senators Amy Klobuchar (D-MN) and Mark Warner (D-VA) led the effort and expressed “serious concerns regarding recent reports that Google is profiting from the sale of ads spreading election-related disinformation” to Alphabet and Google CEO Sundar Pichai. Klobuchar, Warner, and their colleagues asserted:
    • Google is also helping organizations spreading election-related disinformation to raise revenue by placing ads on their websites. While Google has some policies in place to prevent the spread of election misinformation, they are not properly enforced and are inadequate. We urge you to immediately strengthen and improve enforcement of your policies on election-related disinformation and voter suppression, reject all ads spreading election-related disinformation, and stop providing advertising services on sites that spread election-related disinformation.
    • …a recent study by the Global Disinformation Index (GDI) found that Google serves ads on 145 out of 200 websites GDI examined that publish disinformation.
    • Similarly, a recent report from the Center for Countering Digital Hate (CCDH) found that Google has been placing ads on websites publishing disinformation designed to undermine elections. In examining just six websites publishing election-related disinformation, CCDH estimates that they receive 40 million visits a month, generating revenue for these sites of up to $3.4 million annually from displaying Google ads. In addition, Google receives $1.6 million from the advertisers’ payments annually.  These sites published stories ahead of the 2020 general election that contained disinformation alleging that voting by mail was not secure, that mail-in voting was being introduced to “steal the election,” and that election officials were “discarding mail ballots.” 
  • A bipartisan group of United States Senators on one committee are urging Congressional leadership to include funding to help telecommunications companies remove and replace Huawei and ZTE equipment and to aid the Federal Communications Commission (FCC) in drafting accurate maps of broadband service in the United States (U.S.). Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS) and a number of his colleagues wrote the leadership of both the Senate and House and argued:
    • we urge you to provide full funding for Public Law 116-124, the Secure and Trusted Communications Networks Act, and Public Law 116-130, the Broadband DATA Act.   
    • Closing the digital divide and winning the race to 5G are critical to America’s economic prosperity and global leadership in technology. However, our ability to connect all Americans and provide access to next-generation technology will depend in large part on the security of our communications infrastructure. The Secure and Trusted Communications Networks Act (“rip and replace”) created a program to help small, rural telecommunications operators remove equipment posing a security threat to domestic networks and replace it with equipment from trusted providers. This is a national security imperative. Fully funding this program is essential to protecting the integrity of our communications infrastructure and the future viability of our digital economy at large.
    • In addition to safeguarding the security of the nation’s communications systems, developing accurate broadband maps is also critically important. The United States faces a persistent digital divide, and closing this divide requires accurate maps that show where broadband is available and where it is not. Current maps overstate broadband availability, which prevents many underserved communities, particularly in rural areas, from receiving the funds needed to build or expand broadband networks to millions of unconnected Americans. Fully funding the Broadband DATA Act will ensure more accurate broadband maps and better stewardship over the millions of dollars the federal government awards each year to support broadband deployment. Without these maps, the government risks overbuilding existing networks, duplicating funding already provided, and leaving communities unserved.  
  • The Government Accountability Office (GAO) released an assessment of 5G policy options that “discusses (1) how the performance goals and expected uses are to be realized in U.S. 5G wireless networks; (2) the challenges that could affect the performance or usage of 5G wireless networks in the U.S.; and (3) policy options to address these challenges.” The report had been requested by the chairs and ranking members of the House Armed Services, Senate Armed Services, Senate Intelligence, and House Intelligence Committees along with other Members. The GAO stated “[w]hile 5G is expected to deliver significantly improved network performance and greater capabilities, challenges may hinder the performance or usage of 5G technologies in the U.S. We grouped the challenges into the following four categories:
    • availability and efficient use of spectrum
    • security of 5G networks
    • concerns over data privacy
    • concerns over possible health effects
    • The GAO presented the following policy options along with opportunities and considerations for each:
      • Spectrum-Sharing Technologies Opportunities:
        • Could allow for more efficient use of the limited spectrum available for 5G and future generations of wireless networks.
        • It may be possible to leverage existing 5G testbeds for testing the spectrum sharing technologies developed through applied research.
      • Spectrum-Sharing Technologies Considerations:
        • Research and development is costly, must be coordinated and administered, and its potential benefits are uncertain. Identifying a funding source, setting up the funding mechanism, or determining which existing funding streams to reallocate will require detailed analysis.
      • Coordinated Cybersecurity Monitoring Opportunities:
        • A coordinated monitoring program would help ensure the entire wireless ecosystem stays knowledgeable about evolving threats, in close to real time; identify cybersecurity risks; and allow stakeholders to act rapidly in response to emerging threats or actual network attacks.
      • Coordinated Cybersecurity Monitoring Considerations:
        • Carriers may not be comfortable reporting incidents or vulnerabilities, and determinations would need to be made about what information is disclosed and how the information will be used and reported.
      • Cybersecurity Requirements Opportunities
        • Taking these steps could produce a more secure network. Without a baseline set of security requirements the implementation of network security practices is likely to be piecemeal and inconsistent.
        • Using existing protocols or best practices may decrease the time and cost of developing and implementing requirements.
      • Cybersecurity Requirements Considerations
        • Adopting network security requirements would be challenging, in part because defining and implementing the requirements would have to be done on an application-specific basis rather than as a one-size-fits-all approach.
        • Designing a system to certify network components would be costly and would require a centralized entity, be it industry-led or government-led.
      • Privacy Practices Opportunities
        • Development and adoption of uniform privacy practices would benefit from existing privacy practices that have been implemented by states, other countries, or that have been developed by federal agencies or other organizations.
      • Privacy Practices Considerations
        • Privacy practices come with costs, and policymakers would need to balance the need for privacy with the direct and indirect costs of implementing privacy requirements. Imposing requirements can be burdensome, especially for smaller entities.
      • High-band Research Opportunities
        • Could result in improved statistical modeling of antenna characteristics and more accurate representation of propagation characteristics.
        • Could result in improved understanding of any possible health effects from long-term radio frequency exposure to high-band emissions.
      • High-band Research Considerations
        • Research and development is costly and must be coordinated and administered, and its potential benefits are uncertain. Policymakers will need to identify a funding source or determine which existing funding streams to reallocate.

Coming Events

  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up on 10 December.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Tima Miroshnichenko from Pexels

Further Reading, Other Developments, and Coming Events (8 December)

Further Reading

  • “Facebook failed to put fact-check labels on 60% of the most viral posts containing Georgia election misinformation that its own fact-checkers had debunked, a new report says” By Tyler Sonnemaker — Business Insider. Despite its vows to improve its handling of untrue and false content, the platform is not consistently taking down such material related to the runoffs for the Georgia Senate seats. The group behind this finding argues it is because Facebook does not want to. What is left unsaid is that engagement drives revenue, and so Facebook’s incentive is not to police all violations but rather to take down just enough to be able to say it is doing something.
  • “Federal Labor Agency Says Google Wrongly Fired 2 Employees” By Kate Conger and Noam Scheiber — The New York Times. The National Labor Relations Board (NLRB) has reportedly sided with two employees Google fired for activities that are traditionally considered labor organizing. The two engineers had been dismissed for allegedly violating the company’s data security practices when they researched the company’s retention of a union-busting firm and sought to alert others about organizing. Even though Google is vowing to fight the action, which has not been finalized, it may well settle given the view of Big Tech in Washington these days. This action could also foretell how a Biden Administration NLRB may look at the labor practices of these companies.
  • “U.S. states plan to sue Facebook next week: sources” By Diane Bartz — Reuters. We could see state and federal antitrust suits against Facebook this week. One investigation, led by New York Attorney General Tish James, could include 40 states, although the grounds for alleged violations have not been leaked at this point. It may be Facebook’s acquisition of potential rivals Instagram and WhatsApp that has allowed it to dominate the social messaging market. The Federal Trade Commission (FTC) may also file suit, and, again, the grounds are unknown. The European Commission (EC) is also investigating Facebook for possible violations of European Union (EU) antitrust law over the company’s use of the personal data it holds and uses and over its operation of its online marketplace.
  • “The Children of Pornhub” By Nicholas Kristof — The New York Times. This column comprehensively traces the reprehensible recent history of Mindgeek, a Canadian conglomerate that owns Pornhub, where one can find reams of child and non-consensual pornography. Why Ottawa has not cracked down on this firm is a mystery. The passage and implementation of the “Allow States and Victims to Fight Online Sex Trafficking Act of 2017” (P.L. 115-164) that narrowed the liability shield under 47 USC 230 has forced the company to remove content, a significant change from its indifference before the statutory change in law. Kristof suggests some easy, common sense changes Mindgeek could implement to combat the presence of this illegal material, but it seems like the company will do enough to say it is acting without seriously reforming its platform. Why would it? There is too much money to be made. Additionally, those fighting against this sort of material have been pressuring payment platforms to stop doing business with Mindgeek. PayPal has forsworn any interaction, and due to pressure, Visa and Mastercard are “reviewing” their relationship with Mindgeek and Pornhub. In a statement to a different news outlet, Pornhub claimed it is “unequivocally committed to combating child sexual abuse material (CSAM), and has instituted a comprehensive, industry-leading trust and safety policy to identify and eradicate illegal material from our community.” The company further claimed “[a]ny assertion that we allow CSAM is irresponsible and flagrantly untrue….[w]e have zero tolerance for CSAM.”
  • “Amazon and Apple Are Powering a Shift Away From Intel’s Chips” By Don Clark — The New York Times. Two tech giants have chosen new, faster, cheaper chips, signaling a possible industry shift away from Intel, the firm that has been a significant player for decades. Intel will not go quietly, of course, and a key variable is whether must-have software and applications are rewritten to accommodate the new chips based on designs from a British firm, Arm.

Other Developments

  • The Government Accountability Office (GAO) and the National Academy of Medicine (NAM) have released a joint report on artificial intelligence in healthcare, consisting of GAO’s Technology Assessment: Artificial Intelligence in Health Care: Benefits and Challenges of Technologies to Augment Patient Care and NAM’s Special Publication: Advancing Artificial Intelligence in Health Settings Outside the Hospital and Clinic. GAO’s report “discusses three topics: (1) current and emerging AI tools available for augmenting patient care and their potential benefits, (2) challenges to the development and adoption of these tools, and (3) policy options to maximize benefits and mitigate challenges to the use of AI tools to augment patient care.” NAM’s “paper aims to provide an analysis of: 1) current technologies and future applications of AI in HSOHC, 2) the logistical steps and challenges involved in integrating AI-HSOHC applications into existing provider workflows, and 3) the ethical and legal considerations of such AI tools, followed by a brief proposal of potential key initiatives to guide the development and adoption of AI in health settings outside the hospital and clinic (HSOHC).”
    • The GAO “identified five categories of clinical applications where AI tools have shown promise to augment patient care: predicting health trajectories, recommending treatments, guiding surgical care, monitoring patients, and supporting population health management.” The GAO “also identified three categories of administrative applications where AI tools have shown promise to reduce provider burden and increase the efficiency of patient care: recording digital clinical notes, optimizing operational processes, and automating laborious tasks.” The GAO stated:
      • This technology assessment also identifies challenges that hinder the adoption and impact of AI tools to augment patient care, according to stakeholders, experts, and the literature. Difficulties accessing sufficient high-quality data may hamper innovation in this space. Further, some available data may be biased, which can reduce the effectiveness and accuracy of the tools for some people. Addressing bias can be difficult because the electronic health data do not currently represent the general population. It can also be challenging to scale tools up to multiple locations and integrate them into new settings because of differences in institutions and the patient populations they serve. The limited transparency of AI tools used in health care can make it difficult for providers, regulators, and others to determine whether an AI tool is safe and effective. A greater dispersion of data across providers and institutions can make securing patient data difficult. Finally, one expert described how existing case law does not specifically address AI tools, which can make providers and patients reticent to adopt them. Some of these challenges are similar to those identified previously by GAO in its first publication in this series, such as the lack of high-quality, structured data, and others are more specific to patient care, such as liability concerns.
    • The GAO “described six policy options:”
      • Collaboration. Policymakers could encourage interdisciplinary collaboration between developers and health care providers. This could result in AI tools that are easier to implement and use within an existing workflow.
      • Data Access. Policymakers could develop or expand high-quality data access mechanisms. This could help developers address bias concerns by ensuring data are representative, transparent, and equitable.
      • Best Practices. Policymakers could encourage relevant stakeholders and experts to establish best practices (such as standards) for development, implementation, and use of AI technologies. This could help with deployment and scalability of AI tools by providing guidance on data, interoperability, bias, and formatting issues.
      • Interdisciplinary Education. Policymakers could create opportunities for more workers to develop interdisciplinary skills. This could allow providers to use AI tools more effectively, and could be accomplished through a variety of methods, including changing medical curricula or grants.
      • Oversight Clarity. Policymakers could collaborate with relevant stakeholders to clarify appropriate oversight mechanisms. Predictable oversight could help ensure that AI tools remain safe and effective after deployment and throughout their lifecycle.
      • Status Quo. Policymakers could allow current efforts to proceed without intervention.
    • NAM claimed:
      • Numerous AI-powered health applications designed for personal use have been shown to improve patient outcomes, building predictions based on large volumes of granular, real-time, and individualized behavioral and medical data. For instance, some forms of telehealth, a technology that has been critical during the COVID-19 pandemic, benefit considerably from AI software focused on natural language processing, which enables efficient triaging of patients based on urgency and type of illness. Beyond patient-provider communication, AI algorithms relevant to diabetic and cardiac care have demonstrated remarkable efficacy in helping patients manage their blood glucose levels in their day-to-day lives and in detecting cases of atrial fibrillation. AI tools that monitor and synthesize longitudinal patient behaviors are also particularly useful in psychiatric care, where the exact timing of interventions is often critical. For example, smartphone-embedded sensors that track location and proximity of individuals can alert clinicians of possible substance use, prompting immediate intervention. On the population health level, these individual indicators of activity and health can be combined with environmental- and system-level data to generate predictive insight into local and global health trends. The most salient example of this may be the earliest warnings of the COVID-19 outbreak, issued in December 2019 by two private AI technology firms.
      • Successful implementation and widespread adoption of AI applications in HSOHC requires careful consideration of several key issues related to personal data, algorithm development, and health care insurance and payment. Chief among them are data interoperability, standardization, privacy, ameliorating systemic biases in algorithms, reimbursement of AI-assisted services, quality improvement, and integration of AI tools into provider workflows. Overcoming these challenges and optimizing the impact of AI tools on clinical outcomes will involve engaging diverse stakeholders, deliberately designing AI tools and interfaces, rigorously evaluating clinical and economic utility, and diffusing and scaling algorithms across different health settings. In addition to these potential logistical and technical hurdles, it is imperative to consider the legal and ethical issues surrounding AI, particularly as it relates to the fair and humanistic deployment of AI applications in HSOHC. Important legal considerations include the appropriate designation of accountability and liability of medical errors resulting from AI-assisted decisions for ensuring the safety of patients and consumers. Key ethical challenges include upholding the privacy of patients and their data—particularly with regard to non-HIPAA covered entities involved in the development of algorithms—building unbiased AI algorithms based on high-quality data from representative populations, and ensuring equitable access to AI technologies across diverse communities.
  • The National Institute of Standards and Technology (NIST) published a “new study of face recognition technology created after the onset of the COVID-19 pandemic [that] shows that some software developers have made demonstrable progress at recognizing masked faces.” In Ongoing Face Recognition Vendor Test (FRVT) Part 6B: Face Recognition Accuracy with Face Masks Using Post-COVID-19 Algorithms (NISTIR 8331), NIST stated the “report augments its predecessor with results for more recent algorithms provided to NIST after mid-March 2020.” NIST said that “[w]hile we do not have information on whether or not a particular algorithm was designed with face coverings in mind, the results show evidence that a number of developers have adapted their algorithms to support face recognition on subjects potentially wearing face masks.” NIST stated the following (a brief sketch of how the FNMR and FMR metrics cited below are computed at a fixed threshold appears after this list):
    • The following results represent observations on algorithms provided to NIST both before and after the COVID-19 pandemic to date. We do not have information on whether or not a particular algorithm was designed with face coverings in mind. The results documented capture a snapshot of algorithms submitted to the FRVT 1:1 in face recognition on subjects potentially wearing face masks.
      • False rejection performance: All algorithms submitted after the pandemic continue to give increased false non-match rates (FNMR) when the probes are masked. While a few pre-pandemic algorithms still remain within the most accurate on masked photos, some developers have submitted algorithms after the pandemic showing significantly improved accuracy and are now among the most accurate in our test.
      • Evolution of algorithms on face masks: We observe that a number of algorithms submitted since mid-March 2020 show notable reductions in error rates with face masks over their pre-pandemic predecessors. When comparing error rates for unmasked versus masked faces, the median FNMR across algorithms submitted since mid-March 2020 has been reduced by around 25% from the median pre-pandemic results. The figure below presents examples of developer evolution on both masked and unmasked datasets. For some developers, false rejection rates in their algorithms submitted since mid-March 2020 decreased by as much as a factor of 10 over their pre-pandemic algorithms, which is evidence that some providers are adapting their algorithms to handle facemasks. However, in the best cases, when comparing results for unmasked images to masked images, false rejection rates have increased from 0.3%-0.5% (unmasked) to 2.4%-5% (masked).
      • False acceptance performance: As most systems are configured with a fixed threshold, it is necessary to report both false negative and false positive rates for each group at that threshold. When comparing a masked probe to an unmasked enrollment photo, in most cases, false match rates (FMR) are reduced by masks. The effect is generally modest with reductions in FMR usually being smaller than a factor of two. This property is valuable in that masked probes do not impart adverse false match security consequences for verification.
      • Mask-agnostic face recognition: All 1:1 verification algorithms submitted to the FRVT test since the start of the pandemic are evaluated on both masked and unmasked datasets. The test is designed this way to mimic operational reality: some images will have masks, some will not (especially enrollment samples from a database or ID card). And to the extent that the use of protective masks will exist for some time, our test will continue to evaluate algorithmic capability on verifying all combinations of masked and unmasked faces.
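    • To make these figures concrete, the following is a minimal, illustrative sketch (in Swift) of how a 1:1 verification system computes FNMR and FMR at a fixed decision threshold. The scores and threshold below are invented for demonstration and are not drawn from the NIST FRVT data.

```swift
import Foundation

// Illustrative comparison scores; higher means the two face images look more alike.
// Genuine pairs are images of the same person; impostor pairs are different people.
let genuineScores: [Double] = [0.92, 0.88, 0.47, 0.95, 0.63, 0.90, 0.85, 0.34]
let impostorScores: [Double] = [0.12, 0.35, 0.08, 0.51, 0.22, 0.18, 0.29, 0.05]

// Deployed systems typically operate at a fixed threshold, as the report notes.
let threshold = 0.5

// FNMR: fraction of genuine comparisons falling below the threshold (false rejections).
let fnmr = Double(genuineScores.filter { $0 < threshold }.count) / Double(genuineScores.count)

// FMR: fraction of impostor comparisons at or above the threshold (false acceptances).
let fmr = Double(impostorScores.filter { $0 >= threshold }.count) / Double(impostorScores.count)

print(String(format: "FNMR at threshold %.2f: %.3f", threshold, fnmr))
print(String(format: "FMR at threshold %.2f: %.3f", threshold, fmr))
```

      Rerunning the same computation on masked probe images at the same fixed threshold is what yields the higher FNMR and somewhat lower FMR values NIST reports.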
  • The government in London has issued a progress report on its current cybersecurity strategy that has another year to run. The Paymaster General assessed how well the United Kingdom (UK) has implemented the National Cyber Security Strategy 2016 to 2021 and pointed to goals yet to be achieved. This assessment comes in the shadow of the pending exit of the UK from the European Union (EU) and Prime Minister Boris Johnson’s plans to increase the UK’s role in select defense issues, including cyber operations. The Paymaster General stated:
    • The global landscape has changed significantly since the publication of the National Cyber Security Strategy Progress Report in May 2019. We have seen unprecedented levels of disruption to our way of life that few would have predicted. The COVID-19 pandemic has increased our reliance on digital technologies – for our personal communications with friends and family and our ability to work remotely, as well as for businesses and government to continue to operate effectively, including in support of the national response.
    • These new ways of living and working highlight the importance of cyber security, which is also underlined by wider trends. An ever greater reliance on digital networks and systems, more rapid advances in new technologies, a wider range of threats, and increasing international competition on underlying technologies and standards in cyberspace, emphasise the need for good cyber security practices for individuals, businesses and government.
    • Although the scale and international nature of these changes present challenges, there are also opportunities. With the UK’s departure from the European Union in January 2020, we can define and strengthen Britain’s place in the world as a global leader in cyber security, as an independent, sovereign nation.
    • The sustained, strategic investment and whole of society approach delivered so far through the National Cyber Security Strategy has ensured we are well placed to respond to this changing environment and seize new opportunities.
    • The Paymaster General asserted:
      • [The] report has highlighted growing risks, some accelerated by the COVID-19 pandemic, and longer-term trends that will shape the environment over the next decade:
      • Ever greater reliance on digital networks and systems as daily life moves online, bringing huge benefits but also creating new systemic and individual risks.
      • Rapid technological change and greater global competition, challenging our ability to shape the technologies that will underpin our future security and prosperity.
      • A wider range of adversaries as criminals gain easier access to commoditised attack capabilities and cyber techniques form a growing part of states’ toolsets.
      • Competing visions for the future of the internet and the risk of fragmentation, making consensus on norms and ethics in cyberspace harder to achieve.
      • In February 2020 the Prime Minister announced the Integrated Review of Security, Defence, Development and Foreign Policy. This will define the government’s ambition for the UK’s role in the world and the long-term strategic aims of our national security and foreign policy. It will set out the way in which the UK will be a problem-solving and burden-sharing nation, and a strong direction for recovery from COVID-19, at home and overseas.
      • This will help to shape our national approach and priorities on cyber security beyond 2021. Cyber security is a key element of our international, defence and security posture, as well as a driving force for our economic prosperity.
  • The University of Toronto’s Citizen Lab published a report on an Israeli surveillance firm that uses “[o]ne of the widest-used—but least appreciated” means of surveilling people (i.e., “leveraging of weaknesses in the global mobile telecommunications infrastructure to monitor and intercept phone calls and traffic.”). Citizen Lab explained that an affiliate of the NSO Group, “Circles is known for selling systems to exploit Signaling System 7 (SS7) vulnerabilities, and claims to sell this technology exclusively to nation-states.” Citizen Lab noted that “[u]nlike NSO Group’s Pegasus spyware, the SS7 mechanism by which Circles’ product reportedly operates does not have an obvious signature on a target’s phone, such as the telltale targeting SMS bearing a malicious link that is sometimes present on a phone targeted with Pegasus.” Citizen Lab found the following (a hypothetical sketch of the hostname-signature matching Citizen Lab describes appears after this list):
    • Circles is a surveillance firm that reportedly exploits weaknesses in the global mobile phone system to snoop on calls, texts, and the location of phones around the globe. Circles is affiliated with NSO Group, which develops the oft-abused Pegasus spyware.
    • Circles, whose products work without hacking the phone itself, says they sell only to nation-states. According to leaked documents, Circles customers can purchase a system that they connect to their local telecommunications companies’ infrastructure, or can use a separate system called the “Circles Cloud,” which interconnects with telecommunications companies around the world.
    • According to the U.S. Department of Homeland Security, all U.S. wireless networks are vulnerable to the types of weaknesses reportedly exploited by Circles. A majority of networks around the globe are similarly vulnerable.
    • Using Internet scanning, we found a unique signature associated with the hostnames of Check Point firewalls used in Circles deployments. This scanning enabled us to identify Circles deployments in at least 25 countries.
    • We determine that the governments of the following countries are likely Circles customers: Australia, Belgium, Botswana, Chile, Denmark, Ecuador, El Salvador, Estonia, Equatorial Guinea, Guatemala, Honduras, Indonesia, Israel, Kenya, Malaysia, Mexico, Morocco, Nigeria, Peru, Serbia, Thailand, the United Arab Emirates (UAE), Vietnam, Zambia, and Zimbabwe.
    • Some of the specific government branches we identify with varying degrees of confidence as being Circles customers have a history of leveraging digital technology for human rights abuses. In a few specific cases, we were able to attribute the deployment to a particular customer, such as the Internal Security Operations Command (ISOC) of the Royal Thai Army, which has allegedly tortured detainees.
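    • Citizen Lab has not published the exact Check Point firewall signature, but the method it describes (match a distinctive hostname pattern in Internet-wide scan data, then group the matching hosts by country) can be illustrated with a minimal sketch. The ScanRecord type, the hostnames, and the regular expression below are hypothetical stand-ins, not the actual signature.

```swift
import Foundation

// Hypothetical record from an Internet-wide scan: an IP address, the hostname the
// device reports, and the country where the IP is geolocated. All values are invented.
struct ScanRecord {
    let ip: String
    let hostname: String
    let country: String
}

let records: [ScanRecord] = [
    ScanRecord(ip: "198.51.100.10", hostname: "client-alpha.tracking.example", country: "Country A"),
    ScanRecord(ip: "198.51.100.23", hostname: "client-beta.tracking.example", country: "Country A"),
    ScanRecord(ip: "203.0.113.7", hostname: "mail.unrelated.example", country: "Country B"),
]

// Placeholder pattern standing in for the unpublished hostname signature.
let signature = try! NSRegularExpression(pattern: #"^client-.*\.tracking\.example$"#)

// Keep only hosts whose hostname matches the signature.
let matches = records.filter { record in
    let range = NSRange(record.hostname.startIndex..., in: record.hostname)
    return signature.firstMatch(in: record.hostname, options: [], range: range) != nil
}

// Group the candidate deployments by country, mirroring the per-country findings.
let byCountry = Dictionary(grouping: matches, by: \.country)
for (country, hosts) in byCountry.sorted(by: { $0.key < $1.key }) {
    print("\(country): \(hosts.count) candidate deployment(s)")
}
```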
  • Senators Ron Wyden (D-OR), Elizabeth Warren (D-MA), Edward J. Markey (D-MA), and Brian Schatz (D-HI) “announced that the Department of Homeland Security (DHS) will launch an inspector general investigation into Customs and Border Protection’s (CBP) warrantless tracking of phones in the United States following an inquiry from the senators earlier this year” per their press release.
    • The Senators added:
      • As revealed by public contracts, CBP has paid a government contractor named Venntel nearly half a million dollars for access to a commercial database containing location data mined from applications on millions of Americans’ mobile phones. CBP officials also confirmed the agency’s warrantless tracking of phones in the United States using Venntel’s product in a September 16, 2020 call with Senate staff.
      • In 2018, the Supreme Court held in Carpenter v. United States that the collection of significant quantities of historical location data from Americans’ cell phones is a search under the Fourth Amendment and therefore requires a warrant.
      • In September 2020, Wyden and Warren successfully pressed for an inspector general investigation into the Internal Revenue Service’s use of Venntel’s commercial location tracking service without a court order.
    • In a letter, the DHS Office of the Inspector General (OIG) explained:
      • We have reviewed your request and plan to initiate an audit that we believe will address your concerns. The objective of our audit is to determine if the Department of Homeland Security (DHS) and it [sic] components have developed, updated, and adhered to policies related to cell-phone surveillance devices. In addition, you may be interested in our audit to review DHS’ use and protection of open source intelligence. Open source intelligence, while different from cell phone surveillance, includes the Department’s use of information provided by the public via cellular devices, such as social media status updates, geo-tagged photos, and specific location check-ins.
    • In an October letter, these Senators plus Senator Sherrod Brown (D-OH) argued:
      • CBP is not above the law and it should not be able to buy its way around the Fourth Amendment. Accordingly, we urge you to investigate CBP’s warrantless use of commercial databases containing Americans’ information, including but not limited to Venntel’s location database. We urge you to examine what legal analysis, if any, CBP’s lawyers performed before the agency started to use this surveillance tool. We also request that you determine how CBP was able to begin operational use of Venntel’s location database without the Department of Homeland Security Privacy Office first publishing a Privacy Impact Assessment.
  • The American Civil Liberties Union (ACLU) has filed a lawsuit in a federal court in New York City, seeking an order to compel the United States (U.S.) Department of Homeland Security (DHS), U.S. Customs and Border Protection (CBP), and U.S. Immigration and Customs Enforcement (ICE) “to release records about their purchases of cell phone location data for immigration enforcement and other purposes.” The ACLU made these information requests after numerous media accounts showed that these and other U.S. agencies were buying location data and other sensitive information in ways intended to evade the Fourth Amendment’s bar against unreasonable searches.
    • In its press release, the ACLU asserted:
      • In February, The Wall Street Journal reported that this sensitive location data isn’t just for sale to commercial entities, but is also being purchased by U.S. government agencies, including by U.S. Immigrations and Customs Enforcement to locate and arrest immigrants. The Journal identified one company, Venntel, that was selling access to a massive database to the U.S. Department of Homeland Security, U.S. Customs and Border Protection, and ICE. Subsequent reporting has identified other companies selling access to similar databases to DHS and other agencies, including the U.S. military.
      • These practices raise serious concerns that federal immigration authorities are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. There’s even more reason for alarm when those agencies evade requests for information — including from U.S. senators — about such practices. That’s why today we asked a federal court to intervene and order DHS, CBP, and ICE to release information about their purchase and use of precise cell phone location information. Transparency is the first step to accountability.
    • The ACLU explained in the suit:
      • Multiple news sources have confirmed these agencies’ purchase of access to databases containing precise location information for millions of people—information gathered by applications (apps) running on their smartphones. The agencies’ purchases raise serious concerns that they are evading Fourth Amendment protections for cell phone location information by paying for access instead of obtaining a warrant. Yet, more than nine months after the ACLU submitted its FOIA request (“the Request”), these agencies have produced no responsive records. The information sought is of immense public significance, not only to shine a light on the government’s use of powerful location-tracking data in the immigration context, but also to assess whether the government’s purchase of this sensitive data complies with constitutional and legal limitations and is subject to appropriate oversight and control.
  • Facebook’s new Oversight Board announced “the first cases it will be deliberating and the opening of the public comment process” and “the appointment of five new trustees.” The cases were almost all referred by Facebook users, and the new board is asking for comments on the right way to manage what may be objectionable content. The Oversight Board explained it is “prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”
    • The new trustees are:
      • Kristina Arriaga is a globally recognized advocate for freedom of expression, with a focus on freedom of religion and belief. Kristina is president of the advisory firm Intrinsic.
      • Cherine Chalaby is an expert on internet governance, international finance and technology, with extensive board experience. As Chairman of ICANN, he led development of the organization’s five-year strategic plan for 2021 to 2025.
      • Wanda Felton has over 30 years of experience in the financial services industry, including serving as Vice Chair of the Board and First Vice President of the Export-Import Bank of the United States.
      • Kate O’Regan is a former judge of the Constitutional Court of South Africa and commissioner of the Khayelitsha Commission. She is the inaugural director of the Bonavero Institute of Human Rights at the University of Oxford.
      • Robert Post is an American legal scholar and Professor of Law at Yale Law School, where he formerly served as Dean. He is a leading scholar of the First Amendment and freedom of speech.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • The Senate Judiciary Committee will hold an executive session at which the “Online Content Policy Modernization Act” (S.4632), a bill to narrow the liability shield in 47 USC 230, may be marked up.
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (4 December)

Further Reading

  • “How Misinformation ‘Superspreaders’ Seed False Election Theories” By Sheera Frenkel — The New York Times. A significant percentage of the lies, misinformation, and disinformation about the legitimacy of the election has been disseminated by a small number of right-wing figures and then repeated, reposted, and retweeted. The Times relies on research into how much engagement people like President Donald Trump and Dan Bongino get on Facebook after posting untrue claims about the election, and it turns out that such trends and rumors do not start spontaneously.
  • “Facebook Said It Would Ban Holocaust Deniers. Instead, Its Algorithm Provided a Network for Them” By Aaron Sankin — The Markup. This news organization still found Holocaust denial material promoted by Facebook’s algorithm even though the platform recently said it was taking down such material. This result may point to the difficulty of policing objectionable material that uses coded language and/or the social media platform’s lack of sufficient resources to weed out this sort of content.
  • “What Facebook Fed the Baby Boomers” By Charlie Warzel — The New York Times. A dispiriting trip inside two people’s Facebook feeds. This article makes the very good point that comments are not moderated, and these tend to be significant sources of vitriol and disinformation.
  • “How to ‘disappear’ on Happiness Avenue in Beijing” By Vincent Ni and Yitsing Wang — BBC. By next year, the People’s Republic of China (PRC) may have as many as 560 million security cameras, and one artist ran an experiment of sorts to see whether a group of people could walk down a major street in the capital without their faces being captured by the many cameras along the way.
  • “Patients of a Vermont Hospital Are Left ‘in the Dark’ After a Cyberattack” By Ellen Barry and Nicole Perlroth — The New York Times. A Russian hacking outfit may have struck back after the Department of Defense’s (DOD) Cyber Command and Microsoft disrupted its operations. A number of hospitals were hacked, and care was significantly disrupted. This dynamic may lend itself to arguments that the United States (U.S.) would be wise to curtail its offensive operations.
  • “EU seeks anti-China alliance on tech with Biden” By Jakob Hanke Vela and David M. Herszenhorn — Politico. The European Union (EU) is hoping the United States (U.S.) will be more amenable to working together in the realm of future technology policy, especially against the People’s Republic of China (PRC), which has made a concerted effort to drive the adoption of standards that favor its companies (e.g., the PRC pushed for and obtained 5G standards that will favor Huawei). Diplomatically speaking, this is considered low-hanging fruit, and a Biden Administration will undoubtedly be more multilateral than the Trump Administration.
  • “Can We Make Our Robots Less Biased Than We Are?” By David Berreby — The New York Times. The bias present in facial recognition technology and artificial intelligence is making its way into robotics, posing the question of how to change this. Many African American and other minority scientists are calling for the inclusion of people of color in designing such systems as a countermeasure to the usual bias toward white men.

Other Developments

  • The top Democrat on the Senate Homeland Security and Governmental Affairs Committee, Senator Gary Peters (D-MI), wrote President Donald Trump and “slammed the Trump Administration for their lack of action against foreign adversaries, including Russia, China, and North Korea, that have sponsored cyber-attacks against American hospitals and research institutions in an effort to steal information related to development of Coronavirus vaccines.” Peters used language that was unusually strong, as Members of Congress typically tone down the rhetoric and deploy coded language to signal their level of displeasure about administration action or inaction. Peters could well feel strongly about what he perceives to be Trump Administration indifference to the cyber threats facing institutions researching and developing COVID-19 vaccines, but this is also an issue on which he may well be trying to split Republicans, placing them in the difficult position of lining up behind a president disinclined to prioritize some cyber issues or breaking ranks with him.
    • Peters stated:
      • I urge you, again, to send a strong message to any foreign government attempting to hack into our medical institutions that this behavior is unacceptable. The Administration should use the tools at its disposal, including the threat of sanctions, to deter future attacks against research institutions. In the event that any foreign government directly threatens the lives of Americans through attacks on medical facilities, other Department of Defense capabilities should be considered to make it clear that there will be consequences for these actions.
  • A United States federal court has ruled against a Trump Administration appointee Michael Pack and the United States Agency for Global Media (USAGM) and their attempts to interfere illegally with the independence of government-funded news organizations such as the Voice of America (VOA). The District Court for the District of Columbia enjoined Pack and the USAGM from a list of actions VOA and USAGM officials claim are contrary to the First Amendment and the organization’s mission.
  • The Federal Trade Commission (FTC) is asking a United States federal court to compel former Trump White House advisor Steve Bannon to appear for questioning per a Civil Investigative Demand (CID) as part of its ongoing probe of Cambridge Analytica’s role in misusing personal data of Facebook users in the 2016 Presidential Election. The FTC noted it “issued the CID to determine, among other things, whether Bannon may be held individually liable for the deceptive conduct of Cambridge Analytica, LLC—the subject of an administrative law enforcement action brought by the Commission.” There had been an interview scheduled in September but the day before it was to take place, Bannon’s lawyers informed the FTC he would not be attending.
    • In 2019, the FTC settled with former Cambridge Analytica CEO Alexander Nix and app developer Aleksandr Kogan in “administrative orders restricting how they conduct any business in the future, and requiring them to delete or destroy any personal information they collected.” The FTC did not, however, settle with the company itself. The agency alleged “that Cambridge Analytica, Nix, and Kogan deceived consumers by falsely claiming they did not collect any personally identifiable information from Facebook users who were asked to answer survey questions and share some of their Facebook profile data.” Facebook settled with the FTC for a record $5 billion for its role in the Cambridge Analytica scandal and for how it violated its 2012 consent order with the agency.
  • Apple responded to a group of human rights and civil liberties organizations about its plans to deploy technology in its operating system that gives users greater control over their privacy. Apple confirmed that App Tracking Transparency (ATT) would be made part of iOS early next year and would present users with a prompt explaining how the app developer may use their information and asking whether to allow tracking (a short sketch of the developer-facing ATT call appears after this item). ATT would stop app developers from tracking users across other apps on a device unless the user opts in. Companies like Facebook have objected, claiming that the change is a direct shot at them and their revenue. Apple does not reap a significant revenue stream from collecting, combining, and processing user data, whereas Facebook does. Facebook also tracks users across devices and apps on a device through a variety of means.
    • Apple stated:
      • We delayed the release of ATT to early next year to give developers the time they indicated they needed to properly update their systems and data practices, but we remain fully committed to ATT and to our expansive approach to privacy protections. We developed ATT for a single reason: because we share your concerns about users being tracked without their consent and the bundling and reselling of data by advertising networks and data brokers.
      • ATT doesn’t ban the reasonable collection of user data for app functionality or even for advertising. Just as with the other data-access permissions we have added over many software releases, developers will be able to explain why they want to track users both before the ATT prompt is shown and in the prompt itself. At that point, users will have the freedom to make their own choice about whether to proceed. This privacy innovation empowers consumers — not Apple — by simply making it clear what their options are, and giving them the information and power to choose.
    • As mentioned, a number of groups wrote Apple in October “to express our disappointment that Apple is delaying the full implementation of iOS 14’s anti-tracking features until early 2021.” They argued:
      • These features will constitute a vital policy improvement with the potential to strengthen respect for privacy across the industry. Apple should implement these features as expeditiously as possible.
      • We were heartened by Apple’s announcement that starting with the iOS 14 update, all app developers will be required to provide information that will help users understand the privacy implications of an app before they install it, within the App Store interface.
      • We were also pleased that iOS 14 users would be required to affirmatively opt in to app tracking, on an app-by-app basis. Along with these changes, we urge Apple to verify the accuracy of app policies, and to publish transparency reports showing the number of apps that are rejected and/or removed from the App Store due to inadequate or inaccurate policies.
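    • For readers unfamiliar with how ATT surfaces to developers, the sketch below shows the basic call an iOS 14+ app makes to trigger the system tracking prompt. It assumes an app context with an NSUserTrackingUsageDescription string declared in Info.plist; the handling of each status is illustrative only, not Apple’s prescribed pattern.

```swift
import AppTrackingTransparency

/// Asks the user whether this app may track them across other companies' apps and
/// websites. iOS shows the system ATT prompt at most once per install; call this
/// once the app is active (for example, from an onAppear or didBecomeActive hook).
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user allowed tracking; the advertising identifier (IDFA) is available.
            print("Tracking authorized")
        case .denied, .restricted:
            // Tracking is not allowed; the advertising identifier is zeroed out.
            print("Tracking not allowed")
        case .notDetermined:
            // The prompt has not yet been resolved (for example, it was requested too early).
            print("Tracking status not determined")
        @unknown default:
            print("Unhandled tracking status")
        }
    }
}
```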
  • The United States (U.S.) Government Accountability Office (GAO) sent its assessment of the privacy notices and practices of U.S. banks and credit unions to the chair of the Senate committee that oversees this issue. Senate Banking, Housing, and Urban Affairs Committee Chair Mike Crapo (R-ID) had asked the GAO “to examine the types of personal information that financial institutions collect, use, and share; how they make consumers aware of their information-sharing practices; and federal regulatory oversight of these activities.” The GAO found that a ten-year-old model privacy disclosure form used across these industries may comply with the prevailing federal requirements but no longer encompasses the breadth and scope of how the personal information of people is collected, processed, and used. The GAO called on the Consumer Financial Protection Bureau (CFPB) to update this form. The GAO explained:
    • Banks and credit unions collect, use, and share consumers’ personal information—such as income level and credit card transactions—to conduct everyday business and market products and services. They share this information with a variety of third parties, such as service providers and retailers.
    • The Gramm-Leach-Bliley Act (GLBA) requires financial institutions to provide consumers with a privacy notice describing their information-sharing practices. Many banks and credit unions elect to use a model form—issued by regulators in 2009—which provides a safe harbor for complying with the law (see figure). GAO found the form gives a limited view of what information is collected and with whom it is shared. Consumer and privacy groups GAO interviewed cited similar limitations. The model form was issued over 10 years ago. The proliferation of data-sharing since then suggests a reassessment of the form is warranted. Federal guidance states that notices about information collection and usage are central to providing privacy protections and transparency.
    • Since Congress transferred authority to the CFPB for implementing GLBA privacy provisions, the agency has not reassessed if the form meets consumer expectations for disclosures of information-sharing. CFPB officials said they had not considered a reevaluation because they had not heard concerns from industry or consumer groups about privacy notices. Improvements to the model form could help ensure that consumers are better informed about all the ways banks and credit unions collect and share personal information
    • The increasing amounts of and changing ways in which industry collects and shares consumer personal information—including from online activities—highlights the importance of clearly disclosing practices for collection, sharing, and use. However, our work shows that banks and credit unions generally used the model form, which was created more than 10 years ago, to make disclosures required under GLBA. As a result, the disclosures often provided a limited view of how banks and credit unions collect, use, and share personal information.
    • We recognize that the model form is required to be succinct, comprehensible to consumers, and allow for comparability across institutions. But, as information practices continue to change or expand, consumer insights into those practices may become even more limited. Improvements and updates to the model privacy form could help ensure that consumers are better informed about all the ways that banks and credit unions collect, use, and share personal information. For instance, in online versions of privacy notices, there may be opportunities for readers to access additional details—such as through hyperlinks—in a manner consistent with statutory requirements.
  • The Australian Competition & Consumer Commission (ACCC) is asking for feedback on Google’s proposed $2.1 billion acquisition of Fitbit. In a rather pointed statement, the chair of the ACCC, Rod Sims, made clear “[o]ur decision to begin consultation should not be interpreted as a signal that the ACCC will ultimately accept the undertaking and approve the transaction.” The buyout is also under scrutiny in the European Union (EU) and may be affected by the suit the United States Department of Justice (DOJ) and some states have brought against the company for anti-competitive behavior. The ACCC released a Statement of Issues in June about the proposed deal.
    • The ACCC explained “[t]he proposed undertaking would require Google to:
      • not use certain user data collected through Fitbit and Google wearables for Google’s advertising purposes for 10 years, with an option for the ACCC to extend this obligation by up to a further 10 years;
      • maintain access for third parties, such as health and fitness apps, to certain user data collected through Fitbit and Google wearable devices for 10 years; and
      • maintain levels of interoperability between third party wearables and Android smartphones for 10 years.
    • In August, the EU “opened an in-depth investigation to assess the proposed acquisition of Fitbit by Google under the EU Merger Regulation.” The European Commission (EC) expressed its concerns “that the proposed transaction would further entrench Google’s market position in the online advertising markets by increasing the already vast amount of data that Google could use for personalisation of the ads it serves and displays.” The EC stated “[a]t this stage of the investigation, the Commission considers that Google:
      • is dominant in the supply of online search advertising services in the EEA countries (with the exception of Portugal for which market shares are not available);
      • holds a strong market position in the supply of online display advertising services at least in Austria, Belgium, Bulgaria, Croatia, Denmark, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Norway, Poland, Romania, Slovakia, Slovenia, Spain, Sweden and the United Kingdom, in particular in relation to off-social networks display ads;
      • holds a strong market position in the supply of ad tech services in the EEA.
    • The EC explained that it “will now carry out an in-depth investigation into the effects of the transaction to determine whether its initial competition concerns regarding the online advertising markets are confirmed…[and] will also further examine:
      • the effects of the combination of Fitbit’s and Google’s databases and capabilities in the digital healthcare sector, which is still at a nascent stage in Europe; and
      • whether Google would have the ability and incentive to degrade the interoperability of rivals’ wearables with Google’s Android operating system for smartphones once it owns Fitbit.
    • Amnesty International (AI) sent EC Executive Vice-President Margrethe Vestager a letter, arguing “[t]he merger risks further extending the dominance of Google and its surveillance-based business model, the nature and scale of which already represent a systemic threat to human rights.” AI asserted “[t]he deal is particularly troubling given the sensitive nature of the health data that Fitbit holds that would be acquired by Google.” AI argued “[t]he Commission must ensure that the merger does not proceed unless the two business enterprises can demonstrate that they have taken adequate account of the human rights risks and implemented strong and meaningful safeguards that prevent and mitigate these risks in the future.”
  • Europol, the United Nations Interregional Crime and Justice Research Institute (UNICRI), and Trend Micro have cooperated on a report that looks “into current and predicted criminal uses of artificial intelligence (AI).”
    • The organizations argued “AI could be used to support:
      • convincing social engineering attacks at scale;
      • document-scraping malware to make attacks more efficient;
      • evasion of image recognition and voice biometrics;
      • ransomware attacks, through intelligent targeting and evasion;
      • data pollution, by identifying blind spots in detection rules.
    • The organizations concluded:
      • Based on available insights, research, and a structured open-source analysis, this report covered the present state of malicious uses and abuses of AI, including AI malware, AI-supported password guessing, and AI-aided encryption and social engineering attacks. It also described concrete future scenarios ranging from automated content generation and parsing, AI-aided reconnaissance, smart and connected technologies such as drones and autonomous cars, to AI-enabled stock market manipulation, as well as methods for AI-based detection and defense systems.
      • Using one of the most visible malicious uses of AI — the phenomenon of so-called deepfakes — the report further detailed a case study on the use of AI techniques to manipulate or generate visual and audio content that would be difficult for humans or even technological solutions to immediately distinguish from authentic ones.
      • As speculated on in this paper, criminals are likely to make use of AI to facilitate and improve their attacks by maximizing opportunities for profit within a shorter period, exploiting more victims, and creating new, innovative criminal business models — all the while reducing their chances of being caught. Consequently, as “AI-as-a-Service” becomes more widespread, it will also lower the barrier to entry by reducing the skills and technical expertise required to facilitate attacks. In short, this further exacerbates the potential for AI to be abused by criminals and for it to become a driver of future crimes.
      • Although the attacks detailed here are mostly theoretical, crafted as proofs of concept at this stage, and although the use of AI to improve the effectiveness of malware is still in its infancy, it is plausible that malware developers are already using AI in more obfuscated ways without being detected by researchers and analysts. For instance, malware developers could already be relying on AI-based methods to bypass spam filters, escape the detection features of antivirus software, and frustrate the analysis of malware. In fact, DeepLocker, a tool recently introduced by IBM and discussed in this paper, already demonstrates these attack abilities that would be difficult for a defender to stop.
      • To add, AI could also enhance traditional hacking techniques by introducing new ways of performing attacks that would be difficult for humans to predict. These could include fully automated penetration testing, improved password-guessing methods, tools to break CAPTCHA security systems, or improved social engineering attacks. With respect to open-source tools providing such functionalities, the paper discussed some that have already been introduced, such as DeepHack, DeepExploit, and XEvil.
      • The widespread use of AI assistants, meanwhile, also creates opportunities for criminals who could exploit the presence of these assistants in households. For instance, criminals could break into a smart home by hijacking an automation system through exposed audio devices.

Coming Events

  • The National Institute of Standards and Technology (NIST) will hold a webinar on the Draft Federal Information Processing Standards (FIPS) 201-3 on 9 December.
  • On 9 December, the Senate Commerce, Science, and Transportation Committee will hold a hearing titled “The Invalidation of the EU-US Privacy Shield and the Future of Transatlantic Data Flows” with the following witnesses:
    • The Honorable Noah Phillips, Commissioner, Federal Trade Commission
    • Ms. Victoria Espinel, President and Chief Executive Officer, BSA – The Software Alliance
    • Mr. James Sullivan, Deputy Assistant Secretary for Services, International Trade Administration, U.S. Department of Commerce
    • Mr. Peter Swire, Elizabeth and Tommy Holder Chair of Law and Ethics, Georgia Tech Scheller College of Business, and Research Director, Cross-Border Data Forum
  • On 10 December, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Securing the Communications Supply Chain. The Commission will consider a Report and Order that would require Eligible Telecommunications Carriers to remove equipment and services that pose an unacceptable risk to the national security of the United States or the security and safety of its people, would establish the Secure and Trusted Communications Networks Reimbursement Program, and would establish the procedures and criteria for publishing a list of covered communications equipment and services that must be removed. (WC Docket No. 18-89)
    • National Security Matter. The Commission will consider a national security matter.
    • National Security Matter. The Commission will consider a national security matter.
    • Allowing Earlier Equipment Marketing and Importation Opportunities. The Commission will consider a Notice of Proposed Rulemaking that would propose updates to its marketing and importation rules to permit, prior to equipment authorization, conditional sales of radiofrequency devices to consumers under certain circumstances and importation of a limited number of radiofrequency devices for certain pre-sale activities. (ET Docket No. 20-382)
    • Promoting Broadcast Internet Innovation Through ATSC 3.0. The Commission will consider a Report and Order that would modify and clarify existing rules to promote the deployment of Broadcast Internet services as part of the transition to ATSC 3.0. (MB Docket No. 20-145)


Further Reading, Other Developments, and Coming Events (9 November)

Further Reading

  • “Facebook bans ‘STOP THE STEAL’ group Trump allies were using to organize protests against vote counting” By Tony Romm, Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. A significant portion of the online activity among those on the right wing alleging that the Biden Campaign and Democrats have stolen the election is traceable to right-wing media influencers rather than being an organic effort. Moreover, Facebook has apparently had a mixed record in locating and taking down material that seeks to spread lies about the integrity of the election and foment violence.
  • “False News Targeting Latinos Trails the Election” By Patricia Mazzei and Nicole Perlroth — The New York Times. By the metrics used in the article (although it’s not clear exactly where the Times got its data), the disinformation in Spanish on social media in 2020 exceeded the Russian disinformation campaign in 2016. Apparently, Facebook, Twitter, and YouTube were not prepared or were not expecting the flood of lies, misinformation, and disinformation about President-elect Joe Biden or the Democrats generally, especially in South Florida where Republicans did much better than expected. Much of this content tied Biden to the former dictators of Cuba and Venezuela, Fidel Castro and Hugo Chavez.
  • “Trump’s Tweeting Isn’t Crazy. It’s Strategic, Typos and All.” By Emily Dreyfuss — The New York Times. This piece traces the evolution of a campaign to paint the Biden family as engaged in criminal activity to both smear them and to blunt any criticism of the Trump family given the many and serious allegations of lawbreaking and unethical behavior.
  • “TikTok invites UK lawmakers to review algorithm after being probed on China censorship concerns” By Sam Shead — CNBC. In testimony before the United Kingdom (UK) Parliament’s Business, Energy and Industrial Strategy Committee, TikTok’s head of policy in the UK said the platform used to censor content but then walked back that admission in a statement after the hearing. Prior to May 2019, the company hewed to the content wishes of the People’s Republic of China (PRC), and material on Tiananmen Square was not on the platform. However, she did claim that TikTok’s data is stored in the United States with backups in Singapore, none of which goes to the PRC.
  • “The Disinformation Is Coming From Inside the White House” By Matthew Rosenberg, Jim Rutenberg and Nick Corasaniti — The New York Times. Turns out much of the disinformation about alleged but unproven vote fraud is coming directly from the President, his advisers, his allies, and his family. It may come to pass that domestic disinformation, misinformation, and lies will have a larger impact than similar efforts from overseas.

Other Developments

  • Representative Ro Khanna (D-CA) introduced “The 21st Century Jobs Package” (H.R.8693), which would establish a Federal Institute of Technology (FIT) and “allocates $900 billion in research & development (R&D) funding for emerging technologies like Advanced Manufacturing, Synthetic Biology, Artificial intelligence, Biotechnology, and Cybersecurity” according to his press release. In a summary, Khanna explained:
    • At the center of this proposal is the creation of a FIT, with presence in multiple locations around the country. These locations will initially take the form of additional facilities and faculty within or alongside existing universities and complementing ecosystems that are already dynamic. Over time, they will grow to include new stand-alone operations in areas without strong existing university bases. The vision, as in the past, is to marry federal resources and guidance with local initiative.
    • The proposed budget for this entire initiative is $900 billion over ten years. This would raise total public R&D spending to 1% of GDP by the end of the period, returning us to our role as an international leader. Most importantly, it would create as many as three million good new jobs per year. Many of these jobs would be in places that have fallen behind.
  • Australia’s Attorney-General has released an issues paper as a precursor to a possible rewrite of the country’s Privacy Act 1988 “to ensure privacy settings empower consumers, protect their data and best serve the Australian economy…as part of the government’s response to the Australian Competition and Consumer Commission’s Digital Platforms Inquiry” according to its press release. The Attorney-General explained:
    • The review will examine and, if needed, consider options for reform on matters including:
    • The scope and application of the Privacy Act including in relation to:
      • the definition of ‘personal information’
      • current exemptions, and
      • general permitted situations for the collection, use and disclosure of personal information.
    • Whether the Privacy Act effectively protects personal information and provides a practical and proportionate framework for promoting good privacy practices including in relation to:
      • notification requirements
      • consent requirements including default privacy settings
      • overseas data flows, and
      • erasure of personal information.
    • Whether individuals should have direct rights of action to enforce privacy obligations under the Privacy Act.
    • Whether a statutory tort for serious invasions of privacy should be introduced into Australian law.
    • The impact of the notifiable data breach scheme and its effectiveness in meeting its objectives.
    • The effectiveness of enforcement powers and mechanisms under the Privacy Act and the interaction with other Commonwealth regulatory frameworks.
    • The desirability and feasibility of an independent certification scheme to monitor and demonstrate compliance with Australian privacy laws
  • The National Institute of Standards and Technology (NIST) has released for comment its “Draft Federal Information Processing Standard (FIPS) 201-3, Personal Identity Verification (PIV) of Federal Employees and Contractors (Standard).” (A brief, illustrative sketch of the kind of credential-based access decision the Standard describes appears after this list.) NIST explained in the Federal Register notice:
    • This Standard defines common credentials and authentication mechanisms offering varying degrees of security for both logical and physical access applications. The draft revision proposes changes to FIPS 201-2, Standard for Personal Identity Verification of Federal Employees and Contractors to include: Expanding specification on the use of additional PIV credentials known as derived PIV credentials, procedures for supervised remote identity proofing, the use of federation as a means for a relying system to interoperate with PIV credentials issued by other agencies, alignment with the current practice/policy of the Federal Government and specific changes requested by Federal agencies and implementers. Before recommending these proposed changes to the Secretary of Commerce for review and approval, NIST invites comments from all interested parties.
    • In the draft document, NIST stated:
      • Authentication of an individual’s identity is a fundamental component of physical and logical access control. An access control decision must be made when an individual attempts to access security-sensitive buildings, information systems, and applications. An accurate determination of an individual’s identity supports making sound access control decisions.
      • This document establishes a standard for a Personal Identity Verification (PIV) system that meets the control and security objectives of Homeland Security Presidential Directive-12 [HSPD-12]. It is based on secure and reliable forms of identity credentials issued by the Federal Government to its employees and contractors. These credentials are used by mechanisms that authenticate individuals who require access to federally controlled facilities, information systems, and applications. This Standard addresses requirements for initial identity proofing, infrastructure to support interoperability of identity credentials, and accreditation of organizations and processes issuing PIV credentials.
  • The Federal Communications Commission (FCC) announced a $200 million settlement with T-Mobile “to resolve an investigation of its subsidiary Sprint’s compliance with the Commission’s rules regarding waste, fraud, and abuse in the Lifeline program for low-income consumers” according to the agency’s press release. The FCC explained:
    • The payment is the largest fixed-amount settlement the Commission has ever secured to resolve an investigation.  The settlement comes after an Enforcement Bureau investigation into reports that Sprint, prior to its merger with T-Mobile, was claiming monthly subsidies for serving approximately 885,000 Lifeline subscribers even though those subscribers were not using the service, in potential violation of the Commission’s “non-usage” rule.  The matter initially came to light as a result of an investigation by the Oregon Public Utility Commission.  In addition to paying a $200 million civil penalty, Sprint agreed to enter into a compliance plan to help ensure future adherence to the Commission’s rules for the Lifeline program.
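
As noted in the NIST item above, the following is a minimal, illustrative sketch in Python of the kind of credential-based access decision the draft Standard contemplates: a credential is checked for issuer trust, revocation, and expiry before its holder is granted access to a resource. The class and function names (PivStyleCredential, credential_is_valid, access_decision) and all fields are hypothetical illustrations, not anything specified in FIPS 201-3, which addresses far more, including identity proofing, derived credentials, and federation.

# Minimal, illustrative sketch of a credential-based access decision.
# All names and fields are hypothetical; nothing here is taken from FIPS 201-3.
from __future__ import annotations

from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class PivStyleCredential:
    holder: str          # person the credential was issued to
    issuer: str          # issuing authority
    expires: datetime    # expiration timestamp (UTC)
    revoked: bool = False


def credential_is_valid(cred: PivStyleCredential, trusted_issuers: set[str]) -> bool:
    # A credential is treated as valid only if its issuer is trusted,
    # it has not been revoked, and it has not expired.
    return (
        cred.issuer in trusted_issuers
        and not cred.revoked
        and cred.expires > datetime.now(timezone.utc)
    )


def access_decision(cred: PivStyleCredential,
                    trusted_issuers: set[str],
                    authorized_holders: set[str]) -> bool:
    # Authenticate the credential first, then check whether the holder
    # is authorized for the specific resource being requested.
    return credential_is_valid(cred, trusted_issuers) and cred.holder in authorized_holders


if __name__ == "__main__":
    cred = PivStyleCredential(
        holder="jane.doe",
        issuer="Agency-X Issuing Authority",
        expires=datetime(2026, 1, 1, tzinfo=timezone.utc),
    )
    # Prints True while the credential is unexpired, unrevoked, issued by a
    # trusted authority, and the holder appears on the authorization list.
    print(access_decision(cred, {"Agency-X Issuing Authority"}, {"jane.doe"}))

The sketch’s only point is the separation the Standard draws between authenticating a credential and deciding whether its holder may access a particular resource; real PIV implementations layer cryptographic verification, identity proofing, and federation on top of that basic split.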

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Walkerssk from Pixabay

Further Reading, Other Developments, and Coming Events (5 November)

Further Reading

  • “Confusion and conflict stir online as Trump claims victory, questions states’ efforts to count ballots” By Craig Timberg, Tony Romm, Isaac Stanley-Becker and Drew Harwell — Washington Post. When the post-mortem on the 2020 Election is written, it will likely show that foreign disinformation was not the primary threat. Rather, it may be domestic interference, given the misinformation, disinformation, and lies circulating online despite the best efforts of social media platforms to label, take down, and block such material. However, if this article is accurate, much of it is coming from the right wing, including the President.
  • “Polls close on Election Day with no apparent cyber interference” By Kevin Collier and Ken Dilanian — NBC News. Despite crowing from officials like the United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) Director Christopher Krebs and U.S. Cyber Command head General Paul Nakasone, it is not altogether clear that U.S. efforts, especially publicized offensive operations, are the reason there were no significant cyber attacks on Election Day. However, officials are cautioning the country is not out of the woods, as vote counting is ongoing and opportunities for interference and mischief remain.
  • “Russian hackers targeted California, Indiana Democratic parties” By Raphael Satter, Christopher Bing, Joel Schectman — Reuters. Microsoft reportedly helped foil Russian attempts to hack two state Democratic parties and think tanks, some of which are allied with the Democratic Party; none of the attempts, which occurred earlier this year, appears to have been successful. The article suggests but does not claim that increased cyber awareness and defenses foiled most of the attempts by the hacking group Fancy Bear.
  • “LexisNexis to Pay $5 Million Class Action Settlement for Selling DMV Data” By Joseph Cox — Vice. Data broker LexisNexis is settling a suit alleging that it violated the Drivers’ Privacy Protection Act (DPPA) by obtaining Department of Motor Vehicles (DMV) records on people for a purpose not authorized under the law. Vice has written a number of articles on DMVs’ practice of selling people’s data, which has caught the attention of at least two Democratic Members of Congress who have said they will introduce legislation to tighten the circumstances under which these data may be shared or sold.
  • “Spy agency ducks questions about ‘back doors’ in tech products” By Joseph Menn — Reuters. Senator Ron Wyden (D-OR) is demanding that the National Security Agency (NSA) reveal the guidelines put in place after former NSA contractor Edward Snowden revealed the agency’s practice of having backdoors placed in United States (U.S.) technology that it could use in the future. This practice allowed the NSA to sidestep warrant requirements, but it also may have weakened technology that was later exploited by other governments, as the People’s Republic of China (PRC) allegedly did to Juniper in 2015. After Snowden divulged the NSA’s practice, reforms were supposedly put in place but never shared with Congress.

Other Developments

  • Australia’s Joint Committee on Intelligence and Security issued a new report into Australia’s mandatory data retention regime that makes 22 recommendations to “increase transparency around the use of the mandatory data retention and increase the threshold for when data can be accessed…[and] reduce the currently very broad access to telecommunications data under the Telecommunications Act.” The committee stated “[t]he report’s 22 recommendations include:
    • access to data kept under the mandatory data retention regime will only be available under specific circumstances
    • the Department of Home Affairs develop guidelines for data collection including an ability for enforcement agencies and Home Affairs to produce reports to oversight agencies or Parliament when requested
    • the repeal of section 280(1)(b) of the Telecommunications Act which allows for access where ‘disclosure or use is required or authorised by or under law.’ It is the broad language in this subsection that has allowed the access that concerned the committee
    • The committee explained:
      • The Parliamentary Joint Committee on Intelligence and Security (the Committee) is required by Part 5-1A of the Telecommunications (Interception and Access) Act 1979 (TIA Act) to undertake a review of the mandatory data retention regime (MDRR).
      • The mandatory data retention regime is a legislative framework which requires carriers, carriage service providers and internet service providers to retain a defined set of telecommunications data for two years, ensuring that such data remains available for law enforcement and national security investigations.
  • Senators Ron Wyden (D-OR) and Sherrod Brown (D-OH) wrote a letter “to trade associations urging them to take immediate action to ensure their members are not complicit in China’s state-directed human rights abuses, including by relocating production from the Xinjiang Uyghur Autonomous Region.” They stated:
    • We write to express our concerns over reports that the industries and companies that the U.S. Chamber of Commerce represents have supply chains that have been implicated in the state-sanctioned forced labor of Uyghurs and other Muslim groups in the Xinjiang Uyghur Autonomous Region of China (XUAR) and in sites where Uyghurs have been relocated.  The decision to operate or contract with production facilities overseas must be accompanied by high standards of supply chain accountability and transparency to ensure that no company’s products are made with forced labor.  We urge your members to take immediate action to ensure goods manufactured for them are not complicit in the China’s state-directed human rights abuses, including by relocating production from the XUAR.  In addition, we ask your members to take critical, comprehensive steps to achieve the supply chain integrity and transparency American consumers and workers deserve.  It is past time for American multinational companies to be part of the solution, not part of the problem, on efforts to eradicate forced labor and end human rights abuses against workers in China. 
  • The Federal Trade Commission (FTC) finalized a settlement over alleged violations of the now struck-down European Union-United States Privacy Shield. In its press release, the agency explained it had “alleged that NTT Global Data Centers Americas, Inc. (NTT), formerly known as RagingWire Data Centers, Inc., claimed in its online privacy policy and marketing materials that the company participated in the Privacy Shield framework and complied with the program’s requirements.” The FTC noted “the company’s certification lapsed in January 2018 and it failed to comply with certain Privacy Shield requirements while it was a participant in the framework.” The FTC stated:
    • Under the settlement, the company, among other things, is prohibited not just from misrepresenting its compliance with or participation in the Privacy Shield framework, but also any other privacy or data security program sponsored by the government or any self-regulatory or standard-setting organization. The company also must continue to apply the Privacy Shield requirements or equivalent protections to personal information it collected while participating in the framework or return or delete the information.
    • Although the European Court of Justice invalidated the Privacy Shield framework in July 2020, that decision does not affect the validity of the FTC’s decision and order relating to NTT’s misrepresentations about its participation in and compliance with the framework. The framework allowed participants to transfer data legally from the European Union to the United States.
  • The Commission nationale de l’informatique et des libertés (CNIL) issued a press release, explaining that France’s “Council of State acknowledges the existence of a risk of data transfer from the Health Data Hub to the United States and requests additional safeguards.” CNIL stated it “will advise the public authorities on appropriate measures and will ensure, for research authorization related to the health crisis, that there is a real need to use the platform.” This announcement follows from the Court of Justice of the European Union (CJEU) striking down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). CNIL summarized the “essentials:”
    • Fearing that some data might be transferred to the United States, some claimants lodged an appeal with the Council of State requesting the suspension of the “Health Data Hub”, the new platform designed to ultimately host all the health data of people who receive medical care in France.
    • The Court considers that a risk cannot be excluded with regard to the transfer of health data hosted on the Health Data Hub platform to the US intelligence.
    • Because of the usefulness of the Health Data Hub in managing the health crisis, it refuses to suspend the operation of the platform.
    • However, it requires the Health Data Hub to strengthen its contract with Microsoft on a number of points and to seek additional safeguards to better protect the data it hosts.
    • It is the responsibility of the CNIL to ensure, for authorization of research projects on the Health Data Hub in the context of the health crisis, that the use of the platform is technically necessary, and to advise public authorities on the appropriate safeguards.
    • These measures will have to be taken while awaiting a lasting solution that will eliminate any risk of access to personal data by the American authorities, as announced by the French Secretary of State for the Digital Agenda.
  • The United Kingdom’s (UK) National Cyber Security Centre (NCSC) has published its annual review that “looks back at some of the key developments and highlights from the NCSC’s work between 1 September 2019 and 31 August 2020.” In the foreword, new NCSC Chief Executive Officer Lindy Cameron provided an overview:
    • Expertise from across the NCSC has been surged to assist the UK’s response to the pandemic. More than 200 of the 723 incidents the NCSC handled this year related to coronavirus and we have deployed experts to support the health sector, including NHS Trusts, through cyber incidents they have faced. We scanned more than one million NHS IP addresses for vulnerabilities and our cyber expertise underpinned the creation of the UK’s coronavirus tracing app.
    • An innovative approach to removing online threats was created through the ‘Suspicious Email Reporting Service’ – leading to more than 2.3 million reports of malicious emails being flagged by the British public. Many of the 22,000 malicious URLs taken down as a result related to coronavirus scams, such as pretending to sell PPE equipment to hide a cyber attack. The NCSC has often been described as world-leading, and that has been evident over the last 12 months. Our innovative ‘Exercise in a Box’ tool, which supports businesses and individuals to test their cyber defences against realistic scenarios, was used in 125 countries in the last year.
    • Recognising the change in working cultures due to the pandemic, our team even devised a specific exercise on remote working, which has helped organisations to understand where current working practices may be presenting alternative cyber risks. Proving that cyber really is a team sport, none of this would be possible without strong partnerships internationally and domestically. We worked closely with law enforcement – particularly the National Crime Agency – and across government, industry, academia and, of course, the UK public.
    • The NCSC is also looking firmly ahead to the future of cyber security, as our teams work to understand both the risks and opportunities to the UK presented by emerging technologies. A prominent area of work this year was the NCSC’s reviews of high-risk vendors such as Huawei – and in particular the swift and thorough review of US sanctions against Huawei. The NCSC gave advice on the impact these changes would have in the UK, publishing a summary of the advice given to government as well as timely guidance for operators and the public.
  • Australia’s Department of Industry, Science, Energy and Resources has put out for comment a discussion paper titled “An AI Action Plan for all Australians” to “shape Australia’s vision for artificial intelligence (AI).” The department said it “is now consulting on the development of a whole-of-government AI Action Plan…[that] will help us maximise the benefits of AI for all Australians and manage the potential challenges.” The agency said the AI Action Plan “will help to:
    • ensure the development and use of AI in Australia is responsible
    • coordinate government policy and national capability under a clear, common vision for AI in Australia
    • explore the actions needed for our AI future
    • The department explained:
      • Building on Australia’s AI Ethics Framework, the Australian Government is developing an AI Action Plan. It is a key component of the government’s vision to be a leading digital economy by 2030. It builds on almost $800 million invested in the 2020-21 Budget to enable businesses to take advantage of digital technologies to grow their businesses and create jobs. It is an opportunity to leverage AI as part of the Australian Government’s economic recovery plan. We must work together to ensure all Australians can benefit from advances in AI.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

FTC Asks Congress For Fix Ahead of SCOTUS Decision

The Federal Trade Commission urges Congress to undo three court decisions that have weakened its enforcement powers.

The Federal Trade Commission (FTC) wrote the House and Senate committees with jurisdiction over the agency, asking for language restoring its power to seek and obtain restitution for victims of those who have violated Section 5 of the FTC Act and disgorgement of ill-gotten gains. The FTC is also asking that Congress clarify that the agency may act against violators even if their conduct has stopped, as it has done for more than four decades. Two federal appeals courts have ruled in ways that limit the FTC’s long-used powers, and the Supreme Court of the United States is set to rule on these issues sometime next year. The FTC claims, however, that defendants are playing for time in the hope that the agency’s authority to seek and receive monetary relief will ultimately be limited by the United States’ (U.S.) highest court. Judging by language tucked into a privacy bill introduced by the chair of one of the committees, Congress may be willing to act soon.

The FTC asked the House Energy and Commerce and Senate Commerce, Science, and Transportation Committees “to take quick action to amend Section 13(b) [of the FTC Act i.e. 15 U.S.C. § 53(b)] to make clear that the Commission can bring actions in federal court under Section 13(b) even if conduct is no longer ongoing or impending when the suit is filed and can obtain monetary relief, including restitution and disgorgement, if successful.” The agency asserted “[w]ithout congressional action, the Commission’s ability to use Section 13(b) to provide refunds to consumer victims and to enjoin illegal activity is severely threatened.” All five FTC Commissioners signed the letter.

The FTC explained that adverse rulings by two federal appeals courts are preventing the agency from seeking relief for victims and punishment for violators of the FTC Act in federal courts within those two circuits, while elsewhere defendants are either asking courts for similar rulings or using delaying tactics in the hope that the Supreme Court upholds the two appeals courts:

  • …[C]ourts of appeals in the Third and Seventh Circuits have recently ruled that the agency cannot obtain any monetary relief under Section 13(b). Although review in the Supreme Court is pending, these lower court decisions are already inhibiting our ability to obtain monetary relief under 13(b). Not only do these decisions already prevent us from obtaining redress for consumers in the circuits where they issued, prospective defendants are routinely invoking them in refusing to settle cases with agreed-upon redress payments.
  • Moreover, defendants in our law enforcement actions pending in other circuits are seeking to expand the rulings to those circuits and taking steps to delay litigation in anticipation of a potential Supreme Court ruling that would allow them to escape liability for any monetary relief caused by their unlawful conduct. This is a significant impediment to the agency’s effectiveness, its ability to provide redress to consumer victims, and its ability to prevent entities who violate the law from profiting from their wrongdoing.

In a 2019 case, FTC v. Credit Bureau Center, LLC, the United States Court of Appeals for the Seventh Circuit (Seventh Circuit) found that the authority Congress granted in Section 13(b) does not allow the agency to seek and receive restitution. The Seventh Circuit found the provision allows the FTC to seek a permanent injunction but not monetary damages. As the Seventh Circuit explained, 15 U.S.C. § 53(b) “authorizes only restraining orders and injunctions…[b]ut the Commission has long viewed it as also authorizing awards of restitution.” The Seventh Circuit added that it had endorsed this view in a 1989 case, but subsequent Supreme Court cases had thrown into question such expansive readings of agency power that was not supported by statute. Moreover, the Seventh Circuit pointed out the FTC Act “has two detailed remedial provisions that expressly authorize restitution if the Commission follows certain procedures.” Ultimately, the Seventh Circuit held that the “permanent-injunction provision [in 15 U.S.C. § 53(b)] does not authorize monetary relief.”

In the September 2020 case FTC v. AbbVie, Inc., the United States Court of Appeals for the Third Circuit (Third Circuit) followed the Seventh Circuit by holding that Section 13(b) does not permit the FTC to punish behavior that is not currently happening or about to start. Moreover, because disgorgement is a remedy designed to address past conduct, this relief is also not available under Section 13(b).

The Third Circuit held:

  • Section 13(b) authorizes a court to “enjoin” antitrust violations. It says nothing about disgorgement, which is a form of restitution, see Liu v. SEC, 140 S. Ct. 1936, 1940–41 (2020), not injunctive relief, see, e.g., Meghrig v. KFC W., Inc., 516 U.S. 479, 484 (1996) (“[N]either [a mandatory nor prohibitory injunction] contemplates the award of . . . ‘damages’ or ‘equitable restitution.’”); Owner-Operator Indep. Drivers Ass’n v. Landstar Sys., Inc., 622 F.3d 1307, 1324 (11th Cir. 2010) (“Injunctive relief constitutes a distinct type of equitable relief; it is not an umbrella term that encompasses restitution or disgorgement.”). Thus, Section 13(b) does not explicitly empower district courts to order disgorgement.
  • So if a violator’s conduct is neither imminent nor ongoing, there is nothing to enjoin, and the FTC cannot sue under Section 13(b). By contrast, the requirement makes little sense as applied to a disgorgement remedy. Disgorgement deprives a wrongdoer of past gains, see Liu, 140 S. Ct. at 1940–41, meaning that even if a wrongdoer’s conduct is not imminent or ongoing, he may have gains to disgorge. If Congress contemplated the FTC could sue for disgorgement under Section 13(b), it probably would not have required the FTC to show an imminent or ongoing violation. That requirement suggests Section 13(b) does not empower district courts to order disgorgement.

The FTC pointed to another Third Circuit case that further limits its Section 13(b) authority with respect to injunctions. The agency explained:

  • In FTC v. Shire ViroPharma, the court held that the FTC can bring enforcement actions under Section 13(b) only when a violation is either ongoing or “impending” at the time the suit is filed. That decision unnecessarily limits the Commission’s ability to obtain relief for consumers who have been harmed by unlawful conduct that occurred in the past but is not ongoing.
  • The decision also hampers the Commission’s longstanding ability to protect consumers by getting an injunction that prohibits defendants from resuming their unlawful activities in cases where the conduct has stopped but there is a reasonable likelihood that the defendants could resume their unlawful activities in the future.
  • The decision also is impacting our ability to settle cases. Targets of FTC investigations now routinely argue that they are immune from suit because they are no longer violating the law, despite the fact that there is a likelihood of recurrence, and they make these arguments even in cases when they stopped violating the law only after learning that the FTC was investigating them.

In that case, the Third Circuit upheld a District Court’s ruling that Section 13(b) does not allow for the enjoining of past conduct and held:

On appeal, the FTC urges us to adopt a more expansive view of Section 13(b). According to the FTC, the phrase “is violating, or is about to violate” in Section 13(b) is satisfied by showing a past violation and a reasonable likelihood of recurrent future conduct. We reject the FTC’s invitation to stretch Section 13(b) beyond its clear text. The FTC admits that Shire is not currently violating the law. And the complaint fails to allege that Shire is about to violate the law.

Republicans on one of the committees included a legislative fix in a privacy bill. The “Setting an American Framework to Ensure Data Access, Transparency, and Accountability (SAFE DATA) Act” (S.4626) was introduced in September 2020 by Senate Commerce, Science, and Transportation Committee Chair Roger Wicker (R-MS), Senate Majority Whip and Communications, Technology, Innovation, and the Internet Subcommittee Chair John Thune (R-SD), Transportation and Safety Subcommittee Chair Deb Fischer (R-NE), and Senator Marsha Blackburn (R-TN). As noted, there is language that would seem to address these Third and Seventh Circuit cases. Section 403 would alter Section 13(b), expanding it to cover past violations and broadening the relief the FTC may seek to include restitution, disgorgement, and other equitable remedies. However, it is highly unlikely the Congress will address privacy legislation, and Republicans may have included this legislative language as a sweetener for Democrats to swallow the medicine of state preemption and no private right of action in the SAFE DATA Act. Chances of standalone legislation are unknown at present.

Additionally, in a draft law review article, FTC Commissioner Rohit Chopra and attorney advisor Samuel Levine argued the FTC should revive a dormant power to fill the gap in its enforcement authority left by these cases. They asserted:

  • [T]he agency should resurrect one of the key authorities abandoned in the 1980s: Section 5(m)(1)(B) of the FTC Act, the Penalty Offense Authority. The Penalty Offense Authority is a unique tool in commercial regulation. Typically, first- time offenses involving unfair or deceptive practices do not lead to civil penalties. However, if the Commission formally condemns these practices in a cease-and-desist order, they can become what we call “Penalty Offenses.” Other parties that commit these offenses with knowledge that they have been condemned by the Commission face financial penalties that can add up to a multiple of their illegal profits, rather than a fraction.
  • Using this authority, the Commission can substantially increase deterrence and reduce litigation risk by noticing whole industries of Penalty Offenses, exposing violators to significant civil penalties, while helping to ensure fairness for honest firms. This would dramatically improve the FTC’s effectiveness relative to our current approach, which relies almost entirely on Section 13(b) and no-money cease-and-desist orders, even in cases of blatant lawbreaking.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Ian Hutchinson on Unsplash

Further Reading, Other Developments, and Coming Events (4 November)

Further Reading

  • “U.S. Cyber Command Expands Operations to Hunt Hackers From Russia, Iran and China” By Julian Barnes — The New York Times. The United States (U.S.) agency charged with offensive cyber operations sent teams to undisclosed locations around the world to work with partner nations to foil Russian, Chinese, and Iranian efforts to disrupt the U.S. election. It appears this exercise is more about building relations with partners in key regions and having personnel see first-hand the effect of constant cyber attacks, especially in regions targeted by the Russian Federation, than about the rationale offered by Cyber Command that “hunting forward” puts its people closer to the action. Considering this is cyberspace, does it really matter where personnel are?
  • “U.S. undertook cyber operation against Iran as part of effort to secure the 2020 election” By Ellen Nakashima — The Washington Post. United States (U.S.) Cyber Command is out setting a narrative about how effective its operations against nations like Iran have been in protecting the election. Of course, one cannot easily prove this, so the effectiveness of U.S. efforts remains an open question. Nonetheless, this uncharacteristic openness may be on account of successful operations to foil and fend off efforts to disrupt the election, and it certainly reflects the U.S. security services’ desire to avoid 2016’s mistake of not going public with information so Americans would understand what is happening.
  • “Europe and the US are drifting apart on tech. Joe Biden wouldn’t fix that.” By Nicholas Vinocur — Politico EU. This rundown of significant policy differences suggests the United States (U.S.) and the European Union (EU) will be at odds on major tech issues even under a Biden Administration that one can safely assume will return the U.S. to closer relations with the EU. Most of these differences transcend personality, however, suggesting structural and systemic causes that foretell continued friction.
  • “What Big Tech has to gain—and lose—from a Biden presidency” By Mark Sullivan — Fast Company. This piece lays out how a Biden Administration might continue and discontinue Trump Administration policy if Joe Biden prevails in the election. One aspect this piece glosses over, however, is how the composition of Congress would shape a Biden Administration’s ability to achieve its tech policy goals.
  • “Robocalls Told at Least 800,000 Swing State Residents to “Stay Home” on Election Day. The FBI Is Investigating.” By Jack Gillum and Jeremy B. Merrill — ProPublica. Robocalls to more than 3 million people were made yesterday, urging them to stay home and stay safe. This is akin to voter suppression tactics used for decades in the United States, but it is unlikely the culprit or true motive (if it was not intended as suppression) will ever be discovered given the ease, scale, and anonymity that spoofing provides.

Other Developments

  • Australia’s Department of Home Affairs (Department) released for comment “Critical Technology Supply Chain Principles (the Principles)” that “are intended to assist organisations – including governments and businesses of all sizes – in making decisions about their suppliers.” The Department stated that “[t]he Principles also complement the Protecting Critical Infrastructure and Systems of National Significance reforms…[and] [t]ogether, these measures will help protect the supply of essential services that all Australians rely on.​​”
    • The Department stated:
      • Supply chains for critical technologies in Australia must be more resilient. Australia’s COVID-19 experience highlights the vulnerabilities of supply chains for products essential to the country. At the same time, the global technological landscape is evolving at an unprecedented pace and geostrategic competition is affecting how critical technologies are being developed and used.
      • The more dependent society becomes on technology, the less governments and organisations can rely on traditional habits and decision-making frameworks when it comes to their supply chains. Improving the management of critical technology supply chains specifically, across the economy will help build Australia’s resilience to future shocks, as well as address the inherent risks to our nation’s national security, economic prosperity and social cohesion. Advances in technology underpin our future prosperity, however they also expose our nation to more risks. Malicious actors can use critical technologies to harm our national security, and undermine our democracy. One way to address these risks is to consider the supply chains of critical technologies, and how these could be made more secure. Understanding the risks is the first step towards organisations of all sizes taking action to create diverse, trusted and secure supply chains.
      • That’s why the Australian Government is developing the Critical Technology Supply Chain Principles. These Principles will be non-binding and voluntary, and are intended to act as a tool to assist governments and businesses in making decisions about their suppliers and transparency of their own products. The Principles will help Australian business consider the unforeseen risks when developing critical technologies, building business resilience. The suggested Principles will be grouped under three pillars: security-by-design, transparency, and autonomy and integrity. The suggested Principles below align with guidance provided by the Australian Signals Directorate’s Australian Cyber Security Centre on supply chain risk management.
    • The Department provided an overview of the conceptual framework of the document:
      • Security should be a core component of critical technologies. Organisations should ensure they are making decisions that build in security from the ground-up.
        • 1. Understand what needs to be protected and why.
        • 2. Understand the security risks posed by your supply chain.
        • 3. Build security considerations into contracting processes that are proportionate to the level of risk (and encourage suppliers to do the same).
        • 4. Raise awareness of security within your supply chain
      • Transparency of technology supply chains is critical, both from a business perspective and a national security perspective.
        • 5. Know who suppliers are and build an understanding of security measures.
        • 6. Set and communicate minimum transparency requirements consistent with existing standards and international benchmarks for your suppliers and encourage continuous improvement.
        • 7. Encourage suppliers to understand their supply chains, and be able to provide this information to consumers.
      • Knowing that your suppliers demonstrate integrity and are acting autonomously is fundamental to securing your supply chain.
        • 8. Consider the influence of foreign governments on suppliers and seek to ensure they operate with appropriate levels of autonomy.
        • 9. Consider if suppliers operate ethically, with integrity, and consistently with their human rights responsibilities.
        • 10. Build trusted, strategic relationships with suppliers
  • The United States’ (U.S.) Department of Justice (DOJ) announced that a member of a $100 million botnet conspiracy was sentenced to eight years in prison “for his role in operating a sophisticated scheme to steal and traffic sensitive personal and financial information in the online criminal underground.” The DOJ stated:
    • Aleksandr Brovko, 36, formerly of the Czech Republic, pleaded guilty in February to conspiracy to commit bank and wire fraud. According to court documents, Brovko was an active member of several elite, online forums designed for Russian-speaking cybercriminals to gather and exchange their criminal tools and services. 
    • As reflected in court documents, from 2007 through 2019, Brovko worked closely with other cybercriminals to monetize vast troves of data that had been stolen by “botnets,” or networks of infected computers.  Brovko, in particular, wrote software scripts to parse botnet logs and performed extensive manual searches of the data in order to extract easily monetized information, such as personally identifiable information and online banking credentials.  Brovko also verified the validity of stolen account credentials, and even assessed whether compromised financial accounts had enough funds to make it worthwhile to attempt to use the accounts to conduct fraudulent transactions. 
    • According to court documents, Brovko possessed and trafficked over 200,000 unauthorized access devices during the course of the conspiracy. These access devices consisted of either personally identifying information or financial account details. Under the U.S. Sentencing Guidelines, the estimated intended loss in this case has been calculated as exceeding $100 million.
  • The Office of the Privacy Commissioner of Canada (OPC), Office of the Information and Privacy Commissioner of Alberta (OIPC AB) and the Office of the Information and Privacy Commissioner for British Columbia (OIPC BC) found that “Cadillac Fairview – one of North America’s largest commercial real estate companies – embedded cameras inside their digital information kiosks at 12 shopping malls across Canada and used facial recognition technology without their customers’ knowledge or consent.”  The Commissioners asserted:
    • The goal, the company said, was to analyze the age and gender of shoppers and not to identify individuals. Cadillac Fairview also asserted that shoppers were made aware of the activity via decals it had placed on shopping mall entry doors that referred to their privacy policy – a measure the Commissioners determined was insufficient.
    • Cadillac Fairview also asserted that it was not collecting personal information, since the images taken by camera were briefly analyzed then deleted. However, the Commissioners found that Cadillac Fairview did collect personal information, and contravened privacy laws by failing to obtain meaningful consent as they collected the 5 million images with small, inconspicuous cameras. Cadillac Fairview also used video analytics to collect and analyze sensitive biometric information of customers.
    • The investigation also found that:
      • Facial recognition software was used to generate additional personal information about individual shoppers, including estimated age and gender.
      • While the images were deleted, investigators found that the sensitive biometric information generated from the images was being stored in a centralized database by a third party.
      • Cadillac Fairview stated that it was unaware that the database of biometric information existed, which compounded the risk of potential use by unauthorized parties or, in the case of a data breach, by malicious actors.
  • The United States (U.S.) Department of Defense (DOD) published its “DOD Electromagnetic Spectrum Superiority Strategy,” the purpose of which “is to align DOD electromagnetic spectrum (EMS) activities with the objectives of the 2017 National Security Strategy, the 2018 National Defense Strategy, and national economic and technology policy goals.” The DOD stated:
    • This Strategy embraces the enterprise approach required to ensure EMS superiority by integrating efforts to enhance near-term and long-term EMS capabilities, activities, and operations. The Strategy informs the Department’s domestic EMS access policies and reinforces the need to develop cooperative frameworks with other EMS stakeholders in order to advance shared national policy goals. The traditional functions of Electromagnetic Spectrum Management (EMSM) and Electromagnetic Warfare (EW)—integrated as Electromagnetic Spectrum Operations (EMSO)—are addressed within the document’s strategic goals. This 2020 Strategy builds upon the successes of and supersedes both the DOD’s 2013 EMS Strategy and 2017 EW Strategy.
    • The DOD concluded:
      • DOD faces rapidly increasing challenges to its historical EMS dominance due in part to increasingly complex EMOEs. Threats to DOD capabilities due to EMS vulnerabilities have become increasingly sophisticated and easily attainable. Commercial technology advancements are proliferating wireless devices and services that are eroding DOD’s freedom of action in the EMS. At the same time, the U.S. military has increasing spectrum requirements for the operations, testing, and training of advanced warfighting capabilities. Finally, DOD must exploit near-peer adversaries’ EMS vulnerabilities through advanced EW to offset their capacity overmatch.
      • To cope with these challenges and achieve the vision of Freedom of Action in the Electromagnetic Spectrum, the DOD will actively pursue the areas outlined herein. DOD will enhance the ability to plan, sense, manage, and control military operations with advanced EMS technologies to ensure EMS superiority. The Department will also proactively engage with spectrum policymakers and partners to ensure spectrum policies support U.S. capability requirements. DOD will perform the governance functions needed to ensure our efforts are aligned and coordinated to maximize the results of our efforts.
      • The NDS directs the Department to “determine an approach to enhancing the lethality of the joint force against high end competitors and the effectiveness of our military against a broad spectrum of potential threats.” Realization of the NDS requires DOD to actualize the vision of this DOD EMS Superiority Strategy by implementing its goals and objectives through an empowered EMS enterprise. Advancing how DOD conducts operations in the EMS, and generates EMS superiority, will be critical to the success of all future missions for the United States, its allies, and partners.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by skeeze from Pixabay

Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that these tech companies are more worried, as measured by resource allocation, about problematic domestic content.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of taken-down or suspended Russian accounts and are now claiming that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters towards protestors in Lafayette Park in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little-known and not publicly revealed facial recognition platform available to many federal, state, and local law enforcement agencies in the Capital area produced a match with the footage. Now, a man stands accused of crimes during a Black Lives Matter march, and civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds, including the demonstrated weakness of these systems in accurately identifying people of color, the fact that it has been used but not disclosed to a number of defendants, and the potential chilling effect it will have on people going to protests. Law enforcement officials claim there are strict privacy and process safeguards and that an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President antithetical to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle, as it will be in uncharted territory, trying to tamp down fear about unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even if it breaks the platform’s rules. Current and former employees and an analysis support this finding. The Trump family has been among those who have benefitted from the kid gloves used by the company regarding posts that would likely have resulted in consequences for other users. Smaller conservative outlets and less prominent conservative figures, however, have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager made a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
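To make concrete the self-preferencing Vestager describes, here is a toy sketch, invented for illustration and not drawn from any actual platform or from the draft Digital Markets Act, of a ranking function that quietly boosts a gatekeeper’s own listings above otherwise better-scoring rivals.

```python
# Toy illustration of self-preferencing in search ranking; the scoring model,
# boost factor, and listings are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    relevance: float       # how well the listing matches the query
    owned_by_platform: bool

def rank(listings: list[Listing], self_preference_boost: float = 0.0) -> list[Listing]:
    """Rank listings by relevance, optionally boosting the platform's own services."""
    def score(l: Listing) -> float:
        return l.relevance + (self_preference_boost if l.owned_by_platform else 0.0)
    return sorted(listings, key=score, reverse=True)

listings = [
    Listing("RivalShop", relevance=0.92, owned_by_platform=False),
    Listing("GatekeeperShop", relevance=0.80, owned_by_platform=True),
]

print([l.name for l in rank(listings)])                             # ['RivalShop', 'GatekeeperShop']
print([l.name for l in rank(listings, self_preference_boost=0.2)])  # ['GatekeeperShop', 'RivalShop']
```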
  • California Governor Gavin Newsom (D) vetoed one of the bills sent to him last week to amend the “California Consumer Privacy Act” (CCPA) (AB 375). In mid-October, he signed two bills amending the CCPA, but one will take effect only if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. Moreover, if voters approve the CPRA, the newly signed statutes would likely become dead letters, as the CCPA and its amendments will be superseded once the CPRA becomes effective in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281, which extends the CCPA’s exemption for employment-related information from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report, requested by the chair of a House committee, on a Federal Communications Commission (FCC) program that subsidizes broadband providers serving high-cost areas, typically rural or hard-to-reach communities. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks the data to determine whether the program’s goals are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC funds broadband and sets conditions on the use of those funds. (A toy illustration of the coverage-overstatement problem GAO flags appears after the excerpts below.)
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO recommended that:
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
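To illustrate the mechanism the GAO flags, whereby a single overstated coverage report can make an otherwise unserved area ineligible for high-cost support, here is a toy sketch; the eligibility rule and the data are simplified assumptions for illustration, not the FCC’s actual methodology.

```python
# Toy illustration of how overstated coverage reports can affect eligibility for
# high-cost support; the rule and data are simplified assumptions, not FCC methodology.

def eligible_for_support(coverage_reports: dict[str, list[str]]) -> dict[str, bool]:
    """Treat an area as eligible only if no carrier reports existing service there."""
    return {area: len(carriers) == 0 for area, carriers in coverage_reports.items()}

# Ground truth in this toy example: neither area actually has broadband service.
reports_accurate = {"census_block_A": [], "census_block_B": []}
# One carrier overstates its footprint in census_block_B (e.g., for marketing reasons).
reports_overstated = {"census_block_A": [], "census_block_B": ["Carrier X"]}

print(eligible_for_support(reports_accurate))    # both blocks eligible
print(eligible_for_support(reports_overstated))  # census_block_B now ineligible despite lacking service
```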
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximilian Schrems for litigating and winning his case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). It has been estimated that the legal costs arising from Schrems’ complaint against Facebook could total between €2 and €5 million, the low end of which would represent roughly 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Committee Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to open more than half of the Safety Band (i.e. the 5.9 GHz spectrum) to other wireless uses, even though the United States (U.S.) Department of Transportation weighed in against the proposal. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following (a quick band-math check follows their questions):
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
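As noted above, a quick back-of-the-envelope check of the band math, using the boundaries at issue (the 75 megahertz Safety Band at 5.850-5.925 GHz, of which the lower 45 megahertz would be opened to unlicensed use), confirms that the proposal covers more than half of the band.

```python
# Back-of-the-envelope check of the 5.9 GHz Safety Band split (figures in GHz).
safety_band = (5.850, 5.925)   # 75 MHz set aside in 1999 for transportation safety
unlicensed = (5.850, 5.895)    # 45 MHz the FCC proposes to open to unlicensed use
its_retained = (5.895, 5.925)  # 30 MHz retained for Intelligent Transportation Systems

def width_mhz(band: tuple[float, float]) -> float:
    """Width of a band in megahertz."""
    return round((band[1] - band[0]) * 1000, 1)

total = width_mhz(safety_band)
share = width_mhz(unlicensed) / total
print(width_mhz(unlicensed), "of", total, "MHz =", f"{share:.0%}")
# 45.0 of 75.0 MHz = 60%, i.e., more than half of the Safety Band
```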
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the case now before the Supreme Court of the United States that will determine the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who, as part of a sting operation, took money to use his access to Georgia’s license plate database to obtain information about a person. The Eleventh Circuit Court of Appeals rejected his appeal of his CFAA conviction, relying on a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.
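As a toy illustration of the Code’s point that analogue characteristics can leak digitally processed information, the simulation below models a device whose per-bit “power draw” is slightly higher when it processes a 1 than a 0; averaging noisy measurements recovers the secret bits. The model and its parameters are invented for illustration and are not drawn from the Code itself.

```python
# Toy simulation of a simple power-analysis-style leak: analogue measurements
# (noisy per-bit power readings) reveal the digital secret being processed.
# All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
secret_bits = rng.integers(0, 2, size=16)            # the "digital" secret being processed

def measure_trace(bits: np.ndarray, noise_sigma: float = 0.5) -> np.ndarray:
    """One noisy 'power trace': processing a 1 draws ~1.0 units, a 0 draws ~0.0."""
    return bits + rng.normal(scale=noise_sigma, size=bits.shape)

traces = np.array([measure_trace(secret_bits) for _ in range(200)])
recovered = (traces.mean(axis=0) > 0.5).astype(int)   # average out the noise, then threshold

print("secret   :", secret_bits)
print("recovered:", recovered)
print("match:", bool(np.array_equal(secret_bits, recovered)))
```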

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Pexels from Pixabay