Further Reading, Other Developments, and Coming Events (3 November)

Further Reading

  • “How Facebook and Twitter plan to handle election day disinformation” By Sam Dean — The Los Angeles Times; “How Big Tech is planning for election night” By Heather Kelly — The Washington Post; and “What to Expect From Facebook, Twitter and YouTube on Election Day” By Mike Isaac, Kate Conger and Daisuke Wakabayashi — The New York Times. Social media platforms have prepared for today and will use a variety of measures to combat lies, disinformation, and misinformation from sources foreign and domestic. Incidentally, my read is that, as measured by resource allocation, these tech companies are more worried about problematic domestic content.
  • “QAnon received earlier boost from Russian accounts on Twitter, archives show” By Joseph Menn — Reuters. Researchers have delved into Twitter’s archive of taken-down or suspended Russian accounts and now claim that the Russian Federation was pushing and amplifying the QAnon conspiracy in the United States (U.S.) as early as November 2017. This revised view of Russian involvement differs from conclusions reached in August that Russia played a small but significant role in the proliferation of the conspiracy.
  • “Facial recognition used to identify Lafayette Square protester accused of assault” By Justin Jouvenal and Spencer S. Hsu — The Washington Post. When police were firing pepper balls and rolling gas canisters towards protestors in Lafayette Park in Washington, D.C. on 1 June 2020, a protestor was filmed assaulting two officers. A little-known, never publicly revealed facial recognition platform available to many federal, state, and local law enforcement agencies in the Capital area produced a match from the footage. Now, a man stands accused of crimes during a Black Lives Matter march, and civil liberties and privacy advocates are objecting to the use of the National Capital Region Facial Recognition Investigative Leads System (NCRFRILS) on a number of grounds, including the demonstrated weakness of such systems in accurately identifying people of color, the fact that its use has not been disclosed to a number of defendants, and the potential chilling effect it will have on people attending protests. Law enforcement officials claim there are strict privacy and process safeguards and that an identification alone cannot be used as the basis of an arrest.
  • “CISA’s political independence from Trump will be an Election Day asset” By Joseph Marks — The Washington Post. The United States’ (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) has established its bona fides with nearly all stakeholders inside and outside the U.S. government since its creation in 2018. Many credit its Director, Christopher Krebs, and many also praise the agency for conveying information and tips about cyber and other threats without incurring the ire of a White House and President hostile to any hint of Russian hacking or interference in U.S. elections. However, today and the next few days may prove the agency’s mettle, as it will be in uncharted territory trying to tamp down fears stoked by unfounded rumors and looking to discredit misinformation and disinformation in close to real time.
  • “Trump allies, largely unconstrained by Facebook’s rules against repeated falsehoods, cement pre-election dominance” By Isaac Stanley-Becker and Elizabeth Dwoskin — The Washington Post. Yet another article showing that if Facebook has a bias, it is against liberal viewpoints and content, for conservative material is allowed even when it breaks the platform’s rules. Current and former employees and an analysis support this finding. The Trump family has been among those benefitting from the kid gloves the company uses for posts that would likely have brought consequences for other users. Smaller conservative outlets and less prominent conservative figures, however, have faced consequences for content that violates Facebook’s standards.

Other Developments

  • The European Union’s (EU) Executive Vice-President Margrethe Vestager made a speech titled “Building trust in technology,” in which she previewed one long-awaited draft EU law on technology and another to address antitrust and anti-competitive practices of large technology companies. Vestager stated “in just a few weeks, we plan to publish two draft laws that will help to create a more trustworthy digital world.” Both drafts are expected on 2 December and represent key pieces of the new EU leadership’s Digital Strategy, the bloc’s initiative to update EU laws to account for changes in technology since the beginning of the century. The Digital Services Act will address and reform the legal treatment of both online commerce and online content. The draft Digital Markets Act would give the European Commission (EC) more tools to combat the same competition and market dominance issues the United States (U.S.), Australia, and other nations are starting to tackle vis-à-vis companies like Apple, Amazon, Facebook, and Google.
    • Regarding the Digital Services Act, Vestager asserted:
      • One part of those rules will be the Digital Services Act, which will update the E-Commerce Directive, and require digital services to take more responsibility for dealing with illegal content and dangerous products. 
      • Those services will have to put clear and simple procedures in place, to deal with notifications about illegal material on their platforms. They’ll have to make it harder for dodgy traders to use the platform, by checking sellers’ IDs before letting them on the platform. And they’ll also have to provide simple ways for users to complain, if they think their material should not have been removed – protecting the right of legitimate companies to do business, and the right of individuals to freedom of expression. 
      • Those new responsibilities will help to keep Europeans just as safe online as they are in the physical world. They’ll protect legitimate businesses, which follow the rules, from being undercut by others who sell cheap, dangerous products. And by applying the same standards, all over Europe, they’ll make sure every European can rely on the same protection – and that digital businesses of all sizes can easily operate throughout Europe, without having to meet the costs of complying with different rules in different EU countries. 
      • The new rules will also require digital services – especially the biggest platforms – to be open about the way they shape the digital world that we see. They’ll have to report on what they’ve done to take down illegal material. They’ll have to tell us how they decide what information and products to recommend to us, and which ones to hide – and give us the ability to influence those decisions, instead of simply having them made for us. And they’ll have to tell us who’s paying for the ads that we see, and why we’ve been targeted by a certain ad.  
      • But to really give people trust in the digital world, having the right rules in place isn’t enough. People also need to know that those rules really work – that even the biggest companies will actually do what they’re supposed to. And to make sure that happens, there’s no substitute for effective enforcement.  
      • And effective enforcement is a vital part of the draft laws that we’ll propose in December. For instance, the Digital Services Act will improve the way national authorities cooperate, to make sure the rules are properly enforced, throughout the EU. Our proposal won’t change the fundamental principle, that digital services should be regulated by their home country. But it will set up a permanent system of cooperation that will help those regulators work more effectively, to protect consumers all across Europe. And it will give the EU power to step in, when we need to, to enforce the rules against some very large platforms.
    • Vestager also discussed the competition law the EC hopes to enact:
      • So, to keep our markets fair and open to competition, it’s vital that we have the right toolkit in place. And that’s what the second set of rules we’re proposing – what we call the Digital Markets Act – is for. 
      • That proposal will have two pillars. The first of those pillars will be a clear list of dos and don’ts for big digital gatekeepers, based on our experience with the sorts of behaviour that can stop markets working well. 
      • For instance, the decisions that gatekeepers take, about how to rank different companies in search results, can make or break businesses in dozens of markets that depend on the platform. And if platforms also compete in those markets themselves, they can use their position as player and referee to help their own services succeed, at the expense of their rivals. For instance, gatekeepers might manipulate the way that they rank different businesses, to show their own services more visibly than their rivals’. So, the proposal that we’ll put forward in a few weeks’ time will aim to ban this particular type of unfair self-preferencing. 
      • We also know that these companies can collect a lot of data about companies that rely on their platform – data which they can then use, to compete against those very same companies in other markets. That can seriously damage fairness in these markets – which is why our proposal aims to ban big gatekeepers from misusing their business users’ data in that way. 
      • These clear dos and don’ts will allow us to act much faster and more effectively, to tackle behaviour that we know can stop markets working well. But we also need to be ready for new situations, where digitisation creates deep, structural failures in the way our markets work.  
      • Once a digital company gets to a certain size, with the big network of users and the huge collections of data that brings, it can be very hard for anyone else to compete – even if they develop a much better service. So, we face a constant risk that big companies will succeed in pushing markets to a tipping point, sending them on a rapid, unstoppable slide towards monopoly – and creating yet another powerful gatekeeper. 
      • One way to deal with risks like this would be to stop those emerging gatekeepers from locking users into their platform. That could mean, for instance, that those gatekeepers would have to make it easier for users to switch platform, or to use more than one service. That would keep the market open for competition, by making it easier for innovative rivals to compete. But right now, we don’t have the power to take this sort of action when we see these risks arising. 
      • It can also be difficult to deal with large companies which use the power of their platforms again and again, to take over one related market after another. We can deal with that issue with a series of cases – but the risk is that we’ll always find ourselves playing catch-up, while platforms move from market to market, using the same strategies to drive out their rivals. 
      • The risk, though, is that we’ll have a fragmented system, with different rules in different EU countries. That would make it hard to tackle huge platforms that operate throughout Europe, and to deal with other problems that you find in digital markets in many EU countries. And it would mean that businesses and consumers across Europe can’t all rely on the same protection. 
      • That’s why the second pillar of the Digital Markets Act would put a harmonised market investigation framework in place across the single market, giving us the power to tackle market failures like this in digital markets, and stop new ones from emerging. That would give us a harmonised set of rules that would allow us to investigate certain structural problems in digital markets. And if necessary, we could take action to make these markets contestable and competitive.
  • California Governor Gavin Newsom (D) last week vetoed one of the bills sent to him to amend the “California Consumer Privacy Act” (CCPA) (AB 375). In mid-October, he signed two bills amending the CCPA, one of which will take effect only if the “California Privacy Rights Act” (CPRA) (Ballot Initiative 24) is not enacted by voters in the November election. Moreover, if the CPRA is enacted via ballot, the two other statutes would likely become dead law, as the CCPA and its amendments would be moot once the CPRA becomes effective in 2023.
    • Newsom vetoed AB 1138, which would have amended the recently effective “Parent’s Accountability and Child Protection Act” to bar those under the age of 13 from opening a social media account unless the platform obtained explicit consent from their parents. Moreover, “[t]he bill would deem a business to have actual knowledge of a consumer’s age if it willfully disregards the consumer’s age.” In his veto message, Newsom explained that while he agrees with the spirit of the legislation, it would create unnecessary confusion and overlap with federal law without any increase in protection for children. He signaled an openness to working with the legislature on this issue, however.
    • Newsom signed AB 1281, which extends the CCPA’s carveout for employment-related information by one year, from 1 January 2021 to 1 January 2022. The CCPA “exempts from its provisions certain information collected by a business about a natural person in the course of the natural person acting as a job applicant, employee, owner, director, officer, medical staff member, or contractor, as specified…[and also] exempts from specified provisions personal information reflecting a written or verbal communication or a transaction between the business and the consumer, if the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from that company, partnership, sole proprietorship, nonprofit, or government agency.” AB 1281 “shall become operative only” if the CPRA is not approved by voters.
    • Newsom also signed AB 713 that would:
      • except from the CCPA information that was deidentified in accordance with specified federal law, or was derived from medical information, protected health information, individually identifiable health information, or identifiable private information, consistent with specified federal policy, as provided.
      • except from the CCPA a business associate of a covered entity, as defined, that is governed by federal privacy, security, and data breach notification rules if the business associate maintains, uses, and discloses patient information in accordance with specified requirements.
      • except information that is collected for, used in, or disclosed in research, as defined.
      • additionally prohibit a business or other person from reidentifying information that was deidentified, unless a specified exception is met.
      • beginning January 1, 2021, require a contract for the sale or license of deidentified information to include specified provisions relating to the prohibition of reidentification, as provided.
  • The Government Accountability Office (GAO) published a report requested by the chair of a House committee on a Federal Communications Commission (FCC) program that subsidizes broadband providers serving high-cost areas, typically people in rural or hard-to-reach places. Even though the FCC provided nearly $5 billion in 2019 through the Universal Service Fund’s (USF) high-cost program, the agency lacks data to determine whether the goals of the program are being met. House Energy and Commerce Committee Chair Frank Pallone Jr (D-NJ) had asked for the assessment, which could well form the basis for future changes to how the FCC funds broadband and sets conditions on the use of those funds.
    • The GAO noted:
      • According to stakeholders GAO interviewed, FCC faces three key challenges to accomplish its high-cost program performance goals: (1) accuracy of FCC’s broadband deployment data, (2) broadband availability on tribal lands, and (3) maintaining existing fixed-voice infrastructure and attaining universal mobile service. For example, although FCC adopted a more precise method of collecting and verifying broadband availability data, stakeholders expressed concern the revised data would remain inaccurate if carriers continue to overstate broadband coverage for marketing and competitive reasons. Overstating coverage impairs FCC’s efforts to promote universal voice and broadband since an area can become ineligible for high-cost support if a carrier reports that service already exists in that area. FCC has also taken actions to address the lack of broadband availability on tribal lands, such as making some spectrum available to tribes for wireless broadband in rural areas. However, tribal stakeholders told GAO that some tribes are unable to secure funding to deploy the infrastructure necessary to make use of spectrum for wireless broadband purposes.
    • The GAO concluded:
      • The effective use of USF resources is critical given the importance of ensuring that Americans have access to voice and broadband services. Overall, FCC’s high-cost program performance goals and measures are often not conducive to providing FCC with high-quality performance information. The performance goals lack measurable or quantifiable bases upon which numeric targets can be set. Further, while FCC’s performance measures met most of the key attributes of effective measures, the measures often lacked linkage, clarity, and objectivity. Without such clarity and specific desired outcomes, FCC lacks performance information that could help FCC make better-informed decisions about how to allocate the program’s resources to meet ongoing and emerging challenges to ensuring universal access to voice and broadband services. Furthermore, the absence of public reporting of this information leaves stakeholders, including Congress, uncertain about the program’s effectiveness to deploy these services.
    • The GAO recommended that
      • The Chairman of FCC should revise the high-cost performance goals so that they are measurable and quantifiable.
      • The Chairman of FCC should ensure high-cost performance measures align with key attributes of successful performance measures, including ensuring that measures clearly link with performance goals and have specified targets.
      • The Chairman of FCC should ensure the high-cost performance measure for the goal of minimizing the universal service contribution burden on consumers and businesses takes into account user-fee leading practices, such as equity and sustainability considerations.
      • The Chairman of FCC should publicly and periodically report on the progress it has made for its high-cost program’s performance goals, for example, by including relevant performance information in its Annual Broadband Deployment Report or the USF Monitoring Report.
  • An Irish court has ruled that the Data Protection Commission (DPC) must cover the legal fees of Maximilian Schrems for litigating and winning his case in which the Court of Justice of the European Union (CJEU) struck down the adequacy decision underpinning the European Union-United States Privacy Shield (aka Schrems II). It has been estimated that Schrems’ legal costs for his complaint against Facebook could total between €2 million and €5 million, the low end of which would represent 10% of the DPC’s budget this year. In addition, the DPC has itself spent almost €3 million litigating the case.
  • House Transportation and Infrastructure Chair Peter DeFazio (D-OR) and Ranking Member Sam Graves (R-MO) asked that the Government Accountability Office (GAO) review the implications of the Federal Communications Commission’s (FCC) 2019 proposal to allow half of the Safety Band (i.e. 5.9 GHz spectrum) to be used for wireless communications even though the United States (U.S.) Department of Transportation weighed in against this decision. DeFazio and Graves are asking for a study of “the safety implications of sharing more than half of the 5.9 GHz spectrum band” that is currently “reserved exclusively for transportation safety purposes.” This portion of the spectrum was set aside in 1999 for connected vehicle technologies, according to DeFazio and Graves, and given the rise of automobile technology that requires spectrum, they think the FCC’s proposed proceeding “may significantly affect the efficacy of current and future applications of vehicle safety technologies.” DeFazio and Graves asked the GAO to review the following:
    • What is the current status of 5.9 GHz wireless transportation technologies, both domestically and internationally?
    • What are the views of industry and other stakeholders on the potential uses of these technologies, including scenarios where the 5.9 GHz spectrum is shared among different applications?
    • In a scenario in which the 5.9 GHz spectrum band is shared among different applications, what effect would this have on the efficacy, including safety, of current wireless transportation technology deployments?
    • What options exist for automakers and highway management agencies to ensure the safe deployment of connected vehicle technologies in a scenario in which the 5.9 GHz spectrum band is shared among different applications?
    • How, if at all, have DOT and FCC assessed current and future needs for 5.9 GHz spectrum in transportation applications? 
    • How, if at all, have DOT and FCC coordinated to develop a federal spectrum policy for connected vehicles?
  • The Harvard Law School’s Cyberlaw Clinic and the Electronic Frontier Foundation (EFF) have published “A Researcher’s Guide to Some Legal Risks of Security Research,” which is timely given the case before the Supreme Court of the United States concerning the scope of the “Computer Fraud and Abuse Act” (CFAA). The Cyberlaw Clinic and EFF stated:
    • Just last month, over 75 prominent security researchers signed a letter urging the Supreme Court not to interpret the CFAA, the federal anti-hacking/computer crime statute, in a way that would criminalize swaths of valuable security research.
    • In the report, the Cyberlaw Clinic and EFF explained:
      • This guide overviews broad areas of potential legal risk related to security research, and the types of security research likely implicated. We hope it will serve as a useful starting point for concerned researchers and others. While the guide covers what we see as the main areas of legal risk for security researchers, it is not exhaustive.
    • In Van Buren v. United States, the Court will consider the question of “[w]hether a person who is authorized to access information on a computer for certain purposes violates Section 1030(a)(2) of the Computer Fraud and Abuse Act if he accesses the same information for an improper purpose.” In this case, the defendant was a police officer who took money as part of a sting operation to illegally use his access to Georgia’s database of license plates to obtain information about a person. The Eleventh Circuit Court of Appeals denied his appeal of his conviction under the CFAA per a previous ruling in that circuit that “a defendant violates the CFAA not only when he obtains information that he has no “rightful[]” authorization whatsoever to acquire, but also when he obtains information “for a nonbusiness purpose.”
    • As the Cyberlaw Clinic and EFF noted in the report:
      • Currently, courts are divided about whether the statute’s prohibition on “exceed[ing] authorized access” applies to people who have authorization to access data (for some purpose), but then access it for a (different) purpose that violates a contractual terms of service or computer use policy. Some courts have found that the CFAA covers activities that do not circumvent any technical access barriers, from making fake profiles that violate Facebook’s terms of service to running a search on a government database without permission. Other courts, disagreeing with the preceding approach, have held that a verbal or contractual prohibition alone cannot render access punishable under the CFAA.
  • The United Kingdom’s Institution of Engineering and Technology (IET) and the National Cyber Security Centre (NCSC) have published “Code of Practice: Cyber Security and Safety” to help engineers develop technological and digital products with both safety and cybersecurity in mind. The IET and NCSC explained:
    • The aim of this Code is to help organizations accountable for safety-related systems manage cyber security vulnerabilities that could lead to hazards. It does this by setting out principles that when applied will improve the interaction between the disciplines of system safety and cyber security, which have historically been addressed as distinct activities.
    • The objectives for organizations that follow this Code are: (a) to manage the risk to safety from any insecurity of digital technology; (b) to be able to provide assurance that this risk is acceptable; and (c) where there is a genuine conflict between the controls used to achieve safety and security objectives, to ensure that these are properly resolved by the appropriate authority.
    • It should be noted that the focus of this Code is on the safety and security of digital technology, but it is recognized that addressing safety and cyber security risk is not just a technological issue: it involves people, process, physical and technological aspects. It should also be noted that whilst digital technology is central to the focus, other technologies can be used in pursuit of a cyber attack. For example, the study of analogue emissions (electromagnetic, audio, etc.) may give away information being digitally processed and thus analogue interfaces could be used to provide an attack surface.

Coming Events

  • On 10 November, the Senate Commerce, Science, and Transportation Committee will hold a hearing to consider nominations, including Nathan Simington’s to be a Member of the Federal Communications Commission.
  • On 17 November, the Senate Judiciary Committee will reportedly hold a hearing with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey on Section 230 and how their platforms chose to restrict The New York Post article on Hunter Biden.
  • On 18 November, the Federal Communications Commission (FCC) will hold an open meeting and has released a tentative agenda:
    • Modernizing the 5.9 GHz Band. The Commission will consider a First Report and Order, Further Notice of Proposed Rulemaking, and Order of Proposed Modification that would adopt rules to repurpose 45 megahertz of spectrum in the 5.850-5.895 GHz band for unlicensed operations, retain 30 megahertz of spectrum in the 5.895-5.925 GHz band for the Intelligent Transportation Systems (ITS) service, and require the transition of the ITS radio service standard from Dedicated Short-Range Communications technology to Cellular Vehicle-to-Everything technology. (ET Docket No. 19-138)
    • Further Streamlining of Satellite Regulations. The Commission will consider a Report and Order that would streamline its satellite licensing rules by creating an optional framework for authorizing space stations and blanket-licensed earth stations through a unified license. (IB Docket No. 18-314)
    • Facilitating Next Generation Fixed-Satellite Services in the 17 GHz Band. The Commission will consider a Notice of Proposed Rulemaking that would propose to add a new allocation in the 17.3-17.8 GHz band for Fixed-Satellite Service space-to-Earth downlinks and to adopt associated technical rules. (IB Docket No. 20-330)
    • Expanding the Contribution Base for Accessible Communications Services. The Commission will consider a Notice of Proposed Rulemaking that would propose expansion of the Telecommunications Relay Services (TRS) Fund contribution base for supporting Video Relay Service (VRS) and Internet Protocol Relay Service (IP Relay) to include intrastate telecommunications revenue, as a way of strengthening the funding base for these forms of TRS and making it more equitable without increasing the size of the Fund itself. (CG Docket Nos. 03-123, 10-51, 12-38)
    • Revising Rules for Resolution of Program Carriage Complaints. The Commission will consider a Report and Order that would modify the Commission’s rules governing the resolution of program carriage disputes between video programming vendors and multichannel video programming distributors. (MB Docket Nos. 20-70, 17-105, 11-131)
    • Enforcement Bureau Action. The Commission will consider an enforcement action.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.


Further Reading, Other Developments, and Coming Events (14 October)

Further Reading

  • “The Man Who Speaks Softly—and Commands a Big Cyber Army” By Garrett Graff — WIRED. A profile of General Paul Nakasone, the leader of both the United States’ National Security Agency (NSA) and Cyber Command, who has operated mostly in the background during the tumultuous Trump Administration. He has likely set the template for both organizations going forward for some time. A fascinating read, chock full of insider details.
  • “Facebook Bans Anti-Vaccination Ads, Clamping Down Again” By Mike Isaac — The New York Times. In another sign of the social media platform responding to pressure in the United States and Europe, Facebook announced that anti-vaccination advertisements would no longer be accepted. This follows bans on Holocaust denial and QAnon material. Of course, this newest announcement is a classic Facebook half-step: only paid advertisements will be banned, and users can continue to post about their opposition to vaccination.
  • “To Mend a Broken Internet, Create Online Parks” By Eli Pariser — WIRED. An interesting argument that a public online space maintained by the government, much like parks or public libraries, may be just what democracies across the globe need to roll back the tide of extremism and division.
  • “QAnon is tearing families apart” By Travis Andrews — The Washington Post. This is a terrifying tour through the fallout of the QAnon conspiracy, which sucks some people in so deeply that they become only marginally connected to reality.
  • “AT&T has trouble figuring out where it offers government-funded Internet” By John Brodkin — Ars Technica. So, yeah, about all that government cash given to big telecom companies that was supposed to bring more broadband coverage. Turns out, they definitely took the cash. The broadband service has been a much more elusive thing to verify. In one example, AT&T may or may not have provided service to 133,000 households in Mississippi after receiving funds from the Federal Communications Commission (FCC). Mississippi state authorities are arguing most of the service is non-existent. AT&T is basically saying it’s all a misunderstanding.

Other Developments

  • The California Attorney General’s Office (AG) has released yet another revision of the regulations necessary to implement the “California Consumer Privacy Act” (CCPA) (AB 375), and comments are due by 28 October. Of course, if Proposition 24 passes next month, the “California Privacy Rights Act” will largely replace the CCPA, requiring the drafting of even more regulations. Nonetheless, what everyone thought was the final set of CCPA regulations took effect on 14 August, but the notice from the Office of Administrative Law revealed that the AG had withdrawn four portions of the proposed regulations. In the new draft regulations, the AG explained:
    • Proposed section 999.306, subd. (b)(3), provides examples of how businesses that collect personal information in the course of interacting with consumers offline can provide the notice of right to opt-out of the sale of personal information through an offline method.
    • Proposed section 999.315, subd. (h), provides guidance on how a business’s methods for submitting requests to opt-out should be easy and require minimal steps. It provides illustrative examples of methods designed with the purpose or substantial effect of subverting or impairing a consumer’s choice to opt-out.
    • Proposed section 999.326, subd. (a), clarifies the proof that a business may require an authorized agent to provide, as well as what the business may require a consumer to do to verify their request.
    • Proposed section 999.332, subd. (a), clarifies that businesses subject to either section 999.330, section 999.331, or both of these sections are required to include a description of the processes set forth in those sections in their privacy policies.
  • Facebook announced an update to its “hate speech policy to prohibit any content that denies or distorts the Holocaust.” Facebook claimed:
    • Following a year of consultation with external experts, we recently banned anti-Semitic stereotypes about the collective power of Jews that often depicts them running the world or its major institutions.  
    • Today’s announcement marks another step in our effort to fight hate on our services. Our decision is supported by the well-documented rise in anti-Semitism globally and the alarming level of ignorance about the Holocaust, especially among young people. According to a recent survey of adults in the US aged 18-39, almost a quarter said they believed the Holocaust was a myth, that it had been exaggerated or they weren’t sure.
  • In a 2018 interview, Facebook CEO Mark Zuckerberg asserted:
    • I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong…
    • What we will do is we’ll say, “Okay, you have your page, and if you’re not trying to organize harm against someone, or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.” But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed.
    • He clarified in a follow-up email:
      • I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that.
      • Our goal with fake news is not to prevent anyone from saying something untrue — but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course if a post crossed line into advocating for violence or hate against a particular group, it would be removed. These issues are very challenging but I believe that often the best way to fight offensive bad speech is with good speech.
  • The Government Accountability Office (GAO) issued an evaluation of the Trump Administration’s 5G Strategy and found more processes and actions are needed if this plan to vault the United States (U.S.) ahead of other nations is to come to fruition. Specifically, the “report examines the extent to which the Administration has developed a national strategy on 5G that address[es] our six desirable characteristics of an effective national strategy.” The GAO identified the six desirable characteristics: (1) purpose, scope, and methodology; (2) problem definition and risk assessment; (3) goals, subordinate objectives, activities, and performance measures; (4) resources, investments, and risk management; (5) organizational roles, responsibilities, and coordination; and (6) integration and implementation. However, this assessment is necessarily limited, for National Security Council staff took the highly unusual approach of not engaging with the GAO, which may be another norm broken by the Trump Administration. The GAO stated “[t]he March 2020 5G national strategy partially addresses five of our desirable characteristics of an effective national strategy and does not address one.”
    • The GAO explained:
      • According to National Telecommunications and Information Administration (NTIA) and Office of Science and Technology Policy (OSTP) officials, the 5G national strategy was intentionally written to be at a high level and as a result, it may not include all elements of our six desirable characteristics of national strategies. These officials stated that the 5G implementation plan required by the Secure 5G and Beyond Act of 2020 is expected to include specific details, not covered in the 5G national strategy, on the U.S. government’s response to 5G risks and challenges. The implementation plan is expected to align and correspond to the lines of effort in the 5G national strategy. NTIA officials told us that the implementation plan to the 5G national strategy would be finalized by the end of October 2020. However, the officials we spoke to were unable to provide details on the final content of the implementation plan such as whether the plan would include all elements of our six desirable characteristics of national strategies given that it was not final. National strategies and their implementation plans should include all elements of the six desirable characteristics to enhance their usefulness as guidance and to ensure accountability and coordinate investments. Until the administration ensures that the implementation plan includes all elements of the six desirable characteristics, the guidance the plan provides decision makers in allocating resources to address 5G risks and challenges will likely be limited.
  • The Irish Council for Civil Liberties (ICCL) wrote the European Commission (EC) to make the case that the United Kingdom (UK) is not deserving of an adequacy decision after Brexit because of institutional and cultural weaknesses at the Information Commissioner’s Office (ICO). The ICCL argued that the ICO has been one of the most ineffectual enforcers of the General Data Protection Regulation (GDPR), especially with respect to what the ICCL called the largest data infringement under the GDPR and the largest data breach of all time: Real-Time Bidding. The ICCL took the ICO to task for not following through on fining companies for GDPR violations and for having a tiny staff dedicated to data protection and technology issues. The ICCL invoked Article 45 of the GDPR to encourage the EC to deny the UK the adequacy decision it would need in order to transfer the personal data of EU residents to the UK.
  • In an unrelated development, the Information Commissioner’s Office (ICO) wrapped up its investigation into Facebook and Cambridge Analytica and detailed its additional findings in a letter to the Digital, Culture, Media and Sport Select Committee in the House of Commons. ICO head Elizabeth Denham asserted:
    • [w]e concluded that SCL Elections Ltd and Cambridge Analytica (SCL/CA) were purchasing significant volumes of commercially available personal data (at one estimate over 130 billion data points), in the main about millions of US voters, to combine it with the Facebook derived insight information they had obtained from an academic at Cambridge University, Dr Aleksandr Kogan, and elsewhere. In the main their models were also built from ‘off the shelf’ analytical tools and there was evidence that their own staff were concerned about some of the public statements the leadership of the company were making about their impact and influence.
    • From my review of the materials recovered by the investigation I have found no further evidence to change my earlier view that SCL/CA were not involved in the EU referendum campaign in the UK – beyond some initial enquiries made by SCL/CA in relation to UKIP data in the early stages of the referendum process. This strand of work does not appear to have then been taken forward by SCL/CA.
    • I have concluded my wider investigations of several organisations on both the remain and the leave side of the UK’s referendum about membership of the EU. I identified no significant breaches of the privacy and electronic marketing regulations and data protection legislation that met the threshold for formal regulatory action. Where the organisation continued in operation, I have provided advice and guidance to support better future compliance with the rules.
    • During the investigation concerns about possible Russian interference in elections globally came to the fore. As I explained to the sub-committee in April 2019, I referred details of reported possible Russia-located activity to access data linked to the investigation to the National Crime Agency. These matters fall outside the remit of the ICO. We did not find any additional evidence of Russian involvement in our analysis of material contained in the SCL / CA servers we obtained.
  • The United States Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) issued a joint cybersecurity advisory regarding “recently observed advanced persistent threat (APT) actors exploiting multiple legacy vulnerabilities in combination with a newer privilege escalation vulnerability.” CISA and the FBI revealed that these tactics have penetrated systems related to elections but claimed there has been no degrading of the integrity of electoral systems.
  • The agencies stated:
    • The commonly used tactic, known as vulnerability chaining, exploits multiple vulnerabilities in the course of a single intrusion to compromise a network or application. 
    • This recent malicious activity has often, but not exclusively, been directed at federal and state, local, tribal, and territorial (SLTT) government networks. Although it does not appear these targets are being selected because of their proximity to elections information, there may be some risk to elections information housed on government networks.
    • CISA is aware of some instances where this activity resulted in unauthorized access to elections support systems; however, CISA has no evidence to date that integrity of elections data has been compromised.
  • Canada’s Privacy Commissioner Daniel Therrien released the “2019-2020 Annual Report to Parliament on the Privacy Act and Personal Information Protection and Electronic Documents Act” and asserted:
    • Technologies have been very useful in halting the spread of COVID-19 by allowing essential activities to continue safely. They can and do serve the public good.
    • At the same time, however, they raise new privacy risks. For example, telemedicine creates risks to doctor-patient confidentiality when virtual platforms involve commercial enterprises. E-learning platforms can capture sensitive information about students’ learning disabilities and other behavioural issues.
    • As the pandemic speeds up digitization, basic privacy principles that would allow us to use public health measures without jeopardizing our rights are, in some cases, best practices rather than requirements under the existing legal framework.
    • We see, for instance, that the law has not properly contemplated privacy protection in the context of public-private partnerships, nor does it mandate app developers to consider Privacy by Design, or the principles of necessity and proportionality.
    • The law is simply not up to protecting our rights in a digital environment. Risks to privacy and other rights are heightened by the fact that the pandemic is fueling rapid societal and economic transformation in a context where our laws fail to provide Canadians with effective protection.
    • In our previous annual report, we shared our vision of how best to protect the privacy rights of Canadians and called on parliamentarians to adopt rights-based privacy laws.
    • We noted that privacy is a fundamental human right (the freedom to live and develop free from surveillance). It is also a precondition for exercising other human rights, such as equality rights in an age when machines and algorithms make decisions about us, and democratic rights when technologies can thwart democratic processes.
    • Regulating privacy is essential not only to support electronic commerce and digital services; it is a matter of justice.

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The House Intelligence Committee will conduct a virtual hearing titled “Misinformation, Conspiracy Theories, and ‘Infodemics’: Stopping the Spread Online.”
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11- 42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next- generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On October 29, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”
  • The Senate Commerce, Science, and Transportation Committee will reportedly hold a hearing on 29 October regarding 47 U.S.C. 230 with testimony from:
    • Jack Dorsey, Chief Executive Officer of Twitter;
    • Sundar Pichai, Chief Executive Officer of Alphabet Inc. and its subsidiary, Google; and 
    • Mark Zuckerberg, Chief Executive Officer of Facebook.



Further Reading, Other Developments, and Coming Events (8 October)

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11- 42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next- generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On October 29, the Federal Trade Commission (FTC) will hold a seminar titled “Green Lights & Red Flags: FTC Rules of the Road for Business workshop” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • Harvard University’s Berkman Klein Center for Internet & Society published a study, “Mail-In Voter Fraud: Anatomy of a Disinformation Campaign,” which found a concerted, almost certainly coordinated campaign led by President Donald Trump, the Republican Party, and conservative media outlets to claim, against all evidence, that mail voting is rife with fraud. The study points to structural issues in the United States (U.S.) and the broader media ecosystem that allow parties to disseminate disinformation and propaganda. The authors found traditional print and television media more effective, and more complicit, in spreading lies and disinformation than social media platforms like Facebook and Twitter. The Berkman Klein Center explained:
    • The claim that election fraud is a major concern with mail-in ballots has become the central threat to election participation during the Covid-19 pandemic and to the legitimacy of the outcome of the election across the political spectrum. President Trump has repeatedly cited his concerns over voter fraud associated with mail-in ballots as a reason that he may not abide by an adverse electoral outcome. Polling conducted in September 2020 suggests that nearly half of Republicans agree with the president that election fraud is a major concern associated with expanded mail-in voting during the pandemic. Few Democrats share that belief. Despite the consensus among independent academic and journalistic investigations that voter fraud is rare and extremely unlikely to determine a national election, tens of millions of Americans believe the opposite. This is a study of the disinformation campaign that led to widespread acceptance of this apparently false belief and to its partisan distribution pattern. Contrary to the focus of most contemporary work on disinformation, our findings suggest that this highly effective disinformation campaign, with potentially profound effects for both participation in and the legitimacy of the 2020 election, was an elite-driven, mass-media led process. Social media played only a secondary and supportive role.
    • Our results are based on analyzing over fifty-five thousand online media stories, five million tweets, and seventy-five thousand posts on public Facebook pages garnering millions of engagements. They are consistent with our findings about the American political media ecosystem from 2015-2018, published in Network Propaganda, in which we found that Fox News and Donald Trump’s own campaign were far more influential in spreading false beliefs than Russian trolls or Facebook clickbait artists. This dynamic appears to be even more pronounced in this election cycle, likely because Donald Trump’s position as president and his leadership of the Republican Party allow him to operate directly through political and media elites, rather than relying on online media as he did when he sought to advance his then-still-insurgent positions in 2015 and the first half of 2016.
    • Our findings here suggest that Donald Trump has perfected the art of harnessing mass media to disseminate and at times reinforce his disinformation campaign by using three core standard practices of professional journalism. These three are: elite institutional focus (if the President says it, it’s news); headline seeking (if it bleeds, it leads); and balance, neutrality, or the avoidance of the appearance of taking a side. He uses the first two in combination to summon coverage at will, and has used them continuously to set the agenda surrounding mail-in voting through a combination of tweets, press conferences, and television interviews on Fox News. He relies on the latter professional practice to keep audiences that are not politically pre-committed and have relatively low political knowledge confused, because it limits the degree to which professional journalists in mass media organizations are willing or able to directly call the voter fraud frame disinformation. The president is, however, not acting alone. Throughout the first six months of the disinformation campaign, the Republican National Committee (RNC) and staff from the Trump campaign appear repeatedly and consistently on message at the same moments, suggesting an institutionalized rather than individual disinformation campaign. The efforts of the president and the Republican Party are supported by the right-wing media ecosystem, primarily Fox News and talk radio functioning in effect as a party press. These reinforce the message, provide the president a platform, and marginalize or attack those Republican leaders or any conservative media personalities who insist that there is no evidence of widespread voter fraud associated with mail-in voting.
    • The primary cure for the elite-driven, mass media communicated information disorder we observe here is unlikely to be more fact checking on Facebook. Instead, it is likely to require more aggressive policing by traditional professional media, the Associated Press, the television networks, and local TV news editors of whether and how they cover Trump’s propaganda efforts, and how they educate their audiences about the disinformation campaign the president and the Republican Party have waged.
  • The Senate Minority Leader and the top Democrats on three committees sent a letter to the acting Secretary of Homeland Security asking him to “release a document that shows President Donald Trump’s attacks on American Elections are consistent with a foreign influence campaign.” Senate Minority Leader Chuck Schumer (D-NY), Senate Intelligence Committee Ranking Member Mark Warner (D-VA), Senate Rules Committee Ranking Member Amy Klobuchar (D-MN), Senate Homeland Security and Governmental Affairs Committee Ranking Member Gary Peters (D-MI), and Senator Ron Wyden (D-OR) wrote to acting Secretary of Homeland Security Chad Wolf:
    • We write to urge you to immediately release to the public a September 3, 2020, analysis produced by the Department’s Office of Intelligence and Analysis.  This document demonstrates that a foreign actor is attempting to undermine faith in the US electoral system, particularly vote-by-mail systems, in a manner that is consistent with the rhetoric being used by President Trump, Attorney General Barr, and others.
    • The document has been marked ‘Unclassified/For Official Use Only,’ meaning that its release would not pose a risk to sources and methods and that it has already been widely distributed around the country through unclassified channels. It is now critical and urgent that the American people have access to this document so that they can understand the context of Trump’s statements and actions.
  • Representatives Abigail Spanberger (D-VA) and John Katko (R-NY) introduced the “Foreign Agent Disclaimer Enhancement (FADE) Act” “to protect against the influence of foreign nations that seek to weaken the U.S. electoral system and sow division through online disinformation campaigns.” This bill would close a loophole in the Foreign Agents Registration Act (FARA), which currently does not require foreign agents to attach disclaimers to social media posts intended to persuade Americans, as they must for other forms of communication. They provided the context for the legislation:
    • This week, the Federal Bureau of Investigation alerted Twitter that accounts likely based in Iran attempted to spread disinformation during the U.S. presidential debate.
    • An April 2020 State Department report warned that China, Iran, and Russia are using the COVID-19 crisis to launch a propaganda and disinformation onslaught against the United States.
    • Spanberger and Katko summarized the bill in their press release:
      • The Foreign Agent Disclaimer Enhancement (FADE) Act would increase transparency by requiring disclaimers attributing political content to a foreign principal be embedded on the face of a social media post itself. With this new requirement, disclaimers would remain with a post whenever the post is subsequently shared. The FADE Act would also clarify that these disclaimer requirements apply to the internet and apply to any political communications directed at the United States — regardless of the foreign agent’s location around the world.
      • To ensure enforcement of these new transparency measures, the FADE Act would require the Department of Justice (DOJ) to notify online platforms if a foreign agent does not meet disclaimer requirements for posts on their platforms, and in these cases, require the platform to remove the materials and use reasonable efforts to inform recipients of the materials that the information they saw was disseminated by a foreign agent. Additionally, the bipartisan bill would require DOJ to prepare a report to Congress on enforcement challenges.
  • Europol issued its annual “Internet Organised Crime Threat Assessment (IOCTA) 2020” that “provides a unique law enforcement-focused assessment of emerging challenges and key developments in the area of cybercrime” in the European Union (EU).
  • Europol highlighted its findings:
    • Cross-Cutting Crime Facilitators And Challenges To Criminal Investigations
      • Social engineering remains a top threat to facilitate other types of cybercrime.
      • Cryptocurrencies continue to facilitate payments for various forms of cybercrime, as developments evolve with respect to privacy-oriented crypto coins and services.
      • Challenges with reporting hinder the ability to create an accurate overview of crime prevalence across the EU.
    • Cyber-Dependent Crime
      • Ransomware remains the most dominant threat as criminals increase pressure by threatening publication of data if victims do not pay.
      • Ransomware on third-party providers also creates potential significant damage for other organisations in the supply chain and critical infrastructure.
      • Emotet is omnipresent given its versatile use and leads the way as the benchmark of modern malware.
      • The threat potential of Distributed Denial of Service (DDoS) attacks is higher than its current impact in the EU.
    • Child Sexual Exploitation Online
      • The amount of online child sexual abuse material (CSAM) detected continues to increase, further exacerbated by the COVID-19 crisis, which has serious consequences for the capacity of law enforcement authorities.
      • The use of encrypted chat apps and industry proposals to expand this market pose a substantial risk for abuse and make it more difficult for law enforcement to detect and investigate online child sexual exploitation (CSE) activities.
      • Online offender communities exhibit considerable resilience and are continuously evolving.
      • Livestreaming of child sexual abuse continues to increase and became even more prevalent during the COVID-19 crisis.
      • The commercialisation of online CSE is becoming a more widespread issue, with individuals uploading material to hosting sites and subsequently acquiring credit on the basis of the number of downloads.
    • Payment Fraud
      • SIM swapping is a key trend that allows perpetrators to take over accounts and has demonstrated a steep rise over the last year.
      • Business email compromise (BEC) remains an area of concern as it has increased, grown in sophistication, and become more targeted.
      • Online investment fraud is one of the fastest growing crimes, generating millions in losses and affecting thousands of victims.
      • Card-not-present (CNP) fraud continues to increase as criminals diversify in terms of target sectors and electronic skimming (e-skimming) modi operandi.
    • The Criminal Abuse Of The Darkweb
      • The Darkweb environment has remained volatile, lifecycles of Darkweb market places have shortened, and no clear dominant market has risen over the past year compared to previous years to fill the vacuum left by the takedowns in 2019.
      • The nature of the Darkweb community at administrator-level shows how adaptive it is under challenging times, including more effective cooperation in the search for better security solutions and safe Darkweb interaction.
      • There has been an increase in the use of privacy-enhanced cryptocurrencies and an emergence of privacy-enhanced coinjoin concepts, such as Wasabi and Samurai.
      • Surface web e-commerce sites and encrypted communication platforms offer an additional dimension to Darkweb trading to enhance the overall business model.
  • “43 center-right organizations, think tanks, and policy experts” wrote Senate Majority Whip John Thune (R-SD) to thank him “for his leadership and support for the American competitive approach to 5G deployment.” Last week, Thune and 18 Republican colleagues sent President Donald Trump a letter “to express our concerns about a Request For Information (RFI) released by the Department of Defense (DOD) that contradicts the successful free-market strategy you have embraced for 5G.” Late last month, the DOD released an RFI on the possibility of the agency sharing its prized portions of electromagnetic spectrum with commercial providers to speed the development and adoption of 5G in the United States.
    • The 43 groups argued:
      • We too are concerned with the Department of Defense Request for Information on a government-managed process for 5G development and are alarmed with how quickly it is proceeding.  Even more disturbing are the rumors that the RFI was only for show and that the DoD already has an RFP it plans to greenlight. 
      • A government-run 5G backbone, wholesale network, or whatever name it goes by, is nationalization of private business. Spectrum sharing is something that must be considered as the nation moves forward with private networks, but it is not a reason for a government takeover. For a government-run network to happen, the federal government would have to either renege on licenses granted to private users or hoard spectrum at the expense of private industry. Either approach would upend well-established licensure policies at the FCC that establish certainty in operating and maintaining complex networks and create massive unnecessary delays to launching 5G networks. Moreover, the government should not be in the business of “competing” with private industry. That’s the business model of China and Russia, not the United States. 
  • The top Democrat on the Senate Intelligence Committee wrote Facebook, Twitter, and Google, urging the companies “to implement robust accountability and transparency standards ahead of the November election, including requirements outlined in the Honest Ads Act…to help prevent foreign interference in elections and improve the transparency of online political advertisements” according to his press release. Senator Mark Warner (D-VA) asserted that “[i]n individual letters to Facebook, Google, and Twitter, [he] detailed the various ways in which each company continues to contribute to the spread of disinformation, viral misinformation, and voter suppression efforts.” Warner “also warned about the imminent risk of bad actors once again weaponizing American-bred social media tools to undermine democracy ahead of the November election, and urged each company to take proactive measures to safeguard against these efforts.” Warner specified:
    • In his letter to Facebook, [he] criticized the platform’s efforts to label manipulated or synthetic content, describing these as “wholly inadequate.” He also raised alarm with instances of Facebook’s amplification of harmful content.
    • Similarly, in a letter to Google, [he] raised concern with the company’s efforts to combat harmful misinformation – particularly disinformation about voting, spread by right-leaning YouTube channels. He also criticized the comprehensiveness of Google’s ad archive, which presently excludes issue ads.
    • In his letter to Twitter, which has banned paid political content and placed restrictions on cause-based advertising, [he] noted that doctored political content continues to spread organically without adequate labeling that slows its spread or contextualizes it for users.
  • Representative Lauren Underwood (D-IL), the new Chair of the House Homeland Security Committee’s Cybersecurity, Infrastructure Protection, and Innovation Subcommittee, wrote Facebook, Twitter, and YouTube, urging them “to address ongoing reports of election-related disinformation targeting Black voters on their platforms” per her press release. She argued “[d]uring the 2016 election, social media platforms were used by malicious actors attempting to silence Black voters and sow racial division…[and] [f]our years later, social media companies have made too little progress toward containing this growing threat.” Underwood “requested information on the steps the companies are taking to prevent voter suppression, interference, and disinformation targeting Black voters.”

Further Reading

  • Judge Orders Twitter To Unmask FBI Impersonator Who Set Off Seth Rich Conspiracy” By Bobby Allyn — NPR. A magistrate judge in California denied Twitter’s motion to quash a subpoena seeking the account information of an anonymous user who spread lies about deceased Democratic National Committee staffer Seth Rich and his family in connection with the Russian Federation’s interference in the 2016 election.
  • Justices wary of upending tech industry in Google v. Oracle Supreme Court fight” By Tucker Higgins — CNBC. This week, the Supreme Court of the United States heard oral arguments in the decade-long legal war between Google and Oracle arising from the latter’s claim that the former infringed its ownership rights by using roughly 11,500 lines of code to create its Android operating system from an application programming interface developed by Sun Microsystems, a company bought by Oracle. This case could have huge ramifications for the technology industry if Oracle wins because it could make the development of new products and services much harder.
  • Facebook to temporarily halt political ads in U.S. after polls close Nov. 3, broadening earlier restrictions” By Elizabeth Dwoskin — The Washington Post. Facebook announced it will not accept political or issue advertising in the week after election day, the latest measure the platform has rolled out to address misinformation and disinformation. Facebook will also label posts in which candidates claim an election has been decided when it, in fact, has not been, and it will remove posts that aim to intimidate voters or suppress turnout.
  • Leaked: Confidential Amazon memo reveals new software to track unions” By Jason Del Rey and Shirin Ghaffary — recode. The tech giant is turning its data collection and analysis capabilities on its workforce in an effort to prevent unionizing at the United States’ (U.S.) second largest employer.
  • QAnon High Priest Was Just Trolling Away as a Citigroup Tech Executive” By William Turton and Joshua Brustein — Bloomberg. The fascinating if not horrifying story of how a seemingly well-to-do, mild-mannered tech specialist became one of the key figures in the QAnon conspiracy.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by John Mounsey from Pixabay

Further Reading, Other Developments, and Coming Events (7 October)

Coming Events

  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, and the agency has released a tentative agenda:
    • Restoring Internet Freedom Order Remand – The Commission will consider an Order on Remand that would respond to the remand from the U.S. Court of Appeals for the D.C. Circuit and conclude that the Restoring Internet Freedom Order promotes public safety, facilitates broadband infrastructure deployment, and allows the Commission to continue to provide Lifeline support for broadband Internet access service. (WC Docket Nos. 17-108, 17-287, 11-42)
    • Establishing a 5G Fund for Rural America – The Commission will consider a Report and Order that would establish the 5G Fund for Rural America to ensure that all Americans have access to the next generation of wireless connectivity. (GN Docket No. 20-32)
    • Increasing Unlicensed Wireless Opportunities in TV White Spaces – The Commission will consider a Report and Order that would increase opportunities for unlicensed white space devices to operate on broadcast television channels 2-35 and expand wireless broadband connectivity in rural and underserved areas. (ET Docket No. 20-36)
    • Streamlining State and Local Approval of Certain Wireless Structure Modifications – The Commission will consider a Report and Order that would further accelerate the deployment of 5G by providing that modifications to existing towers involving limited ground excavation or deployment would be subject to streamlined state and local review pursuant to section 6409(a) of the Spectrum Act of 2012. (WT Docket No. 19-250; RM-11849)
    • Revitalizing AM Radio Service with All-Digital Broadcast Option – The Commission will consider a Report and Order that would authorize AM stations to transition to an all-digital signal on a voluntary basis and would also adopt technical specifications for such stations. (MB Docket Nos. 13-249, 19-311)
    • Expanding Audio Description of Video Content to More TV Markets – The Commission will consider a Report and Order that would expand audio description requirements to 40 additional television markets over the next four years in order to increase the amount of video programming that is accessible to blind and visually impaired Americans. (MB Docket No. 11-43)
    • Modernizing Unbundling and Resale Requirements – The Commission will consider a Report and Order to modernize the Commission’s unbundling and resale regulations, eliminating requirements where they stifle broadband deployment and the transition to next-generation networks, but preserving them where they are still necessary to promote robust intermodal competition. (WC Docket No. 19-308)
    • Enforcement Bureau Action – The Commission will consider an enforcement action.
  • On October 29, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • Consumer Reports released a study of the “California Consumer Privacy Act” (CCPA) (AB 375), specifically the Do-Not-Sell right California residents were given under the newly effective privacy statute. For those (like me) who expected a significant number of businesses to make it hard for people to exercise their rights, this study confirms the suspicion. Consumer Reports noted that more than 40% of data brokers had hard-to-find links or extra, complicated steps for people to tell them not to sell their personal information.
    • In “CCPA: Are Consumers’ Digital Rights Protected?,” Consumer Reports used this methodology:
    • Consumer Reports’ Digital Lab conducted a mixed methods study to examine whether the new CCPA is working for consumers. This study focused on the Do-Not-Sell (DNS) provision in the CCPA, which gives consumers the right to opt out of the sale of their personal information to third parties through a “clear and conspicuous link” on the company’s homepage. As part of the study, 543 California residents made DNS requests to 214 data brokers listed in the California Attorney General’s data broker registry. Participants reported their experiences via survey.
    • Consumer Reports found:
      • Consumers struggled to locate the required links to opt out of the sale of their information. For 42.5% of sites tested, at least one of three testers was unable to find a DNS link. All three testers failed to find a “Do Not Sell” link on 12.6% of sites, and in several other cases one or two of three testers were unable to locate a link.
        • Follow-up research focused on the sites in which all three testers did not find the link revealed that at least 24 companies on the data broker registry do not have the required DNS link on their homepage.
        • All three testers were unable to find the DNS links for five additional companies, though follow-up research revealed that the companies did have DNS links on their homepages. This also raises concerns about compliance, since companies are required to post the link in a “clear and conspicuous” manner.
      • Many data brokers’ opt-out processes are so onerous that they have substantially impaired consumers’ ability to opt out, highlighting serious flaws in the CCPA’s opt-out model.
        • Some DNS processes involved multiple, complicated steps to opt out, including downloading third-party software.
        • Some data brokers asked consumers to submit information or documents that they were reluctant to provide, such as a government ID number, a photo of their government ID, or a selfie.
        • Some data brokers confused consumers by requiring them to accept cookies just to access the site.
        • Consumers were often forced to wade through confusing and intimidating disclosures to opt out.
        • Some consumers spent an hour or more on a request.
        • At least 14% of the time, burdensome or broken DNS processes prevented consumers from exercising their rights under the CCPA.
      • At least one data broker used information provided for a DNS request to add the user to a marketing list, in violation of the CCPA.
      • At least one data broker required the user to set up an account to opt out, in violation of the CCPA.
      • Consumers often didn’t know if their opt-out request was successful. Neither the CCPA nor the CCPA regulations require companies to notify consumers when their request has been honored. About 46% of the time, consumers were left waiting or unsure about the status of their DNS request.
      • About 52% of the time, the tester was “somewhat dissatisfied” or “very dissatisfied” with the opt-out processes.
      • On the other hand, some consumers reported that it was quick and easy to opt out, showing that companies can make it easier for consumers to exercise their rights under the CCPA. About 47% of the time, the tester was “somewhat satisfied” or “very satisfied” with the opt-out process.
    • Consumer Reports recommended:
      • The Attorney General should vigorously enforce the CCPA to address noncompliance.
      • To make it easier to exercise privacy preferences, consumers should have access to browser privacy signals that allow them to opt out of all data sales in one step (see the sketch after this list).
      • The AG should more clearly prohibit dark patterns, which are user interfaces that subvert consumer intent, and design a uniform opt-out button. This will make it easier for consumers to locate the DNS link on individual sites.
      • The AG should require companies to notify consumers when their opt-out requests have been completed, so that consumers can know that their information is no longer being sold.
      • The legislature or AG should clarify the CCPA’s definitions of “sale” and “service provider” to more clearly cover data broker information sharing.
      • Privacy should be protected by default. Rather than place the burden on consumers to exercise privacy rights, the law should require reasonable data minimization, which limits the collection, sharing, retention, and use to what is reasonably necessary to operate the service.
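    • On the browser-signal recommendation above: one proposal along these lines is the Global Privacy Control (GPC), transmitted as an HTTP request header. The sketch below is this blog’s minimal illustration of how a site might honor such a signal; the “Sec-GPC” header name reflects the GPC proposal as of this writing, while the opt_out_of_sale() helper, the use of the client IP as an identifier, and the port are assumptions for illustration only, and whether any particular signal satisfies the CCPA is a legal question the code does not answer.
      # Illustrative sketch: honoring a browser privacy signal such as the
      # proposed Global Privacy Control (GPC). Not a compliance implementation.
      from http.server import BaseHTTPRequestHandler, HTTPServer

      def opt_out_of_sale(user_id: str) -> None:
          # Placeholder: persist the consumer's do-not-sell preference.
          print(f"recorded do-not-sell preference for {user_id}")

      class Handler(BaseHTTPRequestHandler):
          def do_GET(self):
              # The GPC proposal sends "Sec-GPC: 1" on requests from opted-out users.
              if self.headers.get("Sec-GPC") == "1":
                  opt_out_of_sale(self.client_address[0])  # keyed by client IP for illustration
              self.send_response(200)
              self.end_headers()
              self.wfile.write(b"ok")

      if __name__ == "__main__":
          HTTPServer(("localhost", 8000), Handler).serve_forever()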
  • Two agencies of the Department of the Treasury have issued guidance regarding the advisability and legality of making ransomware payments to individuals or entities under United States (U.S.) sanctions at a time when ransomware attacks are on the rise. It bears noting that a person or entity in the U.S. may face criminal and civil liability for paying a sanctioned ransomware entity, and civil penalties may apply even if the payer did not know the entity was sanctioned. One of the agencies reasoned that paying ransoms to such parties is contrary to U.S. national security policy and only encourages more ransomware attacks.
    • The Office of Foreign Assets Control (OFAC) issued an “advisory to highlight the sanctions risks associated with ransomware payments related to malicious cyber-enabled activities.” OFAC added:
      • Demand for ransomware payments has increased during the COVID-19 pandemic as cyber actors target online systems that U.S. persons rely on to continue conducting business. Companies that facilitate ransomware payments to cyber actors on behalf of victims, including financial institutions, cyber insurance firms, and companies involved in digital forensics and incident response, not only encourage future ransomware payment demands but also may risk violating OFAC regulations. This advisory describes these sanctions risks and provides information for contacting relevant U.S. government agencies, including OFAC, if there is a reason to believe the cyber actor demanding ransomware payment may be sanctioned or otherwise have a sanctions nexus.
    • Financial Crimes Enforcement Network (FinCEN) published its “advisory to alert financial institutions to predominant trends, typologies, and potential indicators of ransomware and associated money laundering activities.” The advisory provides information on:
      • (1) the role of financial intermediaries in the processing of ransomware payments;
      • (2) trends and typologies of ransomware and associated payments;
      • (3) financial red flag indicators of ransomware and associated payments; and
      • (4) reporting and sharing information related to ransomware attacks.
  • The Government Accountability Office (GAO) found that seven federal agencies have unevenly implemented the Office of Management and Budget’s (OMB) requirements for using the category management initiative when buying information technology (IT). This report follows in a long line of assessments of how the federal government is not spending its billions of dollars invested in IT to maximum effect. The category management initiative was launched two Administrations ago as a means of driving greater efficiency and savings for the nearly $350 billion the U.S. government spends annually on services and goods, much of which could be bought in large quantities instead of piecemeal by agency as is now the case.
    • The chair and ranking member of the House Oversight Committee and other Members had asked the GAO “to conduct a review of federal efforts to reduce IT contract duplication and/or waste” specifically “to determine the extent to which (1) selected agencies’ efforts to prevent, identify, and reduce duplicative or wasteful IT contracts were consistent with OMB’s category management initiative; and (2) these efforts were informed by spend analyses.” The GAO ended up looking at the Departments of Agriculture (USDA), Defense (DOD), Health and Human Services (HHS), Homeland Security (DHS), Justice (DOJ), State (State), and Veterans Affairs (VA).
    • The GAO found:
      • The seven agencies in our review varied in their implementation of OMB’s category management activities that contribute to identifying, preventing, and reducing duplicative IT contracts. Specifically, most of the agencies fully implemented the two activities to identify a Senior Accountable Official and develop processes and policies for implementing category management efforts, and to engage their workforces in category management training. However, only about half the agencies fully implemented the activities to reduce unaligned IT spending, including increasing the use of Best in Class contract solutions, and share prices paid, terms, and conditions for purchased IT goods and services. Agencies cited several reasons for their varied implementation, including that they were still working to define how to best integrate category management into the agency.
      • Most of the agencies used spend analyses to inform their efforts to identify and reduce duplication, and had developed and implemented strategies to address the identified duplication, which, agency officials reported, resulted in millions in actual and anticipated future savings. However, two of these agencies did not make regular use of the spend analyses.
      • Until agencies fully implement the activities in OMB’s category management initiative, and make greater use of spend analyses to inform their efforts to identify and reduce duplicative contracts, they will be at increased risk of wasteful spending. Further, agencies will miss opportunities to identify and realize savings of potentially hundreds of millions of dollars.
  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) provided “specific Chinese government and affiliated cyber threat actor tactics, techniques, and procedures (TTPs) and recommended mitigations to the cybersecurity community to assist in the protection of our Nation’s critical infrastructure.” CISA took this action “[i]n light of heightened tensions between the United States and China.”
    • CISA asserted:
      • According to open-source reporting, offensive cyber operations attributed to the Chinese government targeted, and continue to target, a variety of industries and organizations in the United States, including healthcare, financial services, defense industrial base, energy, government facilities, chemical, critical manufacturing (including automotive and aerospace), communications, IT, international trade, education, videogaming, faith-based organizations, and law firms.
    • CISA recommends organizations take the following actions:
      • Adopt a state of heightened awareness. Minimize gaps in personnel availability, consistently consume relevant threat intelligence, and update emergency call trees.
      • Increase organizational vigilance. Ensure security personnel monitor key internal security capabilities and can identify anomalous behavior. Flag any known Chinese indicators of compromise (IOCs) and TTPs for immediate response (a minimal matching sketch follows this list).
      • Confirm reporting processes. Ensure personnel know how and when to report an incident. The well-being of an organization’s workforce and cyber infrastructure depends on awareness of threat activity. Consider reporting incidents to CISA to help serve as part of CISA’s early warning system (see the Contact Information section below).
      • Exercise organizational incident response plans. Ensure personnel are familiar with the key steps they need to take during an incident. Do they have the accesses they need? Do they know the processes? Are various data sources logging as expected? Ensure personnel are positioned to act in a calm and unified manner.
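    • On the recommendation to flag known IOCs, the following is a minimal, illustrative sketch of matching log lines against an indicator list. It is this blog’s example rather than CISA tooling, and the file names (known_iocs.txt, proxy.log) are assumptions.
      # Illustrative sketch: flag log lines containing any known indicator of
      # compromise (IOC), such as a malicious domain or IP address.
      from pathlib import Path

      def load_iocs(path: str) -> set:
          # One indicator per line; ignore blanks and comment lines.
          return {line.strip().lower()
                  for line in Path(path).read_text().splitlines()
                  if line.strip() and not line.lstrip().startswith("#")}

      def flag_matches(log_path: str, iocs: set) -> list:
          # Return log lines that contain any known indicator.
          hits = []
          for line in Path(log_path).read_text(errors="ignore").splitlines():
              lowered = line.lower()
              if any(ioc in lowered for ioc in iocs):
                  hits.append(line)
          return hits

      if __name__ == "__main__":
          for hit in flag_matches("proxy.log", load_iocs("known_iocs.txt")):
              print("possible IOC match:", hit)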
  • The Supreme Court of the United States (SCOTUS) declined to hear a case on an Illinois revenge porn law that the Illinois Supreme Court upheld, finding it did not impinge on the defendant’s First Amendment rights. Bethany Austin was charged with a felony under an Illinois law barring the nonconsensual dissemination of private sexual pictures when she printed and distributed pictures of her ex-fiancé’s lover. Because SCOTUS declined to hear the case, the Illinois Supreme Court’s ruling stands, and the statute and others like it remain in effect.
    • The Illinois State Supreme Court explained the facts of the case:
      • Defendant (aka Bethany Austin) was engaged to be married to Matthew, after the two had dated for more than seven years. Defendant and Matthew lived together along with her three children. Defendant shared an iCloud account with Matthew, and all data sent to or from Matthew’s iPhone went to their shared iCloud account, which was connected to defendant’s iPad. As a result, all text messages sent by or to Matthew’s iPhone automatically were received on defendant’s iPad. Matthew was aware of this data sharing arrangement but took no action to disable it.
      • While Matthew and defendant were engaged and living together, text messages between Matthew and the victim, who was a neighbor, appeared on defendant’s iPad. Some of the text messages included nude photographs of the victim. Both Matthew and the victim were aware that defendant had received the pictures and text messages on her iPad. Three days later, Matthew and the victim again exchanged several text messages. The victim inquired, “Is this where you don’t want to message [because] of her?” Matthew responded, “no, I’m fine. [S]omeone wants to sit and just keep watching want [sic] I’m doing I really do not care. I don’t know why someone would wanna put themselves through that.” The victim replied by texting, “I don’t either. Soooooo baby ….”
      • Defendant and Matthew cancelled their wedding plans and subsequently broke up. Thereafter, Matthew began telling family and friends that their relationship had ended because defendant was crazy and no longer cooked or did household chores.
      • In response, defendant wrote a letter detailing her version of events. As support, she attached to the letter four of the naked pictures of the victim and copies of the text messages between the victim and Matthew. When Matthew’s cousin received the letter along with the text messages and pictures, he informed Matthew.
      • Upon learning of the letter and its enclosures, Matthew contacted the police. The victim was interviewed during the ensuing investigation and stated that the pictures were private and only intended for Matthew to see. The victim acknowledged that she was aware that Matthew had shared an iCloud account with defendant, but she thought it had been deactivated when she sent him the nude photographs.
    • In her petition for SCOTUS to hear her case, Austin asserted:
      • Petitioner Bethany Austin is being prosecuted under Illinois’ revenge porn law even though she is far from the type of person such laws were intended to punish. These laws proliferated rapidly in recent years because of certain reprehensible practices, such as ex-lovers widely posting images of their former mates to inflict pain for a bad breakup, malicious stalkers seeking to damage an innocent person’s reputation, or extortionists using intimate photos to collect ransom. Austin did none of those things, yet is facing felony charges because she tried to protect her reputation from her former fiancé’s lies about the reason their relationship ended.
      • The Illinois Supreme Court rejected Petitioner’s constitutional challenge to the state revenge porn law only because it ignored well-established First Amendment rules: It subjected the law only to intermediate, rather than strict scrutiny, because it incorrectly classified a statute that applies only to sexual images as content neutral; it applied diminished scrutiny because the speech at issue was deemed not to be a matter of public concern; and it held the law need not require a showing of malicious intent to justify criminal penalties, reasoning that such intent can be inferred from the mere fact that the specified images were shared. Each of these conclusions contradicts First Amendment principles recently articulated by this Court, and also is inconsistent with decisions of various state courts, including the Vermont Supreme Court.
    • Illinois argued in its brief to SCOTUS:
      • The nonconsensual dissemination of private sexual images exposes victims to a wide variety of serious harms that affect nearly every aspect of their lives. The physical, emotional, and economic harms associated with such conduct are well-documented: many victims are exposed to physical violence, stalking, and harassment; suffer from emotional and psychological harm; and face limited professional prospects and lowered income, among other repercussions. To address this growing problem and protect its residents from these harms, Illinois enacted section 11-23.5, 720 ILCS 5/11-23.5. Petitioner—who was charged with violating section 11-23.5 after she disseminated nude photos of her fiancé’s paramour without consent—asks this Court to review the Illinois Supreme Court’s decision rejecting her First Amendment challenge.
  • Six U.S. Agency for Global Media (USAGM) whistleblowers have filed a complaint concerning “retaliatory actions” with the Office of the Inspector General (OIG) at the Department of State and the Office of Special Counsel, arguing the newly installed head of USAGM punished them for making complaints through proper channels about his actions. This is the latest development at the agency. The United States Court of Appeals for the District of Columbia enjoined USAGM from “taking any action to remove or replace any officers or directors of the OTF,” pending the outcome of the suit, which is being expedited.
  • Additionally, USAGM CEO and Chair of the Board Michael Pack is being accused in two different letters of seeking to compromise the integrity and independence of two organizations he oversees. There have been media accounts of the Trump Administration’s remaking of USAGM in ways critics contend are threatening the mission and effectiveness of the Open Technology Fund (OTF), a U.S. government non-profit designed to help dissidents and endangered populations throughout the world. The head of the OTF has been removed, drawing the ire of Members of Congress, and other changes have been implemented that are counter to the organization’s mission. Likewise, there are allegations that politically motivated policy changes seek to remake the Voice of America (VOA) into a less independent entity.
  • The whistleblowers claimed in their complaint:
    • Each of the Complainants made protected disclosures –whether in the form of OIG complaints, communications with USAGM leadership, and/or communications with appropriate Congressional committees–regarding their concerns about official actions primarily taken by Michael Pack, who has been serving as the Chief Executive Officer for USAGM since June 4, 2020. The Complainants’ concerns involve allegations that Mr. Pack has engaged in conduct that violates federal law and/or USAGM regulations, and that constitutes an abuse of authority and gross mismanagement. Moreover, each of the Complainants was targeted for retaliatory action by Mr. Pack because of his belief that they held political views opposed to his, which is a violation of the Hatch Act.
    • Each of the Complainants was informed by letter, dated August 12, 2020, that their respective accesses to classified information had been suspended pending further investigation. Moreover, they were all concurrently placed on administrative leave. In each of the letters to the Complainants, USAGM claimed that the Complainants had been improperly granted security clearances, and that the Complainants failed to take remedial actions to address personnel and security concerns prior to permitting other USAGM employees to receive security clearances. In addition, many or all of the Complainants were earlier subject to retaliatory adverse personnel actions in the form of substantial limitations on their ability to carry out their work responsibilities (i.e., a significant change in duties and responsibilities), which limitations were imposed without following appropriate personnel procedures.

Further Reading

  • Big Tech Was Their Enemy, Until Partisanship Fractured the Battle Plans” By Cecilia Kang and David McCabe — The New York Times. There’s a bit of court intrigue in this piece about how Republicans declined to join Democrats in the antitrust report released this week, sapping its recommendations on how to address Big Tech of much of their power.
  • Facebook Keeps Data Secret, Letting Conservative Bias Claims Persist” By Bobby Allyn — NPR. Still no evidence of an anti-conservative bias at Facebook, according to experts, and the incomplete data available seem to indicate conservative content may be more favored by users than liberal content. Facebook does not release data that would settle the question, however, and there are all sorts of definitional questions that need answers before this issue could be definitively settled. One piece of food for thought: a significant percentage of link sharing may be driven by bots rather than humans.
  • News Corp. changes its tune on Big Tech” By Sara Fischer — Axios.  After beating the drum for years about the effect of Big Tech on journalism, the parent company of the Wall Street Journal and other media outlets is much more conciliatory these days. It may have something to do with all the cash the Googles and Facebooks of the world are proposing to throw at some media outlets for their content. It remains to be seen how this change in tune will affect the Australian Competition and Consumer Commission’s (ACCC) proposal to ensure that media companies are compensated for articles and content online platforms use. In late July the ACCC released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.”
  • Silicon Valley Opens Its Wallet for Joe Biden” By Daniel Oberhaus — WIRED. In what will undoubtedly be adduced as evidence that Silicon Valley is a liberal haven, this article reports that, according to federal election data for this election cycle, employees of Alphabet, Amazon, Apple, Facebook, Microsoft, and Oracle have contributed $4,787,752 to former Vice President Joe Biden and $239,527 to President Donald Trump. This covers only contributions of $200 and higher, so it is likely these data are not complete.
  • Facebook bans QAnon across its platforms” By Ben Collins and Brandy Zadrozny — NBC News. The social media giant has escalated and will remove all content related to the conspiracy group and theory known as QAnon. However, believers have been adaptable and agile in dropping certain terms and using methods to evade detection. Some experts say Facebook’s actions are too little, too late as these beliefs are widespread and are fueling a significant amount of violence and unrest in the real world.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by Katie White from Pixabay

Further Reading, Other Developments, and Coming Events (6 October)

Coming Events

  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The European Union Agency for Cybersecurity (ENISA), Europol’s European Cybercrime Centre (EC3) and the Computer Emergency Response Team for the EU Institutions, Bodies and Agencies (CERT-EU) will hold the 4th annual IoT Security Conference series “to raise awareness on the security challenges facing the Internet of Things (IoT) ecosystem across the European Union:”
    • Operational IoT – 7 October at 15:00 to 16:30 CET
    • Artificial Intelligence – 14 October at 15:00 to 16:30 CET
    • Supply Chain for IoT – 21 October at 15:00 to 16:30 CET
  • The Federal Communications Commission (FCC) will hold an open commission meeting on 27 October, but the agenda has not yet been announced.
  • On October 29, the Federal Trade Commission (FTC) will hold a workshop titled “Green Lights & Red Flags: FTC Rules of the Road for Business” that “will bring together Ohio business owners and marketing executives with national and state legal experts to provide practical insights to business and legal professionals about how established consumer protection principles apply in today’s fast-paced marketplace.”

Other Developments

  • The Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that a “malicious cyber actor” had penetrated an unnamed federal agency and “implanted sophisticated malware—including multi-stage malware that evaded the affected agency’s anti-malware protection—and gained persistent access through two reverse Socket Secure (SOCKS) proxies that exploited weaknesses in the agency’s firewall.” Since CISA said it became aware of the penetration via EINSTEIN, it is likely a civilian agency that was compromised. The actor used “compromised credentials” to get into the agency, but “CISA analysts were not able to determine how the cyber threat actor initially obtained the credentials.” It is not clear whether this was a nation-state actor or sophisticated hackers working independently.
    • It should be noted that last month, the Department of Veterans Affairs (VA) revealed it had been breached and “the personal information of approximately 46,000 Veterans” had been compromised. This announcement came the same day as an advisory issued by CISA that Chinese Ministry of State Security (MSS)-affiliated cyber threat actors have been targeting and possibly penetrating United States (U.S.) agency networks.
  • Senators Ron Wyden (D-OR) and Jeff Merkley (D-OR) and Representatives Earl Blumenauer (D-OR) and Suzanne Bonamici (D-OR) wrote the Department of Homeland Security (DHS) regarding a report in The Nation alleging the DHS and Department of Justice (DOJ) surveilled the phones of protestors in Portland, Oregon in possible violation of United States (U.S.) law. These Members asked DHS to respond to the following questions by October 9:
    • During a July 23, 2020, briefing for Senate intelligence committee staff, Brian Murphy, then the Acting Under Secretary for Intelligence and Analysis (I&A) stated that DHS I&A had neither collected nor exploited or analyzed information obtained from the devices or accounts of protesters or detainees. On July 31, 2020, Senator Wyden and six other Senators on the Senate Select Committee on Intelligence wrote to Mr. Murphy to confirm the statement he had made to committee staff. DHS has yet to respond to that letter. Please confirm whether or not Mr. Murphy’s statement during the July 23, 2020, briefing was accurate at the time, and if it is still accurate.
    • Has DHS, whether directly, or with the assistance of any other government agency, obtained or analyzed data collected through the surveillance of protesters’ phones, including tracking their locations or intercepting communications content or metadata? If yes, for each phone that was surveilled, did the government obtain prior authorization from a judge before conducting this surveillance?
    • Has DHS used commercial data sources, including open source intelligence products, to investigate, identify, or track protesters or conduct network analysis? If yes, please identify each commercial data source used by DHS, describe the information DHS obtained, how DHS used it, whether it was subsequently shared with any other government agency, and whether DHS sought and obtained authorization from a court before querying the data source.
  • The National Cybersecurity Center of Excellence (NCCoE) at the National Institute of Standards and Technology (NIST) has published for comment the “Securing Data Integrity Against Ransomware Attacks: Using the NIST Cybersecurity Framework and NIST Cybersecurity Practice Guides,” which “provides an overview of [NCCoE and NIST’s] Data Integrity projects…a high-level explanation of the architecture and capabilities, and how these projects can be brought together into one comprehensive data integrity solution…[that] can then be integrated into a larger security picture to address all of an organization’s data security needs.” Comments are due by 13 November. NCCoE and NIST explained:
    • This guide is designed for organizations that are not currently experiencing a loss of data integrity event (ransomware or otherwise). This document prepares an organization to adequately address future data integrity events. For information on dealing with a current attack, please explore guidance from organizations like the Federal Bureau of Investigation the United States Secret Service, or other pertinent groups or government bodies.
    • Successful ransomware impacts data’s integrity, yet ransomware is just one of many potential vectors through which an organization could suffer a loss of data integrity. Integrity is part of the CIA security triad which encompasses Confidentiality, Integrity, and Availability. As the CIA triad is applied to data security, data integrity is defined as “the property that data has not been changed, destroyed, or lost in an unauthorized or accidental manner.” An attack against data integrity can cause corruption, modification, and/or destruction of the data which ultimately results in a loss in trust in the data.
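    • To make the integrity property above concrete, the following is a minimal sketch, written by this blog and not taken from the NIST guide, of baseline hashing: record SHA-256 digests of files once, keep the baseline somewhere an attacker cannot alter it, and later compare current digests against it; ransomware that encrypts or modifies the files would then surface as mismatches. The directory and file names are assumptions.
      # Illustrative sketch: detect unauthorized changes to files by comparing
      # SHA-256 digests against a previously recorded baseline.
      import hashlib
      import json
      from pathlib import Path

      def sha256_of(path: Path) -> str:
          digest = hashlib.sha256()
          with path.open("rb") as f:
              for chunk in iter(lambda: f.read(65536), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      def record_baseline(directory: str, baseline_file: str) -> None:
          # Run once and store the result where attackers cannot modify it.
          baseline = {str(p): sha256_of(p)
                      for p in Path(directory).rglob("*") if p.is_file()}
          Path(baseline_file).write_text(json.dumps(baseline, indent=2))

      def verify(baseline_file: str) -> list:
          # Return paths that are missing or whose current digest differs.
          baseline = json.loads(Path(baseline_file).read_text())
          return [p for p, digest in baseline.items()
                  if not Path(p).exists() or sha256_of(Path(p)) != digest]

      if __name__ == "__main__":
          # record_baseline("important_data", "baseline.json")
          for changed in verify("baseline.json"):
              print("integrity check failed:", changed)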
  • As referenced in media reports, Graphika released a report on a newly discovered Russian disinformation effort that led to the creation and propagation of propaganda to appeal to the right wing in the United States (U.S.). In “Step into My Parler: Suspected Russian Operation Targeted Far-Right American Users on Platforms Including Gab and Parler, Resembled Recent IRA-Linked Operation that Targeted Progressives,” Graphika explained:
    • Russian operators ran a far-right website and social media accounts that targeted American users with pro-Trump and anti-Biden messaging, according to information from Reuters and Graphika’s investigation. This included the first known Russian activity on the platforms Gab and Parler. The operation appeared connected to a recent Russian website that targeted progressives in America with anti-Biden messaging.
    • The far-right “Newsroom for American and European Based Citizens,” naebc[.]com, pushed the opposite end of the political spectrum from the ostensibly progressive PeaceData site, but the two assets showed such a strong family resemblance that they appear to be two halves of the same operation. Both ran fake editorial personas whose profile pictures were generated by artificial intelligence; both claimed to be young news outlets based in Europe; both made language errors consistent with Russian speakers; both tried to hire freelance writers to provide their content; and, oddly enough, both had names that translate to obscenities in Russian.
    • Reuters first tipped Graphika off to the existence of the NAEBC website and its likely relationship to PeaceData. U.S. law enforcement originally alerted the social media platforms to the existence of PeaceData. On September 1, Facebook attributed PeaceData to “individuals associated with past activity by the Russian Internet Research Agency (IRA).” Twitter attributed it to Russian state actors. Social media platforms (Facebook, Twitter, LinkedIn) have taken similar action to stop activity related to NAEBC on their platforms. To date, Parler and Gab have not taken action on their platforms.
  • The Cybersecurity and Infrastructure Security Agency (CISA) and Multi-State Information Sharing and Analysis Center (MS-ISAC) issued a joint Ransomware Guide “meant to be a one-stop resource for stakeholders on how to be proactive and prevent these attacks from happening and also a detailed approach on how to respond to an attack and best resolve the cyber incident.” The organizations explained:
    • First, the guide focuses on best practices for ransomware prevention, detailing practices that organizations should continuously do to help manage the risk posed by ransomware and other cyber threats. It is intended to enable forward-leaning actions to successfully thwart and confront malicious cyber activity associated with ransomware. Some of the several CISA and MS-ISAC preventive services that are listed are Malicious Domain Blocking and Reporting, regional CISA Cybersecurity Advisors, Phishing Campaign Assessment, and MS-ISAC Security Primers on ransomware variants such as Ryuk.
    • The second part of this guide, response best practices and services, is divided up into three sections: (1) Detection and Analysis, (2) Containment and Eradication, and (3) Recovery and Post-Incident Activity. One of the unique aspects that will significantly help an organization’s leadership as well as IT professional with response is a comprehensive, step-by-step checklist. With many technical details on response actions and lists of CISA and MS-ISAC services available to the incident response team, this part of the guide can enable a methodical, measured and properly managed approach.  
  • The Government Accountability Office (GAO) released a guide on best practices for agile software development for federal agencies and contracting officers. The GAO stated:
    • The federal government spends at least $90 billion annually on information technology (IT) investments. In our January 2019 High Risk List report, GAO reported on 35 high risk areas, including the management of IT acquisitions and operations. While the executive branch has undertaken numerous initiatives to help agencies better manage their IT investments, these programs frequently fail or incur cost overruns and schedule slippages while contributing little to mission-related outcomes.
    • GAO has found that the Office of Management and Budget (OMB) continues to demonstrate its leadership commitment by issuing guidance for covered departments and agencies to implement statutory provisions commonly referred to as Federal Information Technology Acquisition Reform Act (FITARA). However, application of FITARA at federal agencies has not been fully implemented. For example, as we stated in the 2019 High Risk report, none of the 24 major federal agencies had IT management policies that fully addressed the roles of their Chief Information Officers (CIO) consistent with federal laws and guidance.
    • This Agile Guide is intended to address generally accepted best practices for Agile adoption, execution, and control. In this guide, we use the term best practice to be consistent with the use of the term in GAO’s series of best practices guides.

Further Reading

  • GOP lawmaker: Democrats’ tech proposals will include ‘non-starters for conservatives’” By Cristiano Lima — Politico. Representative Ken Buck (R-CO) is quoted extensively in this article about Republican concerns that the House Judiciary Committee’s antitrust recommendations may include policy changes he and other GOP Members of the committee will not be able to go along with. Things like banning mandatory arbitration clauses and changing evidentiary burdens (i.e. rolling back court decisions that have made antitrust actions harder to mount) are not acceptable to Republicans who apparently agree in the main that large technology companies do indeed have too much market power. Interestingly, Buck and others think the solution is more resources for the Department of Justice and the Federal Trade Commission (FTC), which is rapidly becoming a favored policy prescription for federal privacy legislation, too. However, even with a massive infusion of funding, the agencies could not act in all cases, and, in any event, would need to contend with a more conservative federal judiciary unlikely to change the antitrust precedents that have reduced the ability of these agencies to take action in the first place. Nonetheless, Republicans may join the report if the recommendations are changed. Of course, the top Republican on the committee, Representative Jim Jordan (R-OH), is allegedly pressuring Republicans not to join the report.
  • Why Is Amazon Tracking Opioid Use All Over the United States?” By Lauren Kaori Gurley — Motherboard. The online shopping giant is apparently tracking a range of data related to opioid usage for reasons that are not entirely clear. To be fair, the company tracks all sort of data.
  • As QAnon grew, Facebook and Twitter missed years of warning signs about the conspiracy theory’s violent nature” By Craig Timberg and Elizabeth Dwoskin — The Washington Post. This article traces the history of how Facebook and Twitter opted not to act against QAnon while other platforms like Reddit did, quite possibly contributing to the rise and reach of the conspiracy. However, they were afraid of angering some on the right wing given the overlap between some QAnon supporters and some Trump supporters.
  • Democratic Party leaders are “banging their head against the wall” after private meetings with Facebook on election misinformation” By Shirin Ghaffary — recode. Democratic officials who have been on calls with Facebook officials are saying the platform is not doing enough to combat disinformation and lies about the election. Facebook, of course, disputes this assessment. Democratic officials are especially concerned about the period between election day and when results are announced and think Facebook is not ready to handle the predicted wave of disinformation.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Bermix Studio on Unsplash

Further Reading, Other Developments, and Coming Events (13 August)

Here are Further Reading, Other Developments, and Coming Events:

Coming Events

  • On 18 August, the National Institute of Standards and Technology (NIST) will host the “Bias in AI Workshop, a virtual event to develop a shared understanding of bias in AI, what it is, and how to measure it.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) announced that its third annual National Cybersecurity Summit “will be held virtually as a series of webinars every Wednesday for four weeks beginning September 16 and ending October 7:”
    • September 16: Key Cyber Insights
    • September 23: Leading the Digital Transformation
    • September 30: Diversity in Cybersecurity
    • October 7: Defending our Democracy
    • One can register for the event here.
  • The Senate Judiciary Committee’s Antitrust, Competition Policy & Consumer Rights Subcommittee will hold a hearing on 15 September titled “Stacking the Tech: Has Google Harmed Competition in Online Advertising?” In their press release, Chair Mike Lee (R-UT) and Ranking Member Amy Klobuchar (D-MN) asserted:
    • Google is the dominant player in online advertising, a business that accounts for around 85% of its revenues and which allows it to monetize the data it collects through the products it offers for free. Recent consumer complaints and investigations by law enforcement have raised questions about whether Google has acquired or maintained its market power in online advertising in violation of the antitrust laws. News reports indicate this may also be the centerpiece of a forthcoming antitrust lawsuit from the U.S. Department of Justice. This hearing will examine these allegations and provide a forum to assess the most important antitrust investigation of the 21st century.

Other Developments

  • Senate Intelligence Committee Acting Chair Marco Rubio (R-FL) and Vice Chairman Mark Warner (D-VA) released a statement indicating the committee had voted to adopt the fifth and final volume of its investigation of the Russian Federation’s interference in the 2016 election. The committee had submitted the report to the Intelligence Community for vetting and has received the report back with edits and redactions. The report could be released sometime over the next few weeks. Rubio and Warner stated “the Senate Intelligence Committee voted to adopt the classified version of the final volume of the Committee’s bipartisan Russia investigation. In the coming days, the Committee will work to incorporate any additional views, as well as work with the Intelligence Community to formalize a properly redacted, declassified, publicly releasable version of the Volume 5 report.” The committee has previously released four volumes of the report.
  • The National Institute of Standards and Technology (NIST) is accepting comments until 11 September on draft Special Publication 800-53B, “Control Baselines for Information Systems and Organizations,” a guidance document that will serve a key role in the United States government’s efforts to secure and protect the networks and systems it operates and those run by federal contractors. NIST explained:
    • This publication establishes security and privacy control baselines for federal information systems and organizations and provides tailoring guidance for those baselines. The use of the security control baselines is mandatory, in accordance with OMB Circular A-130 [OMB A-130] and the provisions of the Federal Information Security Modernization Act [FISMA], which requires the implementation of a set of minimum controls to protect federal information and information systems. Whereas use of the privacy control baseline is not mandated by law or [OMB A-130], SP 800-53B, along with other supporting NIST publications, is designed to help organizations identify the security and privacy controls needed to manage risk and satisfy the security and privacy requirements in FISMA, the Privacy Act of 1974 [PRIVACT], selected OMB policies (e.g., [OMB A-130]), and designated Federal Information Processing Standards (FIPS), among others.
  • The United States Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) released an “Election Vulnerability Reporting Guide” (an illustrative sketch follows the list of steps below) to provide “election administrators with a step-by-step guide, list of resources, and a template for establishing a successful vulnerability disclosure program to address possible vulnerabilities in their state and local election systems…[and] [t]he six steps include:
    • Step 1: Identify Systems Where You Would Accept Security Testing, and those Off-Limits
    • Step 2: Draft an Easy-to-Read Vulnerability Disclosure Policy (See Appendix III)
    • Step 3: Establish a Way to Receive Reports/Conduct Follow-On Communication
    • Step 4: Assign Someone to Thank and Communicate with Researchers
    • Step 5: Assign Someone to Vet and Fix the Vulnerabilities
    • Step 6: Consider Sharing Information with Other Affected Parties
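    • As an illustration only (this is not from CISA’s guide), one common way to stand up the reporting channel contemplated by Steps 2 and 3 is to publish a machine-readable “security.txt” file (per RFC 9116) that tells researchers where to send reports and where the written disclosure policy lives. The short Python sketch below simply writes such a file; the email address, policy URL, and one-year expiry are hypothetical placeholders an election office would replace with its own.

      # Minimal sketch: generate an RFC 9116 security.txt for a vulnerability
      # disclosure program. All addresses and URLs below are hypothetical placeholders.
      from datetime import datetime, timedelta, timezone
      from pathlib import Path

      CONTACT = "mailto:security@example-county.gov"        # hypothetical reporting address (Step 3)
      POLICY = "https://example-county.gov/vuln-disclosure"  # hypothetical published policy (Step 2)
      # RFC 9116 requires an Expires field; here, one year out in ISO 8601 / UTC.
      EXPIRES = (datetime.now(timezone.utc) + timedelta(days=365)).strftime("%Y-%m-%dT%H:%M:%SZ")

      lines = [
          f"Contact: {CONTACT}",
          f"Policy: {POLICY}",
          f"Expires: {EXPIRES}",
          "Preferred-Languages: en",
      ]

      # The file is conventionally served at /.well-known/security.txt.
      out = Path(".well-known")
      out.mkdir(exist_ok=True)
      (out / "security.txt").write_text("\n".join(lines) + "\n", encoding="utf-8")

    • Publishing the file only covers intake; the disclosure policy itself (Step 2) and the people who vet, fix, and thank researchers (Steps 4 and 5) still have to be put in place separately.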
  • The United Kingdom’s Information Commissioner’s Office (ICO) has issued “Guidance on AI and data protection” that “clarifies how you can assess the risks to rights and freedoms that AI can pose from a data protection perspective; and the appropriate measures you can implement to mitigate them.” The ICO explained “[w]hile data protection and ‘AI ethics’ overlap, this guidance does not provide generic ethical or design principles for your use of AI.” The ICO stated “[i]t corresponds to data protection principles, and is structured as follows:
    • part one addresses accountability and governance in AI, including data protection impact assessments (DPIAs);
    • part two covers fair, lawful and transparent processing, including lawful bases, assessing and improving AI system performance, and mitigating potential discrimination;
    • part three addresses data minimisation and security; and
    • part four covers compliance with individual rights, including rights related to automated decision-making.
  • 20 state attorneys general wrote Facebook Chief Executive Officer Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg “to request that you take additional steps to prevent Facebook from being used to spread disinformation and hate and to facilitate discrimination.” They also asked “that you take more steps to provide redress for users who fall victim to intimidation and harassment, including violence and digital abuse.” The attorneys general said that “[b]ased on our collective experience, we believe that Facebook should take additional actions including the following steps—many of which are highlighted in Facebook’s recent Civil Rights Audit—to strengthen its commitment to civil rights and fighting disinformation and discrimination:
    • Aggressively enforce Facebook policies against hate speech and organized hate organizations: Although Facebook has developed policies against hate speech and organizations that peddle it, we remain concerned that Facebook’s policies on Dangerous Individuals and Organizations, including but not limited to its policies on white nationalist and white supremacist content, are not enforced quickly and comprehensively enough. Content that violates Facebook’s own policies too often escapes removal just because it comes as coded language, rather than specific magic words. And even where Facebook takes steps to address a particular violation, it often fails to proactively address the follow-on actions by replacement or splinter groups that quickly emerge.
    • Allow public, third-party audits of hate content and enforcement: To gauge the ongoing progress of Facebook’s enforcement efforts, independent experts should be permitted access to the data necessary to conduct regular, transparent third-party audits of hate and hate-related misinformation on the platform, including any information made available to the Global Oversight Board. As part of this effort, Facebook should capture data on the prevalence of different forms of hate content on the platform, whether or not covered by Facebook’s own community standards, thus allowing the public to determine whether enforcement of anti-hate policies differs based on the type of hate content at issue.
    • Commit to an ongoing, independent analysis of Facebook’s content population scheme and the prompt development of best practices guidance: By funneling users toward particular types of content, Facebook’s content population scheme, including its algorithms, can push users into extremist online communities that feature divisive and inflammatory messages, often directed at particular groups. Although Facebook has conducted research and considered programs to reduce this risk, there is still no mandatory guidance for coders and other teams involved in content population. Facebook should commit to an ongoing, independent analysis of its content population scheme, including its algorithms, and also continuously implement mandatory protocols as best practices are identified to curb bias and prevent recommendations of hate content and groups.
    • Expand policies limiting inflammatory advertisements that vilify minority groups: Although Facebook currently prohibits ads that claim that certain people, because of their membership in a protected group, pose a threat to the physical safety of communities or the nation, its policies still allow attacks that characterize such groups as threats to national culture or values. The current prohibition should be expanded to include such ads.
  • New Zealand’s Ministry of Statistics “launched the Algorithm Charter for Aotearoa New Zealand” that “signals that [the nation’s agencies] are committed to being consistent, transparent and accountable in their use of algorithms.”
    • The Ministry explained “[t]he Algorithm Charter is part of a wider ecosystem and works together with existing tools, networks and research, including:
      • Principles for the Safe and Effective Use of Data and Analytics (Privacy Commissioner and Government Chief Data Steward, 2018)
      • Government Use of Artificial Intelligence in New Zealand (New Zealand Law Foundation and Otago University, 2019)
      • Trustworthy AI in Aotearoa – AI Principles (AI Forum New Zealand, 2020)
      • Open Government Partnership, an international agreement to increase transparency.
      • Data Protection and Use Policy (Social Wellbeing Agency, 2020)
      • Privacy, Human Rights and Ethics Framework (Ministry of Social Development).
  • The European Union (EU) imposed its first cyber sanctions under its Framework for a Joint EU Diplomatic Response to Malicious Cyber Activities (aka the cyber diplomacy toolbox) against six hackers and three entities from the Russian Federation, the People’s Republic of China (PRC) and the Democratic People’s Republic of Korea for attacks against the Organisation for the Prohibition of Chemical Weapons (OPCW) in the Netherlands, the malware attacks known as NotPetya and WannaCry, and Operation Cloud Hopper. The EU’s cyber sanctions follow sanctions the United States has placed on a number of people and entities from the same nations and also indictments the U.S. Department of Justice has announced over the years. The sanctions are part of the effort to levy costs on nations and actors that conduct cyber attacks. The EU explained:
    • The attempted cyber-attack was aimed at hacking into the Wi-Fi network of the OPCW, which, if successful, would have compromised the security of the network and the OPCW’s ongoing investigatory work. The Netherlands Defence Intelligence and Security Service (DISS) (Militaire Inlichtingen- en Veiligheidsdienst – MIVD) disrupted the attempted cyber-attack, thereby preventing serious damage to the OPCW.
    • “WannaCry” disrupted information systems around the world by targeting information systems with ransomware and blocking access to data. It affected information systems of companies in the Union, including information systems relating to services necessary for the maintenance of essential services and economic activities within Member States.
    • “NotPetya” or “EternalPetya” rendered data inaccessible in a number of companies in the Union, wider Europe and worldwide, by targeting computers with ransomware and blocking access to data, resulting amongst others in significant economic loss. The cyber-attack on a Ukrainian power grid resulted in parts of it being switched off during winter.
    • “Operation Cloud Hopper” has targeted information systems of multinational companies in six continents, including companies located in the Union, and gained unauthorised access to commercially sensitive data, resulting in significant economic loss.
  • The United States’ Federal Communications Commission (FCC) is asking for comments on the Department of Commerce’s National Telecommunications and Information Administration’s (NTIA) petition asking the agency to start a rulemaking to clarify alleged ambiguities in 47 USC 230 regarding the limits of the liability shield for the content others post online versus the liability protection for “good faith” moderation by the platform itself. The NTIA was acting per direction in an executive order allegedly aiming to correct online censorship. Executive Order 13925, “Preventing Online Censorship” was issued in late May after Twitter factchecked two of President Donald Trump’s Tweets regarding false claims made about mail voting in California in response to the COVID-19 pandemic. Comments are due by 2 September.
  • The Australian Competition & Consumer Commission (ACCC) released for public consultation a draft of “a mandatory code of conduct to address bargaining power imbalances between Australian news media businesses and digital platforms, specifically Google and Facebook.” The government in Canberra had asked the ACCC to draft this code earlier this year after negotiations on a voluntary code broke down.
    • The ACCC explained
      • The code would commence following the introduction and passage of relevant legislation in the Australian Parliament. The ACCC released an exposure draft of this legislation on 31 July 2020, with consultation on the draft due to conclude on 28 August 2020. Final legislation is expected to be introduced to Parliament shortly after conclusion of this consultation process.
    • This is not the ACCC’s first interaction with the companies. Late last year, the ACCC announced a legal action against Google “alleging they engaged in misleading conduct and made false or misleading representations to consumers about the personal location data Google collects, keeps and uses” according to the agency’s press release. In its initial filing, the ACCC claims that Google misled and deceived the public in contravention of the Australian Consumer Law, and that Android users were harmed because those who switched off Location Services were unaware that their location information was still being collected and used by Google, as it was not readily apparent that Web & App Activity also needed to be switched off.
    • A year ago, the ACCC released its final report in its “Digital Platforms Inquiry” that “proposes specific recommendations aimed at addressing some of the actual and potential negative impacts of digital platforms in the media and advertising markets, and also more broadly on consumers.”
  • The United States’ Department of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) “released core guidance documentation for the Trusted Internet Connections (TIC) program, developed to assist agencies in protecting modern information technology architectures and services.” CISA explained “In accordance with the Office of Management and Budget (OMB) Memorandum (M) 19-26: Update to the TIC Initiative, TIC 3.0 expands on the original initiative to drive security standards and leverage advances in technology to secure a wide spectrum of agency network architectures.” Specifically, CISA released three core guidance documents:
    • Program Guidebook (Volume 1) – Outlines the modernized TIC program and includes its historical context
    • Reference Architecture (Volume 2) – Defines the concepts of the program to guide and constrain the diverse implementations of the security capabilities
    • Security Capabilities Catalog (Volume 3) – Indexes the security capabilities relevant to TIC
  • Senators Ron Wyden (D-OR), Bill Cassidy (R-LA) and ten other Members wrote the Federal Trade Commission (FTC) urging the agency “to investigate widespread privacy violations by companies in the advertising technology (adtech) industry that are selling private data about millions of Americans, collected without their knowledge or consent from their phones, computers, and smart TVs.” They asked the FTC “to use its authority to conduct broad industry probes under Section 6(b) of the FTC Act to determine whether adtech companies and their data broker partners have violated federal laws prohibiting unfair and deceptive business practices.” They argued “[t]he FTC should not proceed with its review of the Children’s Online Privacy Protection Act (COPPA) Rule before it has completed this investigation.”
  •  “100 U.S. women lawmakers and current and former legislators from around the world,” including Speaker of the House Nancy Pelosi (D-CA), sent a letter to Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg urging the company “to take decisive action to protect women from rampant and increasing online attacks on their platform that have caused many women to avoid or abandon careers in politics and public service.” They noted “[j]ust a few days ago, a manipulated and widely shared video that depicted Speaker Pelosi slurring her speech was once again circulating on major social media platforms, gaining countless views before TikTok, Twitter, and YouTube all removed the footage…[and] [t]he video remains on Facebook and is labeled “partly false,” continuing to gain millions of views.” The current and former legislators “called on Facebook to enforce existing rules, including:
    • Quick removal of posts that threaten candidates with physical violence, sexual violence or death, and that glorify, incite or praise violence against women; disable the relevant accounts, and refer offenders to law enforcement.
    • Eliminate malicious hate speech targeting women, including violent, objectifying or dehumanizing speech, statements of inferiority, and derogatory sexual terms;
    • Remove accounts that repeatedly violate terms of service by threatening, harassing or doxing or that use false identities to attack women leaders and candidates; and
    • Remove manipulated images or videos misrepresenting women public figures.
  • The United States’ Departments of Commerce and Homeland Security released an update “highlighting more than 50 activities led by industry and government that demonstrate progress in the drive to counter botnet threats.” In May 2018, the agencies submitted “A Report to the President on Enhancing the Resilience of the Internet and Communications Ecosystem Against Botnets and Other Automated, Distributed Threats” that identified a number of steps and prompted a follow-on “A Road Map Toward Resilience Against Botnets” released in November 2018.
  • United States (U.S.) Secretary of Commerce Wilbur Ross and European Commissioner for Justice Didier Reynders released a joint statement explaining that “[t]he U.S. Department of Commerce and the European Commission have initiated discussions to evaluate the potential for an enhanced EU-U.S. Privacy Shield framework to comply with the July 16 judgment of the Court of Justice of the European Union in the Schrems II case.”
    • Maximillian Schrems filed a complaint against Facebook with Ireland’s Data Protection Commission (DPC) in 2013, alleging that the company’s transfer of his personal data violated his rights under European Union law because of the mass U.S. surveillance revealed by former National Security Agency (NSA) contractor Edward Snowden. Ultimately, this case resulted in a 2015 Court of Justice of the European Union (CJEU) ruling that invalidated the Safe Harbor agreement under which the personal data of EU residents was transferred to the US by commercial concerns. The EU and US executed a follow-on agreement, the EU-U.S. Privacy Shield, that was designed to address some of the problems the CJEU turned up, and the U.S. passed a law, the “Judicial Redress Act of 2015” (P.L. 114-126), to provide EU citizens a way to exercise their EU rights in US courts via the “Privacy Act of 1974.”
    • However, Schrems continued and soon sought to challenge the legality of the European Commission’s signing off on the Privacy Shield agreement, the adequacy decision issued in 2016, and also the use of standard contractual clauses (SCC) by companies for the transfer of personal data to the US. The CJEU struck down the adequacy decision, throwing into doubt many entities’ transfers out of the EU into the U.S. but upheld SCCs in a way that suggested EU data protection authorities (DPA) may need to review all such agreements to ensure they comply with EU law.
  • The European Commission (EC) announced “an in-depth investigation to assess the proposed acquisition of Fitbit by Google under the EU Merger Regulation.” The EC voiced its concern “that the proposed transaction would further entrench Google’s market position in the online advertising markets by increasing the already vast amount of data that Google could use for personalisation of the ads it serves and displays.” The EC detailed its “preliminary competition concerns:
    • Following its first phase investigation, the Commission has concerns about the impact of the transaction on the supply of online search and display advertising services (the sale of advertising space on, respectively, the result page of an internet search engine or other internet pages), as well as on the supply of ”ad tech” services (analytics and digital tools used to facilitate the programmatic sale and purchase of digital advertising). By acquiring Fitbit, Google would acquire (i) the database maintained by Fitbit about its users’ health and fitness; and (ii) the technology to develop a database similar to Fitbit’s one.
    • The data collected via wrist-worn wearable devices appears, at this stage of the Commission’s review of the transaction, to be an important advantage in the online advertising markets. By increasing the data advantage of Google in the personalisation of the ads it serves via its search engine and displays on other internet pages, it would be more difficult for rivals to match Google’s online advertising services. Thus, the transaction would raise barriers to entry and expansion for Google’s competitors for these services, to the ultimate detriment of advertisers and publishers that would face higher prices and have less choice.
    • At this stage of the investigation, the Commission considers that Google:
      • is dominant in the supply of online search advertising services in the EEA countries (with the exception of Portugal for which market shares are not available);
      • holds a strong market position in the supply of online display advertising services at least in Austria, Belgium, Bulgaria, Croatia, Denmark, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Norway, Poland, Romania, Slovakia, Slovenia, Spain, Sweden and the United Kingdom, in particular in relation to off-social networks display ads;
      • holds a strong market position in the supply of ad tech services in the EEA.
    • The Commission will now carry out an in-depth investigation into the effects of the transaction to determine whether its initial competition concerns regarding the online advertising markets are confirmed.
    • In addition, the Commission will also further examine:
      • the effects of the combination of Fitbit’s and Google’s databases and capabilities in the digital healthcare sector, which is still at a nascent stage in Europe; and
      • whether Google would have the ability and incentive to degrade the interoperability of rivals’ wearables with Google’s Android operating system for smartphones once it owns Fitbit.
    • In February, after the deal had been announced, the European Data Protection Board (EDPB) made clear its position that Google and Fitbit will need to scrupulously observe the General Data Protection Regulation’s privacy and data security requirements if the body is to sign off on the proposed $2.2 billion acquisition. Moreover, at that time Google had not informed European Union (EU) regulators of the proposed deal. The deal comes at a time when both EU and U.S. regulators are already investigating Google for alleged antitrust and anticompetitive practices, and the EDPB’s opinion could carry weight in this process.
  • The United States’ (U.S.) Department of Homeland Security released a Privacy Impact Assessment for the U.S. Border Patrol (USBP) Digital Forensics Programs that details how it may conduct searches of electronic devices at the U.S. border and ports of entry. DHS explained
    • As part of USBP’s law enforcement duties, USBP may search and extract information from electronic devices, including: laptop computers; thumb drives; compact disks; digital versatile disks (DVDs); mobile phones; subscriber identity module (SIM) cards; digital cameras; vehicles; and other devices capable of storing electronic information.
    • Last year, a U.S. District Court held that U.S. Customs and Border Protection (CBP) and U.S. Immigration and Customs Enforcement’s (ICE) current practices for searches of smartphones and computers at the U.S. border are unconstitutional and the agencies must have reasonable suspicion before conducting such a search. However, the Court declined the plaintiffs’ request that the information taken off of their devices be expunged by the agencies. This ruling follows a Department of Homeland Security Office of the Inspector General (OIG) report that found CBP “did not always conduct searches of electronic devices at U.S. ports of entry according to its Standard Operating Procedures” and asserted that “[t]hese deficiencies in supervision, guidance, and equipment management, combined with a lack of performance measures, limit [CBP’s] ability to detect and deter illegal activities related to terrorism; national security; human, drug, and bulk cash smuggling; and child pornography.”
    • In terms of a legal backdrop, the United States Supreme Court has found that searches and seizures of electronic devices at borders and airports are subject to lesser legal standards than those conducted elsewhere in the U.S. under most circumstances. Generally, the government’s interest in securing the border against the flow of contraband and people not allowed to enter gives agents considerable leeway relative to the warrant requirements that govern many other types of searches. However, in recent years two federal appeals courts (the Fourth and Ninth Circuits) have held that searches of electronic devices require suspicion on the part of government agents while another appeals court (the Eleventh Circuit) held differently. Consequently, there is not a uniform legal standard for these searches.
  • The Inter-American Development Bank (IDB) and the Organization of American States (OAS) released their second assessment of cybersecurity across Latin America and the Caribbean that used the Cybersecurity Capacity Maturity Model for Nations (CMM) developed at the University of Oxford’s Global Cyber Security Capacity Centre (GSCC). The IDB and OAS explained:
    • When the first edition of the report “Cybersecurity: Are We Ready in Latin America and the Caribbean?” was released in March 2016, the IDB and the OAS aimed to provide the countries of Latin America and the Caribbean (LAC) not only with a picture of the state of cybersecurity but also guidance about the next steps that should be pursued to strengthen national cybersecurity capacities. This was the first study of its kind, presenting the state of cybersecurity with a comprehensive vision and covering all LAC countries.
    • The great challenges of cybersecurity, like those of the internet itself, are of a global nature. Therefore, it is undeniable that the countries of LAC must continue to foster greater cooperation among themselves, while involving all relevant actors, as well as establishing a mechanism for monitoring, analysis, and impact assessment related to cybersecurity both nationally and regionally. More data in relation to cybersecurity would allow for the introduction of a culture of cyberrisk management that needs to be extended both in the public and private sectors. Countries must be prepared to adapt quickly to the dynamic environment around us and make decisions based on a constantly changing threat landscape. Our member states may manage these risks by understanding the impact on and the likelihood of cyberthreats to their citizens, organizations, and national critical infrastructure. Moving to the next level of maturity will require a comprehensive and sustainable cybersecurity policy, supported by the country’s political agenda, with allocation of  financial resources and qualified human capital to carry it out.
    • The COVID-19 pandemic will pass, but events that will require intensive use of digital technologies so that the world can carry on will continue happening. The challenge of protecting our digital space will, therefore, continue to grow. It is the hope of the IDB and the OAS that this edition of the report will help LAC countries to have a better understanding of their current state of cybersecurity capacity and be useful in the design of the policy initiatives that will lead them to increase their level of cyberresilience.
  • The European Data Protection Supervisor (EDPS) issued an opinion on “the European Commission’s action plan for a comprehensive Union policy on preventing money laundering and terrorism financing (C(2020)2800 final), published on 7 May 2020.” The EDPS asserted:
    • While the EDPS acknowledges the importance of the fight against money laundering and terrorism financing as an objective of general interest, we call for the legislation to strike a balance between the interference with the fundamental rights of privacy and personal data protection and the measures that are necessary to effectively achieve the general interest goals on anti-money laundering and countering the financing of terrorism (AML/CFT) (the principle of proportionality).
    • The EDPS recommends that the Commission monitors the effective implementation of the existing AML/CFT framework while ensuring that the GDPR and the data protection framework are respected and complied with. This is particularly relevant for the works on the interconnection of central bank account mechanisms and beneficial ownership registers that should be largely inspired by the principles of data minimisation, accuracy and privacy-by-design and by default.

Further Reading

  • China already has your data. Trump’s TikTok and WeChat bans can’t stop that.” By Aynne Kokas – The Washington Post. This article persuasively makes the case that even if a ban on TikTok and WeChat were to work, and there are substantive questions as to how a ban would work given how widely the former has been downloaded, the People’s Republic of China (PRC) is almost certainly acquiring massive reams of data on Americans through a variety of apps, platforms, and games. For example, Tencent, owner of WeChat, has a 40% stake in Epic Games, maker of Fortnite, a massively popular multiplayer game (if you have never heard of it, ask one of the children in your family). Moreover, a recent change to PRC law mandates that companies operating in the PRC must share their databases for cybersecurity reviews, which may be an opportunity, aside from hacking and exfiltrating data from United States entities, to access data on Americans. In summation, if the Trump Administration is serious about stopping the flow of data from the U.S. to the PRC, these executive orders will do very little.
  • Big Tech Makes Inroads With the Biden Campaign” by David McCabe and Kenneth P. Vogel – The New York Times. Most likely long before former Vice President Joe Biden clinched the Democratic nomination, advisers volunteered to help plot out his policy positions, a process that intensified this year. Of course, this includes technology policy, and many of those volunteering for the campaign’s Innovation Policy Committee have worked or are working for large technology companies directly or as consultants or lobbyists. This piece details some of these people and their relationships and how the Biden campaign is managing possible conflicts of interest. Naturally, those on the left wing of the Democratic Party calling for tighter antitrust, competition, and privacy regulation are concerned that Biden might be pulled away from these positions despite his public statements arguing that the United States government needs to get tougher with some practices.
  • A Bible Burning, a Russian News Agency and a Story Too Good to Check Out” By Matthew Rosenberg and Julian E. Barnes – The New York Times. The Russian Federation seems to be using a new tactic with some success for sowing discord in the United States that is the information equivalent of throwing fuel onto a fire. In this case, a Russian outlet manufactured a fake story amplifying an actual event, and the story went viral after some prominent Republicans seized on it, in part because it fit their preferred world view of protestors. We will likely see more of this, and it is not confined to fake stories intended to appeal to the right. The same is happening with content meant for the left wing in the United States.
  • Facebook cracks down on political content disguised as local news” by Sara Fischer – Axios. As part of its continuing effort to crack down on violations of its policies, Facebook will no longer allow groups with a political viewpoint to masquerade as news. The company and outside experts have identified a range of instances where groups propagating a viewpoint, as opposed to reporting, have used a Facebook exemption by pretending to be local news outlets.
  • QAnon groups have millions of members on Facebook, documents show” By Ari Sen and Brandy Zadrozny – NBC News. It appears as if some Facebook employees are leaking the results of an internal investigation that identified more than 1 million users who are part of QAnon groups. Most likely these employees want the company to take a stronger stance on the conspiracy group QAnon like the company has with COVID-19 lies and misinformation.
  • And, since Senator Kamala Harris (D-CA) was named former Vice President Joe Biden’s (D-DE) vice presidential pick, this article has become even more relevant than when I highlighted it in late July: “New Emails Reveal Warm Relationship Between Kamala Harris And Big Tech” – HuffPost. Obtained via a Freedom of Information request, new emails from Senator Kamala Harris’ (D-CA) tenure as her state’s attorney general suggest she was willing to overlook the role Facebook, Google, and others played and still play in one of her signature issues: revenge porn. This article makes the case that Harris came down hard on a scammer running a revenge porn site but did not press the tech giants with any vigor to take down such material from their platforms. Consequently, the case is made that if Harris were former Vice President Joe Biden’s vice presidential candidate, this would signal a go-easy approach on large companies even though many Democrats have been calling to break up these companies and vigorously enforce antitrust laws. Harris has largely not engaged on tech issues during her tenure in the Senate. To be fair, many of these companies are headquartered in California and pump billions of dollars into the state’s economy annually, putting Harris in a tricky position politically. Of course, such pieces should be taken with a grain of salt since they may have been suggested or planted by one of Harris’ rivals for the vice presidential nomination or someone looking to settle a score.
  • Unwanted Truths: Inside Trump’s Battles With U.S. Intelligence Agencies” by Robert Draper – The New York Times. A deeply sourced article on the outright antipathy between President Donald Trump and Intelligence Community officials, particularly over the issue of how deeply Russia interfered in the election in 2016. A number of former officials have been fired or forced out because they refused to knuckle under to the White House’s desire to soften or massage conclusions of Russia’s past and current actions to undermine the 2020 election in order to favor Trump.
  • Huawei says it’s running out of chips for its smartphones because of US sanctions” By Kim Lyons – The Verge and “Huawei: Smartphone chips running out under US sanctions” by Joe McDonald – The Associated Press. United States (U.S.) sanctions have started biting the Chinese technology company Huawei, which announced it will likely run out of processor chips for its smartphones. U.S. sanctions bar any company from selling high technology items like processors to Huawei, and this capability is not independently available in the People’s Republic of China (PRC) at present.
  • Targeting WeChat, Trump Takes Aim at China’s Bridge to the World” By Paul Mozur and Raymond Zhong – The New York Times. This piece explains WeChat, the app the Trump Administration is trying to ban in the United States (U.S.) without any warning. It is like a combination of Facebook, WhatsApp, a news app, and a payment platform, and it is used by more than 1.2 billion people.
  • This Tool Could Protect Your Photos From Facial Recognition” By Kashmir Hill – The New York Times. Researchers at the University of Chicago have found a method of subtly altering photos of people that appears to foil most facial recognition technologies. However, a number of experts interviewed said it is too late to stop companies like Clearview AI.
  • I Tried to Live Without the Tech Giants. It Was Impossible.” By Kashmir Hill – The New York Times. This New York Times reporter tried living without the products of large technology companies, which involved some fairly obvious challenges and some that were not so obvious. Of course, it was hard for her to skip Facebook, Instagram, and the like, but cutting out Google and Amazon proved hardest and basically impossible because of the latter’s cloud presence and the former’s web presence. The fact that some of the companies cannot be avoided if one wants to be online likely lends weight to those making the case these companies are anti-competitive.
  • To Head Off Regulators, Google Makes Certain Words Taboo” by Adrianne Jeffries – The Markup. Apparently, in what is a standard practice at large companies, employees at Google were coached to avoid using certain terms or phrases that antitrust regulators would take notice of such as: “market,” “barriers to entry,” and “network effects.” The Markup obtained a 16 August 2019 document titled “Five Rules of Thumb For Written Communications” that starts by asserting “[w]ords matter…[e]specially in antitrust laws” and goes on to advise Google’s employees:
    • We’re out to help users, not hurt competitors.
    • Our users should always be free to switch, and we don’t lock anyone in.
    • We’ve got lots of competitors, so don’t assume we control or dominate any market.
    • Don’t try and define a market or estimate our market share.
    • Assume every document you generate, including email, will be seen by regulators.
  • Facebook Fired An Employee Who Collected Evidence Of Right-Wing Pages Getting Preferential Treatment” By Craig Silverman and Ryan Mac – BuzzFeed News. A Facebook engineer was fired after gathering evidence in an internal communications system that the social media platform is more willing to change false and other negative ratings on claims made by conservative outlets and personalities than on claims from any other viewpoint. If this is true, it would be the opposite of the narrative spun by the Trump Administration and many Republicans in Congress. Moreover, Facebook’s incentives would seem to align with giving conservatives preferential treatment: many of these websites advertise on Facebook; the company probably does not want to get crosswise with the Administration; sensational posts and content drive engagement, which increases user numbers and allows for higher ad rates; and it wants to appear fair and impartial.
  • How Pro-Trump Forces Work the Refs in Silicon Valley” By Ben Smith – The New York Times. This piece traces Republicans’ nearly four-decade-old effort to sway mainstream media, and now Silicon Valley, to their viewpoint.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2020. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo credit: Gerd Altmann on Pixabay