Further Reading, Other Developments, and Coming Events (18 February 2021)

Further Reading

  • “Google, Microsoft, Qualcomm Protest Nvidia’s Acquisition of Arm Ltd.” By David McLaughlin, Ian King, and Dina Bass — Bloomberg. Major United States (U.S.) tech multinationals are telling the U.S. government that Nvidia’s proposed purchase of Arm will hurt competition in the semiconductor market, an interesting position for an industry renowned for being acquisition hungry. The British firm Arm is a key player in the semiconductor business that deals with all companies, and the fear articulated by firms like Qualcomm, Microsoft, and Google is that Nvidia will cut supply and increase prices once it controls Arm. According to one report, Arm has designed something like 95% of the chip architecture for the world’s smartphones and 95% of the chips made in the People’s Republic of China (PRC). The deal has to clear U.S., British, EU, and PRC regulators. In the U.S., the Federal Trade Commission (FTC) has reportedly made very large document requests, which indicates its interest in digging into the deal and suggests the possibility it may come out against the acquisition. The FTC may also be waiting to read the mood in Washington, as there is renewed, bipartisan concern about antitrust and competition and about the semiconductor industry. Finally, acting FTC Chair Rebecca Kelly Slaughter has come out against a lax approach to so-called vertical mergers such as the proposed Nvidia-Arm deal, which may well become the ultimate position of a Democratic FTC.
  • “Are Private Messaging Apps the Next Misinformation Hot Spot?” By Brian X. Chen and Kevin Roose — The New York Times. The conclusion these two tech writers reach is that, on balance, private messaging apps like Signal and Telegram are better for society than not. Moreover, they reason it is better to have extremists migrate from platforms like Facebook to ones where it is much harder to spread their views and proselytize.
  • “Amazon Has Transformed the Geography of Wealth and Power” By Vauhini Vara — The Atlantic. A harrowing view of the rise of Amazon cast against the decline of the middle class and the middle of the United States (U.S.). Correlation is not causation, of course, but the company has sped the decline of a number of industries and arguably a number of cities.
  • “Zuckerberg responds to Apple’s privacy policies: ‘We need to inflict pain’” By Samuel Axon — Ars Technica. Relations between the companies have worsened as their CEOs have taken personal shots at each other in public and private, culminating in Apple’s change to its iOS requiring users to agree to being tracked by apps across the internet, which is Facebook’s bread and butter. Expect things to get worse, as both Tim Cook and Mark Zuckerberg think augmented reality or mixed reality is the next major frontier in tech, suggesting the competition may intensify.
  • “Inside the Making of Facebook’s Supreme Court” By Kate Klonick — The New Yorker. A very immersive piece on the genesis and design of the Facebook Oversight Board, originally conceived of as a supreme court for content moderation. However, not all content moderation decisions can be referred to the Board; in fact, only when Facebook decides to take down content does a person have a right to appeal. Otherwise, one must depend on the company’s beneficence. So, for example, if Facebook decided to leave up content that is racist toward Muslims, a Facebook user could not appeal the decision. Additionally, Board decisions are not precedential, which, in plain English, means that if the Board decides a takedown of, say, Nazi propaganda comports with Facebook’s rules, the company would not be obligated to take down similar Nazi content thereafter. This latter wrinkle will ultimately serve to limit the power of the Board. The piece quotes critics, including many involved with the design and establishment of the Board, who see the final form as being little more than a fig leaf for public relations.

Other Developments

  • The Department of Health and Human Services (HHS) was taken to task by a federal appeals court in a blunt opinion decrying the agency’s failure to articulate even the most basic rationale for a multi-million dollar fine of a major Houston hospital for its data security and data privacy violations. HHS’ Office for Civil Rights (OCR) had levied a $4.348 million fine on the University of Texas M.D. Anderson Cancer Center (M.D. Anderson) for violations of the regulations promulgated pursuant to the “Health Insurance Portability and Accountability Act of 1996” (P.L. 104–191) and the “Health Information Technology for Economic and Clinical Health Act” (HITECH Act) (P.L. 111-5) governing the security and privacy of certain classes of health information. M.D. Anderson appealed the decision, losing at each stage, until it reached the United States Court of Appeals for the Fifth Circuit (Fifth Circuit). In its ruling, the Fifth Circuit held that OCR’s “decision was arbitrary, capricious, and contrary to law.” The Fifth Circuit vacated the penalty and sent the matter back to HHS for further consideration.
    • In its opinion, the Fifth Circuit explained the facts:
      • First, back in 2012, an M.D. Anderson faculty member’s laptop was stolen. The laptop was not encrypted or password-protected but contained “electronic protected health information (ePHI) for 29,021 individuals.” Second, also in 2012, an M.D. Anderson trainee lost an unencrypted USB thumb drive during her evening commute. That thumb drive contained ePHI for over 2,000 individuals. Finally, in 2013, a visiting researcher at M.D. Anderson misplaced another unencrypted USB thumb drive, this time containing ePHI for nearly 3,600 individuals.
      • M.D. Anderson disclosed these incidents to HHS. Then HHS determined that M.D. Anderson had violated two federal regulations. HHS promulgated both of those regulations under the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and the Health Information Technology for Economic and Clinical Health Act of 2009 (the “HITECH Act”). The first regulation requires entities covered by HIPAA and the HITECH Act to “[i]mplement a mechanism to encrypt” ePHI or adopt some other “reasonable and appropriate” method to limit access to patient data. 45 C.F.R. §§ 164.312(a)(2)(iv), 164.306(d) (the “Encryption Rule”). The second regulation prohibits the unpermitted disclosure of protected health information. Id. § 164.502(a) (the “Disclosure Rule”).
      • HHS also determined that M.D. Anderson had “reasonable cause” to know that it had violated the rules. 42 U.S.C. § 1320d-5(a)(1)(B) (setting out the “reasonable cause” culpability standard). So, in a purported exercise of its power under 42 U.S.C. § 1320d-5 (HIPAA’s enforcement provision), HHS assessed daily penalties of $1,348,000 for the Encryption Rule violations, $1,500,000 for the 2012 Disclosure Rule violations, and $1,500,000 for the 2013 Disclosure Rule violations. In total, HHS imposed a civil monetary penalty (“CMP” or “penalty”) of $4,348,000.
      • M.D. Anderson unsuccessfully worked its way through two levels of administrative appeals. Then it petitioned our court for review. See 42 U.S.C. § 1320a-7a(e) (authorizing judicial review). After M.D. Anderson filed its petition, the Government conceded that it could not defend its penalty and asked us to reduce it by a factor of 10 to $450,000.
  • The Australian Senate Standing Committee for the Scrutiny of Bills has weighed in on both the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 and the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, two major legislative proposals put forth in December 2020. This committee plays a special role in legislating in the Senate, for it must “scrutinise each bill introduced into the Parliament as to whether the bills, by express words or otherwise:
    • (i)  trespass unduly on personal rights and liberties;
    • (ii)  make rights, liberties or obligations unduly dependent upon insufficiently defined administrative powers;
    • (iii)  make rights, liberties or obligations unduly dependent upon non-reviewable decisions;
    • (iv)  inappropriately delegate legislative powers; or
    • (v)  insufficiently subject the exercise of legislative power to parliamentary scrutiny.”
    • Regarding the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 (see here for analysis), the committee explained:
      • The bill seeks to amend the Surveillance Devices Act 2004 (SD Act), the Crimes Act 1914 (Crimes Act) and associated legislation to introduce three new types of warrants available to the Australian Federal Police (AFP) and the Australian Criminal Intelligence Commission (ACIC) for investigating and disrupting online crime. These are:
        • data disruption warrants, which enable the AFP and the ACIC to modify, add, copy or delete data for the purposes of frustrating the commission of serious offences online;
        • network activity warrants, which permit access to devices and networks used by suspected criminal networks; and
        • account takeover warrants, which provide the AFP and the ACIC with the ability to take control of a person’s online account for the purposes of gathering evidence to further a criminal investigation.
    • The committee flagged concerns about the bill in these categories:
      • Authorisation of coercive powers
        • Issuing authority
        • Time period for warrants
        • Mandatory considerations
        • Broad scope of offences
      • Use of coercive powers without a warrant
        • Emergency authorisations
      • Innocent third parties
        • Access to third party computers, communications in transit and account-based data
        • Compelling third parties to provide information
        • Broad definition of ‘criminal network of individuals’
      • Use of information obtained through warrant processes
        • Prohibitions on use
        • Storage and destruction of records
      • Presumption of innocence—certificate constitutes prima facie evidence
      • Reversal of evidential burden of proof
      • Broad delegation of administrative powers
        • Appropriate authorising officers of the ACIC
    • The committee asked for the following feedback from the government on the bill:
      • The committee requests the minister’s detailed advice as to:
        • why it is considered necessary and appropriate to enable law enforcement officers to disrupt or access data or take over an online account without a warrant in certain emergency situations (noting the coercive and intrusive nature of these powers and the ability to seek a warrant via the telephone, fax or email);
        • the appropriateness of retaining information obtained under an emergency authorisation that is subsequently not approved by a judge or AAT member;
        • and the appropriateness of enabling law enforcement agencies to act to conceal any thing done under a warrant after the warrant has ceased to be in force, and whether the bill could be amended to provide a process for obtaining a separate concealment of access warrant if the original warrant has ceased to be in force.
      • The committee requests the minister’s detailed advice as to:
        • the effect of Schedules 1-3 on the privacy rights of third parties and a detailed justification for the intrusion on those rights, in particular:
        • why proposed sections 27KE and 27KP do not specifically require the judge or nominated AAT member to consider the privacy implications for third parties of authorising access to a third party computer or communication in transit;
        • why the requirement that an issuing authority be satisfied that an assistance order is justifiable and proportionate, having regard to the offences to which it would relate, only applies to an assistance order with respect to data disruption warrants, and not to all warrants; and
        • whether the breadth of the definitions of ‘electronically linked group of individuals’ and ‘criminal network of individuals’ can be narrowed to reduce the potential for intrusion on the privacy rights of innocent third parties.
    • The committee requests the minister’s detailed advice as to:
      • whether all of the exceptions to the restrictions on the use, recording or disclosure of protected information obtained under the warrants are appropriate and whether any exceptions are drafted in broader terms than is strictly necessary; and
      • why the bill does not require review of the continued need for the retention of records or reports comprising protected information on a more regular basis than a period of five years.
    • As the explanatory materials do not adequately address these issues, the committee requests the minister’s detailed advice as to:
      • why it is considered necessary and appropriate to provide for evidentiary certificates to be issued in connection with a data disruption warrant or emergency authorisation, a network access warrant, or an account takeover warrant;
      • the circumstances in which it is intended that evidentiary certificates would be issued, including the nature of any relevant proceedings; and
      • the impact that issuing evidentiary certificates may have on individuals’ rights and liberties, including on the ability of individuals to challenge the lawfulness of actions taken by law enforcement agencies.
    • As the explanatory materials do not address this issue, the committee requests the minister’s advice as to why it is proposed to use offence-specific defences (which reverse the evidential burden of proof) in this instance. The committee’s consideration of the appropriateness of a provision which reverses the burden of proof is assisted if it explicitly addresses relevant principles as set out in the Guide to Framing Commonwealth Offences.
    • The committee requests the minister’s advice as to why it is considered necessary to allow for executive level members of staff of the ACIC to be ‘appropriate authorising officers’, in particular with reference to the committee’s scrutiny concerns in relation to the use of coercive powers without judicial authorisation under an emergency authorisation.
    • Regarding the Treasury Laws Amendment (News Media and Digital Platforms Mandatory Bargaining Code) Bill 2020, the committee asserted the bill “seeks to establish a mandatory code of conduct to support the sustainability of the Australian news media sector by addressing bargaining power imbalances between digital platforms and Australian news businesses.” The committee requested less input on this bill:
      • The committee requests the Treasurer’s advice as to why it is considered necessary and appropriate to leave the determination of which digital platforms must participate in the News Media and Digital Platforms Mandatory Bargaining Code to delegated legislation.
      • If it is considered appropriate to leave this matter to delegated legislation, the committee requests the Treasurer’s advice as to whether the bill can be amended to require the positive approval of each House of the Parliament before determinations made under proposed section 52E come into effect.
  • The European Data Protection Board (EDPB) issued a statement “on new draft provisions of the second additional protocol to the Council of Europe Convention on Cybercrime (Budapest Convention),” the second time it has weighed in on the rewrite of “the first international treaty on crimes committed via the Internet and other computer networks, dealing particularly with infringements of copyright, computer-related fraud, child pornography and violations of network security.” The EDPB took issue with the process of meeting and drafting new provisions:
    • Following up on the publication of new draft provisions of the second additional protocol to the Budapest Convention, the EDPB therefore, once again, wishes to provide an expert and constructive contribution with a view to ensure that data protection considerations are duly taken into account in the overall drafting process of the additional protocol, considering that the meetings dedicated to the preparation of the additional protocol are being held in closed sessions and that the direct involvement of data protection authorities in the drafting process has not been foreseen in the T-CY Terms of Reference.
    • The EDPB offered itself again as a resource and key stakeholder that needs to be involved with the effort:
      • In November 2019, the EDPB also published its latest contribution to the consultation on a draft second additional protocol, indicating that it remained available for further contributions and called for an early and more proactive involvement of data protection authorities in the preparation of these specific provisions, in order to ensure an optimal understanding and consideration of data protection safeguards (emphasis in the original).
    • The EDPB further asserted:
      • The EDPB remains fully aware that situations where judicial and law enforcement authorities are faced with a “cross-border situation” with regards to access to personal data as part of their investigations can be a challenging reality and recognises the legitimate objective of enhancing international cooperation on cybercrime and access to information. In parallel, the EDPB reiterates that the protection of personal data and legal certainty must be guaranteed, thus contributing to the objective of establishing sustainable arrangements for the sharing of personal data with third countries for law enforcement purposes, which are fully compatible with the EU Treaties and the Charter of Fundamental Rights of the EU. The EDPB furthermore considers it essential to frame the preparation of the additional protocol within the framework of the Council of Europe core values and principles, and in particular human rights and the rule of law.
  • The European Commission (EC) published a statement on how artificial intelligence (AI) “can transform Europe’s health sector.” The EC sketched out legislation it hopes to introduce soon on regulating AI in the European Union (EU). The EC asserted:
    • A high-standard health system, rich health data and a strong research and innovation ecosystem are Europe’s key assets that can help transform its health sector and make the EU a global leader in health-related artificial intelligence applications. 
    • The use of artificial intelligence (AI) applications in healthcare is increasing rapidly.
    • Before the COVID-19 pandemic, challenges linked to our ageing populations and shortages of healthcare professionals were already driving up the adoption of AI technologies in healthcare. 
    • The pandemic has all but accelerated this trend. Real-time contact tracing apps are just one example of the many AI applications used to monitor the spread of the virus and to reinforce the public health response to it.
    • AI and robotics are also key for the development and manufacturing of new vaccines against COVID-19.
    • The European Commission is currently preparing a comprehensive package of measures to address issues posed by the introduction of AI, including a European legal framework for AI to address fundamental rights and safety risks specific to the AI systems, as well as rules on liability related to new technologies.
  • The House Energy and Commerce Committee Chair Frank Pallone, Jr. (D-NJ) and Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) wrote to Apple CEO Tim Cook “urging review and improvement of Apple’s new App Privacy labels in light of recent reports suggesting they are often misleading or inaccurate.” Pallone and Schakowsky are working from a Washington Post article, in which the paper’s tech columnist learned that Apple’s purported ratings system to inform consumers about the privacy practices of apps is largely illusory and possibly illegally deceptive. Pallone and Schakowsky asserted:
    • According to recent reports, App Privacy labels can be highly misleading or blatantly false. Using software that logs data transmitted to trackers, a reporter discovered that approximately one third of evaluated apps that said they did not collect data had inaccurate labels. For example, a travel app labeled as collecting no data was sending identifiers and other data to a massive search engine and social media company, an app-analytics company, and even a Russian Internet company. A ‘slime simulator’ rated for ages 4 and older had a ‘Data Not Collected’ label, even though the app shares identifying information with major tech companies and shared data about the phone’s battery level, storage, general location, and volume level with a video game software development company.
    • Simplifying and enhancing privacy disclosures is a laudable goal, but consumer trust in privacy labeling approaches may be undermined if Apple’s App Privacy labels disseminate false and misleading information. Without meaningful, accurate information, Apple’s tool of illumination and transparency may become a source of consumer confusion and harm. False and misleading privacy labels can dupe privacy-conscious consumers into downloading data-intensive apps, ultimately eroding the credibility and integrity of the labels. A privacy label without credibility and integrity also may dull the competitive forces encouraging app developers to improve their data practices.
    • A privacy label is no protection if it is false. We urge Apple to improve the validity of its App Privacy labels to ensure consumers are provided meaningful information about their apps’ data practices and that consumers are not harmed by these potentially deceptive practices.
    • Pallone and Schakowsky stated “[t]o better understand Apple’s practices with respect to the privacy labels, we request that you provide written responses to the following questions by February 23, 2021”:
      • 1. Apple has stated that it conducts routine and ongoing audits of the information provided by developers and works with developers to correct any inaccuracies.
        • a. Please detail the process by which Apple audits the privacy information provided by app developers. Please explain how frequently audits are conducted, the criteria by which Apple selects which apps to audit, and the methods for verifying the accuracy of the privacy information provided by apps.
        • b. How many apps have been audited since the implementation of the App Privacy label? Of those, how many were found to have provided inaccurate or misleading information? 
      • 2. Does Apple ensure that App Privacy labels are corrected upon the discovery of inaccuracies or misleading information? If not, why not? For each app that has been found to have provided inaccurate or misleading information, how quickly was that label corrected?
      • 3. Please detail Apple’s enforcement policies when an app fails to provide accurate privacy information for the App Privacy label.
      • 4. Does Apple require more in-depth privacy disclosures and conduct more stringent oversight of apps targeted to children under the age of 13? If not, why not? If so, please describe the additional disclosures required and the oversight actions employed for these apps.
      • 5. Providing clear and easily comprehendible privacy information at the point of sale is certainly valuable, but privacy policies are not static. Does Apple notify users when one of their app’s privacy labels has materially changed? If not, why not? If so, how are users notified of such changes?
  • The United Kingdom’s Department for Digital, Culture, Media & Sport (DCMS) “published its draft rules of the road for governing the future use of digital identities…[and] [i]t is part of plans to make it quicker and easier for people to verify themselves using modern technology and create a process as trusted as using passports or bank statements” according to its press release. The DCMS wants feedback by 11 March 2021 on the draft trust framework. The DCMS stated:
    • Digital identity products allow people to prove who they are, where they live or how old they are. They are set to revolutionise transactions such as buying a house, when people are often required to prove their identity multiple times to a bank, conveyancer or estate agent, and buying age-restricted goods online or in person.
    • The new ‘trust framework’ lays out the draft rules of the road organisations should follow. It includes the principles, policies, procedures and standards governing the use of digital identity to allow for the sharing of information to check people’s identities or personal details, such as a user’s address or age, in a trusted and consistent way. This will enable interoperability and increase public confidence.
    • The framework, once finalised, is expected to be brought into law. It has specific standards and requirements for organisations which provide or use digital identity services including:
      • Having a data management policy which explains how they create, obtain, disclose, protect, and delete data;
      • Following industry standards and best practice for information security and encryption;
      • Telling the user if any changes, for example an update to their address, have been made to their digital identity;
      • Where appropriate, having a detailed account recovery process and notifying users if organisations suspect someone has fraudulently accessed their account or used their digital identity;
      • Following guidance on how to choose secure authenticators for their service.
  • The European Commission (EC) “opened infringement procedures against 24 Member States for failing to enact new EU telecom rules.”
    • The EC asserted:
      • The European Electronic Communications Code modernises the European regulatory framework for electronic communications, to enhance consumers’ choices and rights, for example by ensuring clearer contracts, quality of services, and competitive markets. The Code also ensures higher standards of communication services, including more efficient and accessible emergency communications. Furthermore, it allows operators to benefit from rules incentivising investments in very-high capacity networks, as well as from enhanced regulatory predictability, leading to more innovative digital services and infrastructures.
      • The European Electronic Communications Code that brings the regulatory framework governing the European telecom sector up to date with the new challenges came into force in December 2018, and Member States have had two years to implement its rules. It is a central piece of legislation to achieve Europe’s Gigabit society and ensure full participation of all EU citizens in the digital economy and society.

Coming Events

  • On 18 February, the House Financial Services Committee will hold a hearing titled “Game Stopped? Who Wins and Loses When Short Sellers, Social Media, and Retail Investors Collide” with Reddit Co-Founder and Chief Executive Officer Steve Huffman testifying along with other witnesses.
  • The U.S.-China Economic and Security Review Commission will hold a hearing titled “Deterring PRC Aggression Toward Taiwan” on 18 February.
  • On 24 February, the House Energy and Commerce Committee’s Communications and Technology Subcommittee will hold a hearing titled “Fanning the Flames: Disinformation and Extremism in the Media.”
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Photo by Estúdio Bloom on Unsplash
