Further Reading, Other Developments, and Coming Events (2 February 2021)

Further Reading

  • “I checked Apple’s new privacy ‘nutrition labels.’ Many were false.” By Geoffrey Fowler — The Washington Post. It turns out the blue check mark in Apple’s App Store signifying that an app does not collect personal data is based on the honor system. As the Post’s technology columnist learned, Apple tells users this in very small print: “This information has not been verified by Apple.” And so, as Fowler explains, this would seem contrary to the company’s claims of making user privacy a core value. Also, Apple’s definition of tracking is narrow, suggesting the company may be defining its way to being a champion of privacy. Finally, Apple’s own practices, set against the coming iOS changes meant to defeat Facebook’s and others’ tracking of people across digital space, seem to belie the company’s PR and branding. It would seem like the Federal Trade Commission (FTC) and its overseas counterparts would be interested in such deceptive and unfair practices.
  • “Lawmakers Take Aim at Insidious Digital ‘Dark Patterns’” By Tom Simonite — WIRED. Language in the “California Privacy Rights Act” (CPRA) makes consent gained through the use of “dark patterns” (i.e., all those cognitive tricks online and real-life entities use to slant the playing field against consumers) invalid. However, lest one celebrate that policymakers are addressing these underhanded means of gaining consent or selling things, the yet-to-be-established California Privacy Protection Agency will need to define what dark patterns are and write the regulations barring them. In Washington state, the sponsors of the Washington Privacy Act (SB 5062) copied the CPRA language, setting up the possibility Washington state could follow California. It remains to be seen how, or even if, federal privacy legislation proposals will deal with dark patterns. And they well may, considering that Senators Mark Warner (D-VA) and Deb Fischer (R-NE) introduced the “Deceptive Experiences To Online Users Reduction (DETOUR) Act” (S.1084) in 2019. Moreover, as with the previous article, one might think the Federal Trade Commission (FTC) and its overseas counterparts would be interested in policing dark patterns.
  • “A PR screwup draws unwanted attention to Google’s Saudi data centers” By Issie Lapowsky — Protocol. The best-case scenario is that Google and Snap misstated what cloud infrastructure and content are in the Kingdom of Saudi Arabia, and that privacy and civil liberties groups are unfairly pouncing on the companies for essentially garbling the truth. On the other hand, it may turn out that the companies are routing traffic and content through the repressive regime, allowing a government with an abysmal human rights record to access people’s data. Time may tell what is actually happening, but the two companies are furiously telling the world that there is nothing to see here.
  • “China’s Leader Attacks His Greatest Threat” By John Pomfret — The Atlantic. Xi Jinping, President of the People’s Republic of China (PRC) and Chairman of the Chinese Communist Party (CCP), has accelerated a crackdown on entrepreneurs and technology companies that his predecessors fostered. This crackdown could ultimately impair the PRC’s ambition of becoming the world’s dominant power through technological superiority.
  • “Why Is Big Tech Policing Speech? Because the Government Isn’t” By Emily Bazelon — The New York Times. The First Amendment to the United States (U.S.) Constitution is invariably cited in the online speech debate, both as a reason why people cannot be silenced and as a reason why social media platforms can silence whom they like. This is an interesting survey of this right in the U.S. and how democracies in Europe have a different understanding of permissible speech.

Other Developments

  • In a recent press conference, White House Press Secretary Jen Psaki shed light on how the Biden Administration will change United States (U.S.) policy towards the People’s Republic of China (PRC). In response to a question about how the U.S. government will deal with TikTok and the PRC generally, Psaki stated:
    • I think our approach to China remains what it has been since — for the last months, if not longer.  We’re in a serious competition with China.  Strategic competition with China is a defining feature of the 21st century.  China is engaged in conduct that hurts American workers, blunts our technological edge, and threatens our alliances and our influence in international organizations.  
    • What we’ve seen over the last few years is that China is growing more authoritarian at home and more assertive abroad.  And Beijing is now challenging our security, prosperity, and values in significant ways that require a new U.S. approach. 
    • And this is one of the reasons, as we were talking about a little bit earlier, that we want to approach this with some strategic patience, and we want to conduct reviews internally, through our interagency….We wanted to engage more with Republicans and Democrats in Congress to discuss the path forward.  And most importantly, we want to discuss this with our allies. 
    • We believe that this moment requires a strategic and a new approach forward.
    • [T]echnology, as I just noted, is, of course, at the center of the U.S.-China competition.  China has been willing to do whatever it takes to gain a technological advantage — stealing intellectual property, engaging in industrial espionage, and forcing technology transfer.
    • Our view — the President’s view is we need to play a better defense, which must include holding China accountable for its unfair and illegal practices and making sure that American technologies aren’t facilitating China’s military buildup.
    • So he’s firmly committed to making sure that Chinese companies cannot misappropriate and misuse American data.  And we need a comprehensive strategy, as I’ve said, and a more systematic approach that actually addresses the full range of these issues.
    • So there is, again, an ongoing review of a range of these issues.  We want to look at them carefully, and we’ll be committed to approaching them through the lens of ensuring we’re protecting U.S. data and America’s technological edge. 
  • The top Republican on the House Foreign Affairs Committee is calling on Senate Republicans to block Governor Gina Raimondo’s nomination to be the Secretary of Commerce until the White House indicates whether it will keep Huawei on a list of entities to whom the United States (U.S.) restricts exports. Ranking Member Michael McCaul (R-TX) asserted:
    • It is incredibly alarming the Biden Administration has refused to commit to keeping Huawei on the Department of Commerce’s Entity List. Huawei is not a normal telecommunications company – it is a CCP military company that threatens 5G security in our country, steals U.S. intellectual property, and supports the Chinese Communist Party’s genocide in Xinjiang and their human rights abuses across the country. We need a Commerce Department with strong national security credentials and a Secretary with a clear understanding of the CCP threat. Saying people should not use Huawei and actually keeping them on the Entity List are two very different things that result in very different outcomes. I again strongly urge the Biden Administration to reconsider this dangerous position. Until they make their intentions clear on whether they will keep Huawei on the Entity List, I urge my Senate colleagues to hold Ms. Raimondo’s confirmation.
    • McCaul added this background:
      • After the Biden Administration’s nominee for Commerce Secretary, Gina Raimondo, caused heads to turn by refusing to commit to keeping Huawei on the Entity List, White House Press Secretary Jen Psaki seemed to double down by declining on two separate occasions when directly asked to say where President Biden stood on the issue.
      • Huawei was placed on the Commerce Department’s Entity List in August of 2019. Their addition to the Entity List was also one of the recommendations of the [House Republican’s] China Task Force Report.
  • The National Highway Traffic Safety Administration (NHTSA), an agency of the United States (U.S.) Department of Transportation (DOT) is asking for comment “on the Agency’s updated draft cybersecurity best practices document titled Cybersecurity Best Practices for the Safety of Modern Vehicles” according to the notice published in the Federal Register. Comments are due by 15 March 2021. NHTSA explained:
    • In October 2016, NHTSA issued its first best practices document focusing on the cybersecurity of motor vehicles and motor vehicle equipment. Cybersecurity Best Practices for Modern Vehicles (“2016 Best Practices”) was the culmination of years of extensive engagement with public and private stakeholders and NHTSA research on vehicle cybersecurity and methods of enhancing vehicle cybersecurity industry-wide. As explained in the accompanying Federal Register document, NHTSA’s 2016 Best Practices was released with the goal of supporting industry-led efforts to improve the industry’s cybersecurity posture and provide the Agency’s views on how the automotive industry could develop and apply sound risk-based cybersecurity management processes during the vehicle’s entire lifecycle.
    • The 2016 Best Practices leveraged existing automotive domain research as well as non-automotive and IT-focused standards such as the National Institute of Standards and Technology (NIST) Cybersecurity Framework and the Center for Internet Security’s Critical Security Controls framework. NHTSA considered these sources to be reasonably applicable and appropriate to augment the limited industry-specific guidance that was available at the time. At publication, NHTSA noted that the 2016 Best Practices were intended to be updated with new information, research, and other cybersecurity best practices related to the automotive industry. NHTSA invited comments from stakeholders and interested parties in response to the document.
    • NHTSA is docketing a draft update to the agency’s 2016 Best Practices, titled Cybersecurity Best Practices for the Safety of Modern Vehicles (2020 Best Practices) for public comments. This update builds upon agency research and industry progress since 2016, including emerging voluntary industry standards, such as the ISO/SAE Draft International Standard (DIS) 21434, “Road Vehicles—Cybersecurity Engineering.” In addition, the draft update references a series of industry best practice guides developed by the Auto-ISAC through its members.
    • The 2020 Best Practices also reflect findings from NHTSA’s continued research in motor vehicle cybersecurity, including over-the-air updates, encryption methods, and building our capability in cybersecurity penetration testing and diagnostics, and the new learnings obtained through researcher and stakeholder engagement. Finally, the updates included in the 2020 Best Practices incorporate insights gained from public comments received in response to the 2016 guidance and from information obtained during the annual SAE/NHTSA Vehicle Cybersecurity Workshops.
  • Ireland’s Data Protection Commission (DPC) has released its Fundamentals for a Child-Oriented Approach to Data Processing, Draft Version for Consultation (Fundamentals), for consultation until 31 March 2021. The DPC asserted the
    • Fundamentals have been drawn up by the Data Protection Commission (DPC) to drive improvements in standards of data processing. They introduce child-specific data protection interpretative principles and recommended measures that will enhance the level of protection afforded to children against the data processing risks posed to them by their use of/ access to services in both an online and offline world. In tandem, the Fundamentals will assist organisations that process children’s data by clarifying the principles, arising from the high-level obligations under the GDPR, to which the DPC expects such organisations to adhere.
    • The DPC “identified the following 14 Fundamentals that organisations should follow to enhance protections for children in the processing of their personal data:
      • 1. FLOOR OF PROTECTION: Online service providers should provide a “floor” of protection for all users, unless they take a risk-based approach to verifying the age of their users so that the protections set out in these Fundamentals are applied to all processing of children’s data (Section 1.4 “Complying with the Fundamentals”).
      • 2. CLEAR-CUT CONSENT: When a child has given consent for their data to be processed, that consent must be freely given, specific, informed and unambiguous, made by way of a clear statement or affirmative action (Section 2.4 “Legal bases for processing children’s data”).
      • 3. ZERO INTERFERENCE: Online service providers processing children’s data should ensure that the pursuit of legitimate interests do not interfere with, conflict with or negatively impact, at any level, the best interests of the child (Section 2.4 “Legal bases for processing children’s data”).
      • 4. KNOW YOUR AUDIENCE: Online service providers should take steps to identify their users and ensure that services directed at/ intended for or likely to be accessed by children have child-specific data protection measures in place (Section 3.1 “Knowing your audience”)
      • 5. INFORMATION IN EVERY INSTANCE: Children are entitled to receive information about the processing of their own personal data irrespective of the legal basis relied on and even if consent was given by a parent on their behalf to the processing of their personal data (Section 3 “Transparency and children”).
      • 6. CHILD-ORIENTED TRANSPARENCY: Privacy information about how personal data is used must be provided in a concise, transparent, intelligible and accessible way, using clear and plain language that is comprehensible and suited to the age of the child (Section 3 “Transparency and children”).
      • 7. LET CHILDREN HAVE THEIR SAY: Online service providers shouldn’t forget that children are data subjects in their own right and have rights in relation to their personal data at any age. The DPC considers that a child may exercise these rights at any time, as long as they have the capacity to do so and it is in their best interests. (Section 4.1 “The position of children as rights holders”)
      • 8. CONSENT DOESN’T CHANGE CHILDHOOD: Consent obtained from children or from the guardians/ parents should not be used as a justification to treat children of all ages as if they were adults (Section 5.1 “Age of digital consent”).
      • 9. YOUR PLATFORM, YOUR RESPONSIBILITY: Companies who derive revenue from providing or selling services through digital and online technologies pose particular risks to the rights and freedoms of children. Where such a company uses age verification and/ or relies on parental consent for processing, the DPC will expect it to go the extra mile in proving that its measures around age verification and verification of parental consent are effective. (Section 5.2 “Verification of parental consent”)
      • 10. DON’T SHUT OUT CHILD USERS OR DOWNGRADE THEIR EXPERIENCE: If your service is directed at, intended for, or likely to be accessed by children, you can’t bypass your obligations simply by shutting them out or depriving them of a rich service experience. (Section 5.4 “Age verification and the child’s user experience”)
      • 11. MINIMUM USER AGES AREN’T AN EXCUSE: Theoretical user age thresholds for accessing services don’t displace the obligations of organisations to comply with the controller obligations under the GDPR and the standards and expectations set out in these Fundamentals where “underage” users are concerned. (Section 5.5 “Minimum user ages”)
      • 12. PROHIBITION ON PROFILING: Online service providers should not profile children and/ or carry out automated decision making in relation to children, or otherwise use their personal data, for marketing/advertising purposes due to their particular vulnerability and susceptibility to behavioural advertising, unless they can clearly demonstrate how and why it is in the best interests of the child to do so (Section 6.2 “Profiling and automated decision making”).
      • 13. DO A DPIA: Online service providers should undertake data protection impact assessments to minimise the data protection risks of their services, and in particular the specific risks to children which arise from the processing of their personal data. The principle of the best interests of the child must be a key criterion in any DPIA and must prevail over the commercial interests of an organisation in the event of a conflict between the two sets of interests (Section 7.1 “Data Protection Impact Assessments”).
      • 14. BAKE IT IN: Online service providers that routinely process children’s personal data should, by design and by default, have a consistently high level of data protection which is “baked in” across their services (Section 7.2 “Data Protection by Design and Default”)
  • The United Kingdom’s (UK) Competition and Markets Authority (CMA) “is now seeking evidence from academics and industry experts on the potential harms to competition and consumers caused by the deliberate or unintended misuse of algorithms…[and] is also looking for intelligence on specific issues with particular firms that the CMA could examine and consider for future action.” CMA stated “[t]he research and feedback will inform the CMA’s future work in digital markets, including its programme on analysing algorithms and the operation of the new Digital Markets Unit (DMU), and the brand-new regulatory regime that the DMU will oversee.” The CMA stated:
    • Algorithms can be used to personalise services in ways that are difficult to detect, leading to search results that can be manipulated to reduce choice or artificially change consumers’ perceptions. An example of this is misleading messages which suggest a product is in short supply.
    • Companies can also use algorithms to change the way they rank products on websites, preferencing their own products and excluding competitors. More complex algorithms could aid collusion between businesses without firms directly sharing information. This could lead to sustained higher prices for products and services.
    • The majority of algorithms used by private firms online are currently subject to little or no regulatory oversight and the research concludes that more monitoring and action is required by regulators, including the CMA. The CMA has already considered the impact of algorithms on competition and consumers in previous investigations, for example monitoring the pricing practices of online travel agents.
    • In the algorithms paper, the CMA explained:
      • The publication of this paper, and the accompanying call for information mark the launch of a new CMA programme of work on analysing algorithms, which aims to develop our knowledge and help us better identify and address harms. This paper reviews the potential harms to competition and consumers from the use of algorithms, focussing on those the CMA or other national competition or consumer authorities may be best placed to address.
      • We first describe direct harms to consumers, many of which involve personalisation. Personalisation can be harmful because it is difficult to detect either by consumers or others, targets vulnerable consumers or has unfair distributive effects. These harms often occur through the manipulation of consumer choices, without the awareness of the consumer.
      • The paper then explores how the use of algorithms can exclude competitors and so reduce competition (for example, a platform preferencing its own products). We outline the most recent developments in the algorithmic collusion literature; collusion appears an increasingly significant risk if the use of more complex pricing algorithms becomes widespread. We also describe how using ineffective algorithms to oversee platform activity fails to prevent harm.
      • Next, we summarise techniques that could be used to analyse algorithmic systems. Potentially problematic systems can be identified even without access to underlying algorithms and data. However, to understand fully how an algorithmic system works and whether consumer or competition law is being breached, regulators need appropriate methods to audit the system. We finally discuss the role of regulators. Regulators can help to set standards and facilitate better accountability of algorithmic systems, including support for the development of ethical approaches, guidelines, tools and principles. They can also use their information gathering powers to identify and remedy harms on either a case-by-case basis or as part of an ex-ante regime overseen by a regulator of technology firms, such as the proposed Digital Markets Unit (DMU) in the UK.
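  • The self-preferencing harm the CMA describes can be made concrete with a toy sketch. The names, numbers, and the size of the hidden boost below are entirely hypothetical and are not drawn from the CMA paper; the point is only that a small, undisclosed bonus for first-party listings can flip a ranking against a more relevant rival:

```python
# Hypothetical illustration of self-preferencing in a ranking algorithm.
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    relevance: float      # genuine match quality for the query, 0..1
    first_party: bool     # True if sold by the platform itself

def rank(listings, own_boost=0.3):
    """Sort listings by relevance plus a hidden bonus for first-party items.

    With own_boost > 0, the platform's own product can outrank a more
    relevant competitor, which is the kind of exclusionary effect the
    CMA paper discusses.
    """
    return sorted(
        listings,
        key=lambda l: l.relevance + (own_boost if l.first_party else 0.0),
        reverse=True,
    )

results = rank([
    Listing("Rival widget", relevance=0.9, first_party=False),
    Listing("Platform widget", relevance=0.7, first_party=True),
])
print([l.name for l in results])  # the less relevant first-party widget ranks first
```

  As the sketch suggests, the bias is invisible to an outside observer who sees only the final ordering, which is why the paper argues regulators need information-gathering powers to audit such systems rather than relying on what is publicly observable.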
  • The National Institute of Standards and Technology (NIST) is making available for comment a draft of NIST Special Publication (SP) 800-47 Revision 1, Managing the Security of Information Exchanges, that “provides guidance on identifying information exchanges; risk-based considerations for protecting exchanged information before, during, and after the exchange; and example agreements for managing the protection of the exchanged information.” NIST is accepting comments through 12 March 2021. The agency stated:
    • Rather than focus on any particular type of technology-based connection or information access, this draft publication has been updated to define the scope of information exchange, describe the benefits of securely managing the information exchange, identify types of information exchanges, discuss potential security risks associated with information exchange, and detail a four-phase methodology to securely manage information exchange between systems and organizations. Organizations are expected to further tailor the guidance to meet specific organizational needs and requirements.
    • NIST is specifically interested in feedback on:
      • Whether the agreements addressed in the draft publication represent a comprehensive set of agreements needed to manage the security of information exchange.
      • Whether the matrix provided to determine what types of agreements are needed is helpful in determining appropriate agreement types.
      • Whether additional agreement types are needed, as well as examples of additional agreements.
      • Additional resources to help manage the security of information exchange.

Coming Events

  • On 3 February, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Rhode Island Governor Gina Raimondo to be the Secretary of Commerce.
  • On 17 February, the Federal Communications Commission (FCC) will hold an open meeting, its first under acting Chair Jessica Rosenworcel, with this tentative agenda:
    • Presentation on the Emergency Broadband Benefit Program. The Commission will hear a presentation on the creation of an Emergency Broadband Benefit Program. Congress charged the FCC with developing a new $3.2 billion program to help Americans who are struggling to pay for internet service during the pandemic.
    • Presentation on COVID-19 Telehealth Program. The Commission will hear a presentation about the next steps for the agency’s COVID-19 Telehealth program. Congress recently provided an additional $249.95 million to support the FCC’s efforts to expand connected care throughout the country and help more patients receive health care safely.
    • Presentation on Improving Broadband Mapping Data. The Commission will hear a presentation on the work the agency is doing to improve its broadband maps. Congress directly appropriated $65 million to help the agency develop better data for improved maps.
    • Addressing 911 Fee Diversion. The Commission will consider a Notice of Proposed Rulemaking that would implement section 902 of the Don’t Break Up the T-Band Act of 2020, which requires the Commission to take action to help address the diversion of 911 fees by states and other jurisdictions for purposes unrelated to 911. (PS Docket Nos. 20-291, 09-14)
    • Implementing the Secure and Trusted Communications Networks Act. The Commission will consider a Third Further Notice of Proposed Rulemaking that proposes to modify FCC rules consistent with changes that were made to the Secure and Trusted Communications Networks Act in the Consolidated Appropriations Act, 2021. (WC Docket No. 18-89)
  • On 27 July 2021, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Image by John Howard from Pixabay
