Other Developments, Further Reading, and Coming Events (27 April 2021)

Other Developments

  • The Australian Competition & Consumer Commission (ACCC) lauded its partial victory in Australian federal court against Google that found the company “misled consumers about personal location data collected through Android mobile devices between January 2017 and December 2018.” The ACCC conceded the court “dismissed the ACCC’s allegations about certain statements Google made about the methods by which consumers could prevent Google from collecting and using their location data, and the purposes for which personal location data was being used by Google.” The ACCC asserted:
    • The Court ruled that when consumers created a new Google Account during the initial set-up process of their Android device, Google misrepresented that the ‘Location History’ setting was the only Google Account setting that affected whether Google collected, kept or used personally identifiable data about their location. In fact, another Google Account setting titled ‘Web & App Activity’ also enabled Google to collect, store and use personally identifiable location data when it was turned on, and that setting was turned on by default.
    • The Court also found that when consumers later accessed the ‘Location History’ setting on their Android device during the same time period to turn that setting off, they were also misled because Google did not inform them that by leaving the ‘Web & App Activity’ setting switched on, Google would continue to collect, store and use their personally identifiable location data.
    • Similarly, between 9 March 2017 and 29 November 2018, when consumers later accessed the ‘Web & App Activity’ setting on their Android device, they were misled because Google did not inform them that the setting was relevant to the collection of personal location data.
    • The Court also found that Google’s conduct was liable to mislead the public.
  • The European Commission (EC) rolled out its proposed legislation for regulating artificial intelligence (AI). The EC asserted:
    • The combination of the first-ever legal framework on AI and a new Coordinated Plan with Member States will guarantee the safety and fundamental rights of people and businesses, while strengthening AI uptake, investment and innovation across the EU. New rules on Machinery will complement this approach by adapting safety rules to increase users’ trust in the new, versatile generation of products.
    • The new rules will be applied directly in the same way across all Member States based on a future-proof definition of AI. They follow a risk-based approach:
    • Unacceptable risk: AI systems considered a clear threat to the safety, livelihoods and rights of people will be banned. This includes AI systems or applications that manipulate human behaviour to circumvent users’ free will (e.g. toys using voice assistance encouraging dangerous behaviour of minors) and systems that allow ‘social scoring’ by governments.
    • High-risk: AI systems identified as high-risk include AI technology used in:
      • Critical infrastructures (e.g. transport), that could put the life and health of citizens at risk;
      • Educational or vocational training, that may determine the access to education and professional course of someone’s life (e.g. scoring of exams);
      • Safety components of products (e.g. AI application in robot-assisted surgery);
      • Employment, workers management and access to self-employment (e.g. CV-sorting software for recruitment procedures);
      • Essential private and public services (e.g. credit scoring denying citizens opportunity to obtain a loan);
      • Law enforcement that may interfere with people’s fundamental rights (e.g. evaluation of the reliability of evidence);
      • Migration, asylum and border control management (e.g. verification of authenticity of travel documents);
      • Administration of justice and democratic processes (e.g. applying the law to a concrete set of facts).
    • High-risk AI systems will be subject to strict obligations before they can be put on the market:
      • Adequate risk assessment and mitigation systems;
      • High quality of the datasets feeding the system to minimise risks and discriminatory outcomes;
      • Logging of activity to ensure traceability of results;
      • Detailed documentation providing all information necessary on the system and its purpose for authorities to assess its compliance;
      • Clear and adequate information to the user;
      • Appropriate human oversight measures to minimise risk;
      • High level of robustness, security and accuracy.
    • In particular, all remote biometric identification systems are considered high risk and subject to strict requirements. Their live use in publicly accessible spaces for law enforcement purposes is prohibited in principle. Narrow exceptions are strictly defined and regulated (such as where strictly necessary to search for a missing child, to prevent a specific and imminent terrorist threat or to detect, locate, identify or prosecute a perpetrator or suspect of a serious criminal offence). Such use is subject to authorisation by a judicial or other independent body and to appropriate limits in time, geographic reach and the databases searched.
    • Limited risk, i.e. AI systems with specific transparency obligations: When using AI systems such as chatbots, users should be aware that they are interacting with a machine so they can take an informed decision to continue or step back.
    • Minimal risk: The legal proposal allows the free use of applications such as AI-enabled video games or spam filters. The vast majority of AI systems fall into this category. The draft Regulation does not intervene here, as these AI systems represent only minimal or no risk for citizens’ rights or safety.
    • In terms of governance, the Commission proposes that national competent market surveillance authorities supervise the new rules, while the creation of a European Artificial Intelligence Board will facilitate their implementation, as well as drive the development of standards for AI. Additionally, voluntary codes of conduct are proposed for non-high-risk AI, as well as regulatory sandboxes to facilitate responsible innovation.
  • A class action against Disney, Viacom, Twitter, and other companies over the advertising and tracking software they placed in popular free children’s games has been settled. The companies have agreed to discontinue the practice. As explained in the notice of class action settlement:
    • Separate class action settlements (the “Settlements”) have been reached with 15 Defendants affecting parents and guardians of children, including teens, who have played certain games on smartphones and other mobile devices. The Settlements resolve three separate lawsuits alleging that Defendants used or allowed tracking technology included in various mobile gaming apps played by children, including teens, to serve them targeted advertisements. Defendants deny the allegations in the lawsuits, and the Court has not made a determination regarding Plaintiffs’ allegations. In each Settlement, the Defendant agrees to implement or continue certain business practices for covered gaming apps on mobile devices. The Settlements do not provide money compensation to the class members, and class members do not release any claims for monetary damages. Class Counsel will request that the Court award them reasonable attorneys’ fees and expenses as compensation for their obtaining Defendants’ agreements to make certain changes to their business practices.
    • Developer Defendant Settlements – each Developer Defendant in the Kiloo Action, Disney Action, and Viacom Action has agreed to implement or continue certain business practices to the apps at issue in order to better ensure that children do not have any of their data collected for certain advertising purposes.
    • SDK Defendant Settlements – each SDK Defendant in the Kiloo Action, Disney Action, and Viacom Action has agreed to make changes to its data practices and its client onboarding processes and/or dashboards (and, in certain instances, to continue to engage in its current best practices with the apps at issue) to further enhance the ability of the SDK or its developer client to limit the collection of data from children, including teens, under certain ages (which varies depending upon the specific Settlement) for certain advertising purposes.
  • The United States (U.S.) Department of Commerce’s Bureau of Industry and Security (BIS) issued a final rule that eased some of the export controls on “mass market” encryption products. BIS explained it was working from an agreement reached under the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies (WA). The agency noted:
    • BIS published a final rule on October 25, 2020 (85 FR 62583) implementing certain new controls on emerging technologies, as approved at the December 2019 WA Plenary meeting. The changes in this rule, which represent the remaining approved changes to the WA control lists, update the corresponding items listed in the EAR and reflect recent technical advancements and clarifications. Unless explicitly discussed below, the revisions made by this rule will not impact the number of license applications submitted to BIS.
    • BIS is amending various provisions in the EAR related to items in Category 5—Part 2, including by eliminating reporting requirements in order to reduce exporters’ regulatory burdens. In summary, this rule makes the following changes: (1) Eliminates the email notification requirement for ‘publicly available’ encryption source code and beta test encryption software, except for ‘publicly available’ encryption source code and beta test encryption software implementing “non-standard cryptography”; (2) eliminates the self-classification reporting requirement for certain ‘mass market’ encryption products under § 740.17(b)(1); and (3) allows self-classification reporting for ECCN 5A992.c or 5D992.c components of ‘mass market’ products (and their ‘executable software’). This rule moves “mass market” “components,” ‘executable software’, toolsets, and toolkits out of § 740.17(b)(3)(i) and into (b)(1). Of those four items, only “mass market” “components” and ‘executable software’ are subject to self-classification reporting. Mass market toolsets and toolkits are not subject to self-classification reporting.
    • This rule does not change any of the License Exception ENC requirements for any non-‘mass market’ encryption item, or for any encryption item (‘mass market’ or not) that implements “non-standard cryptography”.
  • The European Commission (EC) issued “a new EU Strategy to tackle Organised Crime” that focuses on “boosting law enforcement and judicial cooperation, tackling organised crime structures and high priority crimes, removing criminal profits and ensuring a modern response to technological developments.” The EC further claimed “[t]he Strategy aims to:
    • Boost law enforcement and judicial cooperation: With 65% of the criminal groups active in the EU composed of multiple nationalities, effective exchange of information among law enforcement and judicial authorities across the EU is key to effectively tackle organised crime. The Commission will expand, modernise and reinforce funding for the European multidisciplinary platform against criminal threats (EMPACT), the structure that since 2010 brings together all relevant European and national authorities to identify priority crime threats and address them collectively. The Commission will propose to upgrade the ‘Prüm’ framework for exchanging information on DNA, fingerprints and vehicle registration. To make sure that law enforcement across the EU can work together better under a modern rulebook, the Commission will propose an EU Police Cooperation Code which will streamline the current patchwork of various EU tools and multi-lateral cooperation agreements. Achieving the 2023 objective to make information systems for security, border and migration management interoperable will help law enforcement better detect and combat identity fraud often used by criminals. Finally, to better tackle criminal networks operating internationally, the Commission is also proposing today to start negotiating a cooperation agreement with Interpol.  
    • Support more effective investigations to disrupt organised crime structures and focusing on high and specific priority crimes: There is a need to step up cooperation at EU level to dismantle organised crime structures. To ensure an effective response to specific forms of crime, the Commission will propose to revise the EU rules against environmental crime and will establish an EU toolbox against counterfeiting, notably of medical products. It will present measures to address the illicit trade in cultural goods. The Commission is also presenting today a Strategy dedicated to combatting trafficking in human beings. 
    • Make sure crime does not pay: Over 60% of criminal networks active in the EU engage in corruption and more than 80% use legitimate businesses as a front for their activities, while only 1% of criminal assets is confiscated. Tackling criminal finances is key to uncover, punish and deter crime. The Commission will propose to revise the EU rules on confiscating criminal profits, develop the EU anti-money laundering rules, promote the early launch of financial investigations and assess the existing EU anti-corruption rules. This will also help prevent infiltration into the legal economy.  
    • Make law enforcement and the judiciary fit for the digital age: Criminals communicate and commit crimes online and leave digital traces online. With 80% of crimes having a digital component, law enforcement and the judiciary need swift access to digital leads and evidence. They also need to use modern technology and be equipped with tools and skills to keep up with modern crime modi operandi. 
    • The Commission will analyse and outline possible approaches to data retention as well as propose a way forward to address a lawful and targeted access to encrypted information in the context of criminal investigations and prosecutions that would also protect security and the confidentiality of communications. The Commission will also work with relevant EU Agencies to provide national authorities with the tools, knowledge and operational expertise needed to conduct digital investigations.  
  • The Meeting of Attorneys-General (MAG) which “comprises Attorneys-General from the Australian Government, all states and territories, and the New Zealand Minister for Justice” has published a discussion paper as part of the second stage of its work on “Model Defamation Provisions” “focusing on the responsibilities and liability of digital platforms for defamatory content published online, as well as any other issues relating to defamation law.” MAG added:
    • This Discussion Paper is the first step in the second stage of the review of the MDPs. It comprises two parts:
      • Part A addresses the question of internet intermediary liability in defamation for the publication of third-party content. It suggests options for reform that reflect the potential spectrum of liability for internet intermediaries.
      • Part B considers whether defamation law is having a chilling effect on reports of alleged criminal conduct to police and statutory investigative bodies and on reports of unlawful conduct to disciplinary bodies and employers. It includes a series of questions for stakeholders about the potential benefits and risks of extending absolute privilege to these circumstances.
  • The Federal Communications Commission (FCC) has opened a docket on Verizon’s proposed purchase of TracFone from América Móvil S.A.B. de C.V. The agency explained “Verizon filed the application as part of its acquisition of TracFone, a mobile virtual network operator that offers prepaid services aimed at value-conscious consumers, and serves almost 21 million customers, including 1.7 million Lifeline customers.”
  • The Office of the Privacy Commissioner of Canada posted on its blog about Privacy-Enhancing Technologies (PET) and how their use may aid data privacy:
    • Your business might rely on the cloud or on Internet-enabled technologies—such as tablets, mobile phones, and smart devices (part of the Internet of Things) to deliver your services, analyze your data, and inform your business decisions. Ensuring that your organization’s and clients’ data remains private and secure is of the utmost importance. There are a number of tools you can use to protect the data you create and collect. These tools are known as “privacy-enhancing technologies,” or PETs.
    • Since the release of our last report on PETs, there have been several significant technical developments in the field. In the coming months, we’ll focus some of our Tech-Know blogs on a few of the PETs that have emerged since that report, including:
      • federated learning
      • differential privacy
      • homomorphic encryption
      • secure multiparty computation
    • This post examines federated learning and differential privacy. These PETs are still in the process of being refined and developed for widespread use, and very few organizations have implemented them.
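To make the first of those two PETs concrete, here is a minimal sketch of federated averaging, the aggregation step at the heart of federated learning. Everything below is our own illustration rather than the Commissioner’s or any framework’s code: “training” is reduced to nudging a numeric weight vector toward the mean of each client’s data, so the example stays self-contained while still showing the key property that raw records never leave the client.

```python
def local_update(weights, local_data, lr=0.1):
    """One simulated client step: nudge the shared weights toward the
    mean of the client's own data, which never leaves the device."""
    local_mean = sum(local_data) / len(local_data)
    return [w + lr * (local_mean - w) for w in weights]

def federated_average(client_updates):
    """Server side of federated averaging: combine the clients' weight
    vectors. The server sees only these vectors, never raw records."""
    n = len(client_updates)
    return [sum(update[i] for update in client_updates) / n
            for i in range(len(client_updates[0]))]

# Three simulated clients, each holding a private dataset.
global_weights = [0.0, 0.0]
clients = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]
updates = [local_update(global_weights, data) for data in clients]
global_weights = federated_average(updates)  # one federated round
```

Real federated learning systems weight each client’s contribution by its sample count, run many rounds, and typically add secure aggregation on top; this single unweighted round is only meant to show where the data stays and what the server actually receives.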
    • Our upcoming blog posts will offer businesses some background information about these new PETs, and how they might be useful for better data privacy. If you hope to implement these emerging PETs in your business, we recommend following their development at academic and industry events.
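Differential privacy, the other PET examined above, can likewise be illustrated in a few lines: a counting query answered with Laplace noise calibrated to the query’s sensitivity. The function name and parameters are our own hypothetical sketch, not a production mechanism.

```python
import math
import random

def dp_count(records, predicate, epsilon=1.0):
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1 (adding or removing one record changes
    the true answer by at most 1), so Laplace noise with scale
    1/epsilon suffices for the epsilon guarantee."""
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) by inverting its CDF; the max()
    # guard avoids log(0) at the edge of the uniform draw.
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) \
        * math.log(max(1.0 - 2.0 * abs(u), 1e-12))
    return true_count + noise

# Hypothetical use: how many records belong to adults? The analyst sees
# only the noisy answer; a smaller epsilon means more noise, more privacy.
ages = [15, 22, 34, 17, 41, 29]
noisy_adults = dp_count(ages, lambda a: a >= 18, epsilon=0.5)
```

The design choice to show is the trade-off in `epsilon`: the noise scale is 1/epsilon, so tightening the privacy guarantee directly reduces the accuracy of the released count.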

Further Reading

  • “The Ease of Tracking Mobile Phones of U.S. Soldiers in Hot Spots” By Byron Tau — The Wall Street Journal. As if it were not alarming enough that a former defense contractor inadvertently figured out how to track United States (U.S.) military personnel across the globe, the Wall Street Journal repeated the feat with data bought from a data broker. The People’s Republic of China (PRC) treats personal data as a national asset while the U.S. does not. This does not augur well for national security.
  • “As schools experiment to close the homework gap, will new E-rate funding help?” By Issie Lapowsky and Penelope Blackwell — Protocol. How the Federal Communications Commission (FCC) implements the new Emergency Connectivity Fund (and its $7.2 billion in funding) may determine if the digital homework gap will narrow. Internet service providers and telecommunications companies are dead set against school districts spending the funds building out their own networks and want them to spend the funds on equipment. The FCC has committed to rolling out the details of how these funds can be spent in the next few weeks.
  • “Report: China, Russia fueling QAnon conspiracy theories” By Michael Isikoff — yahoo! news. Based on the Soufan Center’s research, this article argues that as much as a fifth of QAnon content comes from outside the United States (U.S.) and quite likely from the Russian Federation and the People’s Republic of China (PRC). The degree to which the governments of each nation are implicated is unclear.
  • “Wrongfully arrested man sues Detroit police over false facial recognition match” By Drew Harwell — The Washington Post. A man falsely accused of shoplifting a watch mostly on account of a misidentification from facial recognition technology is suing the Detroit Police Department. This is the third such suit resulting from a misidentification that led to an arrest and detention.
  • “Big Tech Is Pushing States to Pass Privacy Laws, and Yes, You Should Be Suspicious” By Todd Feathers — The Markup. There are new fronts in the fight to enact privacy legislation in the United States (U.S.) and industry seems to be winning thus far. After the lightning quick enactment of the weak bill in Virginia, the tech industry is pushing for similar if not weaker bills in other states. This trend may strengthen industry’s hand in Congress.
  • “Quantum technology emerges from the lab to spark a mini start-up boom” By Jeanne Whalen — The Washington Post. It appears we are at the dawn of the quantum computing age which may revolutionize life in the coming years the same way current computers and the internet did. And, as this article explains, Chicago may be ground zero.
  • “Google’s Secret ‘Project Bernanke’ Revealed in Texas Antitrust Case” By Jeff Horwitz and Keach Hagey — The Wall Street Journal. One group of attorneys general suing Google for antitrust and anti-competitive conduct in online advertising markets filed documents showing the company had a secret program to mine past bids to aid its clients and possibly harm competing ad exchanges and publishers.

Coming Events

  • On 27 April, the Senate Homeland Security and Governmental Affairs Committee’s Emerging Threats and Spending Oversight Subcommittee will hold a hearing titled “Controlling Federal Legacy IT Costs and Crafting 21st Century IT Management Solutions.”
  • The Senate Commerce, Science, and Transportation Committee’s Consumer Protection, Product Safety, and Data Security Subcommittee will hold a hearing titled “Curbing COVID Cons: Warning Consumers about Pandemic Frauds, Scams, and Swindles” on 27 April.
  • On 27 April, the Senate Commerce, Science, and Transportation Committee’s Surface Transportation, Maritime, Freight, and Ports Subcommittee will hold a hearing titled “Driving Innovation: the Future of Automotive Mobility, Safety, and Technology.”
  • The Senate Judiciary Committee’s Privacy, Technology, and the Law Subcommittee will hold a hearing titled “Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds” on 27 April.
  • On 27 April, the House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee will hold a hearing titled “The Consumer Protection and Recovery Act: Returning Money to Defrauded Consumers.”
  • On 27 April, the House Natural Resources Committee’s Water, Oceans, and Wildlife Subcommittee will hold a hearing titled “Wildlife Trafficking and the Growing Online Marketplace.”
  • On 28 April, the House Science, Space, and Technology Committee’s Research and Technology Subcommittee will hold a hearing titled “National Science Foundation: Advancing Research for the Future of U.S. Innovation.”
  • On 28 April, the Senate Commerce, Science, and Transportation Committee will mark up the following bills:
    • S.120, Safe Connections Act; Sponsors: Sens. Brian Schatz (D-HI), Deb Fischer (R-NE), Rick Scott (R-FL), Richard Blumenthal (D-CT), Jacky Rosen (D-NV), Shelley Moore Capito (R-WV)
    • S.163, Telecommunications Skilled Workforce Act; Sponsors: Sens. John Thune (R-SD), Jon Tester (D-MT), Gary Peters (D-MI), Roger Wicker (R-MS), Jerry Moran (R-KS)
    • S.198, Data Mapping to Save Moms’ Lives Act; Sponsors: Sens. Jacky Rosen (D-NV), Deb Fischer (R-NE), Todd Young (R-IN), Brian Schatz (D-HI), Ed Markey (D-MA), Richard Blumenthal (D-CT), Amy Klobuchar (D-MN), Gary Peters (D-MI)
    • S.326, Measuring the Economic Impact of Broadband Act; Sponsors: Sens. Amy Klobuchar (D-MN), Shelley Moore Capito (R-WV), Dan Sullivan (R-AK)
    • S.735, Advanced Technological Manufacturing Act; Sponsors: Sens. Roger Wicker (R-MS), Maria Cantwell (D-WA), Jacky Rosen (D-NV)
    • S.1260, Endless Frontier Act; Sponsors: Sens. Chuck Schumer (D-NY), Todd Young (R-IN)
  • On 28 April, the Senate Appropriations Committee’s Military Construction, Veterans Affairs, and Related Agencies Subcommittee will hold a hearing titled “VA Telehealth Program: Leveraging Recent Investments to Build Future Capacity.”
  • On 29 April, the Senate Armed Services Committee will hold open and closed hearings on worldwide threats.   
  • On 29 April, the Senate Commerce, Science, and Transportation Committee will consider the nomination of Eric Lander to be Director of the Office of Science and Technology Policy (OSTP).
  • The Federal Trade Commission (FTC) will hold a workshop titled “Bringing Dark Patterns to Light” on 29 April.
  • The Department of Commerce’s National Telecommunications and Information Administration (NTIA) will hold “a virtual meeting of a multistakeholder process on promoting software component transparency” on 29 April.
  • On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.

© Michael Kans, Michael Kans Blog and michaelkans.blog, 2019-2021. Unauthorized use and/or duplication of this material without express and written permission from this site’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Michael Kans, Michael Kans Blog, and michaelkans.blog with appropriate and specific direction to the original content.

Duranbah, Australia; Photo by Brandon Compagne on Unsplash
