EDPB Weighs In On Virtual Voice Assistants

The EU’s GDPR watchdog details its views on how the Siris, Alexas, and Assistants can stay on the right side of EU law.

The European Data Protection Board (EDPB) is asking for feedback on draft guidance on “virtual voice assistants” (VVA) like Apple’s Siri, Amazon’s Alexa, and Google’s Assistant, and how the General Data Protection Regulation (GDPR) and e-Privacy Directive apply to such services.

Twitter

Alexa, delete my personal data. Siri, I do not consent to Apple sharing my info. Google, why did you share my personal data with data brokers?

Cocktail Party

The EDPB is tackling the data protection aspects of the growing universe of VVAs present on smartphones, laptops, and dedicated devices such as Amazon's Echo and Google's competing smart speakers. Thus far, this is a corner of the technology world that has not endured much scrutiny, but as the use of voice-activated and voice-directed devices grows, one can expect regulators to pay closer attention.

Meeting

In its draft guidance, the EDPB is hoping to shape both how the European Union’s data protection authorities (DPA) will apply the GDPR and how technology companies will develop and utilize VVAs. Additionally, the EDPB is construing the e-Privacy Directive in the context of VVAs, as these devices are engaged in electronic communication. There is a bit of a wild card in that the EU is currently considering a rewrite of electronic communication privacy regulation in the long-awaited ePrivacy Regulation, which could render some of the EDPB’s guidance moot.

Nevertheless, the EDPB continues to be at the forefront of regulators in addressing new technological frontiers, even though VVAs have been widely used for some time now, which may be a commentary on the bandwidth, focus, and resources of regulators around the globe.

Geek Out

In “Guidelines 02/2021 on Virtual Voice Assistants,” the EDPB laid out its views on VVAs and how they can comply with EU law. It bears noting that while the EDPB’s views carry great weight, the EU DPAs that will actually enforce the GDPR may not agree and may read the law differently. Likewise, the Court of Justice of the European Union and other EU courts are not required to follow EDPB guidance. Having said that, until there is more case law or there are enforcement actions, those in the VVA field would be wise to heed the EDPB; failing to do so invites possible legal jeopardy.

As a threshold matter, the EDPB noted:

  • Recent technological advances have greatly increased the accuracy and popularity of VVA. Among other devices, VVAs have been integrated in smartphones, connected vehicles, smart speakers and smart TVs. This integration has given the VVAs access to information of an intimate nature that could, if not properly managed, harm the individuals’ rights to data protection and privacy. Consequently, VVAs and the devices integrating them have been under the scrutiny of different data protection authorities.
  • There are currently more than 3 billion smartphones and all of them have integrated VVAs, most of them switched on by default. Some of the most widespread operating systems in personal computers and laptops also integrate VVAs. The recent rise of smart speakers (147 million were sold in 2019) is bringing VVAs to millions of homes and offices. However, current VVA designs do not offer by default authentication or access control mechanisms.

Consequently, the EDPB feels compelled to establish guidance on VVAs since the vast majority of these devices collect and process the personal data of EU residents, implicating the GDPR and the e-Privacy Directive. The EDPB makes a number of recommendations to DPAs and VVA manufacturers to ensure current practices comply with EU law. Failing to heed these recommendations could leave a VVA maker or developer open to an enforcement action under the GDPR, with fines of up to €20 million or 4% of worldwide annual turnover, whichever is higher.
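To make the stakes concrete, the Article 83(5) ceiling is the higher of the two figures, so 4% of turnover is what binds for the largest VVA makers. A minimal worked example in Python, with purely illustrative turnover numbers:

```python
# Worked example of the GDPR Article 83(5) ceiling cited above: the maximum
# administrative fine is the HIGHER of EUR 20 million or 4% of total worldwide
# annual turnover. The turnover figures below are purely illustrative.

def max_gdpr_fine(worldwide_annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

print(max_gdpr_fine(5_000_000_000))  # large VVA maker: 200000000.0 (4% governs)
print(max_gdpr_fine(100_000_000))    # smaller developer: 20000000.0 (flat cap governs)
```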

The EDPB asserts the GDPR, first and foremost, applies to VVAs. Consequently, the large developers and manufacturers of VVAs will face the same responsibilities, obligations, and liability they currently face for virtually all of their other products in the EU. The EDPB goes further and reads the e-Privacy Directive’s definition of “terminal equipment” as including VVAs, which is not much of a leap given that smartphones, smart televisions, and a number of Internet of Things (IoT) devices have been found to be terminal equipment.

The EDPB begins with the possible legal bases for processing personal data and notes that consent under Article 6 is the legal ground most VVA providers will likely rely on. Undoubtedly, when people buy and start using a VVA, they are greeted with dense, lengthy, almost impenetrable agreements that include consent to voluminous data collection and processing. However, because VVAs are terminal equipment under the e-Privacy Directive, there are additional legal hurdles, namely those in Article 5(3) of the e-Privacy Directive that protect the confidentiality of communications. Access to these communications generally requires the person’s consent, but there are two significant exceptions. The EDPB claimed:

current VVAs require access to the voice data stored by the VVA device. Therefore, Article 5(3) e-Privacy Directive applies. The applicability of Article 5(3) e-Privacy Directive means that the storing of information as well as the accessing to information already stored in a VVA requires, as a rule, end-user’s prior consent but allows for two exceptions: first, carrying out or facilitating the transmission of a communication over an electronic communications network, or, second, as strictly necessary in order to provide an information society service explicitly requested by the subscriber or user.

The EDPB further explained the second aforementioned exception (i.e., fulfilling a user’s request for a service) and suggested that when a person uses someone else’s VVA, the data controller may process her personal data only to execute her requests and not for so-called “user profiling” (i.e., the collection, processing, and development of a profile that can then be used to target a person for advertising or to offer differential pricing, among many other applications), since consent cannot be attributed to a non-registered user. The EDPB stated:

The second exception (“strictly necessary in order to provide an information society service explicitly requested by the subscriber or user”) would allow a VVA service provider to process users’ data to execute users’ requests (see par. 72 in section 3.4.1) without the consent foreseen in Article 5(3) e-Privacy Directive. Conversely, such consent as required by Article 5(3) e-Privacy Directive would be necessary for the storing or gaining of access to information for any purpose other than executing users’ request (e.g. user profiling). Data controllers would need to attribute consent to specific users. Consequently, data controllers should only process non-registered users data to execute their requests.

The EDPB even contemplates a scenario in which a VVA records and possibly processes the voice or data of a person accidentally. One can envision a scenario where the owner of a VVA is using the device while there are background voices, and it is these and similar circumstances the EDPB is paying attention to. The EDPB expresses skepticism that accidental recording or “activation” could pass as consent under the GDPR and advises controllers to delete such personal data if there is no valid legal basis for processing in light of a likely lack of consent.

Moreover, the EDPB observed that voice data can constitute a special category of personal data: biometric personal data. This class of information requires more than consent alone to be processed: a controller must also “offer an alternative identification method to biometrics, with regard to the free nature of consent.” Controllers would also need “to make transparent where biometric identification is used and how voiceprints (biometric templates) are stored and propagated across devices.”

The EDPB engages in a more in-depth analysis and intersperses its recommendations throughout the document, many of which will require VVA makers and developers to spend more time and resources to ensure compliance with EU law:

  • When users are informed about the VVA’s processing of personal data through a user account’s privacy policy and the account is linked to other independent services (e.g. email or online purchases), the EDPB recommends that the privacy policy have a clearly separated section regarding the VVA’s processing of personal data.
  • The information provided to the user should match the exact collection and processing that is carried out. While some meta-information is contained in a voice sample (e.g. the speaker’s stress level), it is not automatically clear whether such analysis is performed. It is crucial that controllers are transparent about which specific aspects of the raw data they process.
  • Furthermore, it should at all times be apparent which state the VVA is in. Users should be able to determine whether a VVA is currently listening on its closed-loop circuit and especially whether it is streaming information to its back-end (a minimal sketch of such state signaling appears after this list). This information should also be accessible to people with disabilities such as colour blindness (daltonism) or deafness (anaccousia). Specific care should be given to the fact that VVAs suggest a usage scenario where eye contact with the device is not necessary, so all user feedback, including state changes, should be available in visual and acoustic form at least.
  • Particular consideration should be applied if devices allow adding third-party functionality (“apps” for VVAs). While some general information can be given to users when they are the ones adding such functionality (given that it is the user’s choice), during normal use of the device the boundaries between the various controllers involved can be much less clear; i.e., the user might not be sufficiently informed about how, by whom, and to what extent their data is processed in a specific query.
  • All information about processing based on data collected and derived from the processing of recorded voice should also be available to users according to Article 12 GDPR.
  • VVA controllers should make transparent what kind of information a VVA can derive about its surroundings, such as, but not limited to, other people in the room, music playing in the background, pets, and any processing of the voice for medical, marketing, or other reasons.
  • From a user’s perspective, the main purpose of processing their data is querying and receiving responses and/or triggering actions like playing music or turning on or off lights. After a query has been answered or a command executed, the personal data should be deleted unless the VVA designer or developer has a valid legal basis to retain them for a specific purpose.
  • Before considering anonymization as a means of fulfilling the data storage limitation principle, VVA providers and developers should check that the anonymization process renders the voice unidentifiable.
  • Configuration defaults should reflect these requirements by defaulting to an absolute minimum of stored user information. If these options are presented as part of a setup wizard, the default setting should reflect this, and all options should be presented as equal possibilities without visual discrimination.
  • When, during the review process, the VVA provider or developer detects a recording that originated in a mistaken activation, the recording and all the associated data should be immediately deleted and not used for any purpose (see the deletion sketch after this list).
  • VVA designers and application developers should provide secure state-of-the-art authentication procedures to users.
  • Human reviewers should always receive only the strictly necessary pseudonymised data (a point also illustrated in the sketch after this list). The legal agreements governing the review should expressly forbid any processing that could lead to the identification of the data subject.
  • If emergency calling is provided as a service through the VVA, a stable uptime should be guaranteed.
  • Voice templates should be generated, stored, and matched exclusively on the local device, not on remote servers (a local-only matching sketch follows this list).
  • Due to the sensitivity of voiceprints, standards such as ISO/IEC 24745 and biometric template protection techniques should be thoroughly applied.
  • If a VVA uses voice based biometric identification VVA providers should:
    • Ensure that the identification is accurate enough to reliably associate personal data to the right data subjects.
    • Ensure that the accuracy is similar for all user groups by checking that there is no substantial bias towards different demographic groups.
  • In order to avoid recording background voices and situational information, VVA service providers should apply automated background-noise filtering.
  • VVA designers should consider technologies that delete background noise and conversations, ensuring that only the user’s voice is recorded.
  • If voice messages are to be used to inform users according to Article 13, the data controllers should publish such messages on their website so they are accessible to the users and the data protection authorities.
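A few of these recommendations lend themselves to short illustrations. First, the state-signaling point: the sketch below is a hypothetical Python state machine, not anything from the guidelines. The device states, LED driver, and chime player are placeholder assumptions; the point is simply that every state change is surfaced on both a visual and an acoustic channel.

```python
# Hypothetical sketch of the "state must always be apparent" recommendation:
# every state change is surfaced both visually and acoustically, so users who
# cannot see the LED or cannot hear the chime still receive feedback. The
# hardware calls are placeholders, not a real device API.
from enum import Enum

class VvaState(Enum):
    IDLE = "idle"              # local wake-word detection only
    LISTENING = "listening"    # capturing audio on the closed-loop circuit
    STREAMING = "streaming"    # sending audio to the back-end

def set_led(color: str) -> None:
    print(f"[LED] {color}")        # stand-in for a device LED driver

def play_chime(name: str) -> None:
    print(f"[chime] {name}")       # stand-in for an audio cue

STATE_FEEDBACK = {
    VvaState.IDLE: ("off", "idle"),
    VvaState.LISTENING: ("blue", "listening-start"),
    VvaState.STREAMING: ("red", "streaming-start"),
}

def change_state(new_state: VvaState) -> None:
    color, chime = STATE_FEEDBACK[new_state]
    set_led(color)      # visual channel
    play_chime(chime)   # acoustic channel, for users who cannot see the LED

change_state(VvaState.LISTENING)
change_state(VvaState.STREAMING)
```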
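Second, the two review-related recommendations, deleting mistaken activations and pseudonymising data before human review, can be sketched together. Everything below is hypothetical: the in-memory stores, the field names, and the HMAC-based pseudonym are illustrative assumptions, not the EDPB’s or any vendor’s actual design.

```python
# Hypothetical sketch of two review-related recommendations: recordings flagged
# as mistaken activations are deleted along with all derived data, and human
# reviewers only ever see pseudonymised records. Stores, field names, and the
# HMAC-based pseudonym are illustrative assumptions.
import hashlib
import hmac
from dataclasses import dataclass

PSEUDONYM_KEY = b"rotate-me-regularly"  # hypothetical secret, kept from reviewers

@dataclass
class Recording:
    recording_id: str
    user_id: str
    audio: bytes
    transcript: str

recordings: dict[str, Recording] = {}  # primary store (illustrative)
derived_data: dict[str, dict] = {}     # data derived from each recording

def for_human_review(rec: Recording) -> dict:
    """Hand reviewers only the strictly necessary, pseudonymised fields."""
    pseudonym = hmac.new(PSEUDONYM_KEY, rec.user_id.encode(), hashlib.sha256).hexdigest()[:12]
    return {"recording_id": rec.recording_id, "speaker": pseudonym,
            "audio": rec.audio, "transcript": rec.transcript}

def handle_review_verdict(recording_id: str, mistaken_activation: bool) -> None:
    """On a mistaken activation, delete the recording and everything derived from it."""
    if mistaken_activation:
        recordings.pop(recording_id, None)
        derived_data.pop(recording_id, None)
```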
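Finally, the local-only voiceprint recommendation implies an architecture where enrollment and matching never leave the device. The minimal sketch below assumes a hypothetical on-device speaker-embedding model (the embed() function is a stand-in) and an illustrative local storage path; nothing here reflects an actual VVA implementation.

```python
# Hypothetical sketch of the "local-only voiceprint" recommendation: templates
# are generated, stored, and matched on the device, and nothing is sent to a
# remote server. The embed() function is a stand-in for a real on-device
# speaker-embedding model, and the storage path is illustrative.
import json
from pathlib import Path

import numpy as np

TEMPLATE_STORE = Path("voiceprints.json")  # local device storage only

def embed(audio_samples: np.ndarray) -> np.ndarray:
    """Stand-in for an on-device speaker-embedding model (256-dim vector)."""
    seed = abs(hash(audio_samples.tobytes())) % (2**32)
    return np.random.default_rng(seed).standard_normal(256)

def enroll(user_id: str, audio_samples: np.ndarray) -> None:
    store = json.loads(TEMPLATE_STORE.read_text()) if TEMPLATE_STORE.exists() else {}
    store[user_id] = embed(audio_samples).tolist()
    TEMPLATE_STORE.write_text(json.dumps(store))  # never transmitted off-device

def identify(audio_samples: np.ndarray, threshold: float = 0.8) -> str | None:
    """Cosine-match a probe against locally stored templates; None if no match."""
    if not TEMPLATE_STORE.exists():
        return None
    probe = embed(audio_samples)
    best_user, best_score = None, threshold
    for user_id, template in json.loads(TEMPLATE_STORE.read_text()).items():
        t = np.asarray(template)
        score = float(probe @ t / (np.linalg.norm(probe) * np.linalg.norm(t)))
        if score > best_score:
            best_user, best_score = user_id, score
    return best_user
```

In a real design, biometric template protection per ISO/IEC 24745 (e.g., storing only transformed, revocable templates rather than raw embeddings) would sit between embed() and the local store.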

