As Voice Recognition Technology Market Surges, Organizations Face Privacy and Cybersecurity Concerns

A new report released last month by Global Market Insights, Inc. estimates that the global market for voice recognition technology will reach approximately $7 billion by 2026, in large part due to the surge of AI and machine learning across a wide array of devices, including smartphones, healthcare apps, banking apps, and connected cars, to name a few. Whether performing a quick hands-free search on a phone or issuing a voice command while driving, voice recognition technology has made consumer use nearly effortless. Particularly in the wake of the COVID-19 pandemic, companies that may never have considered voice recognition technology are now rethinking their employee access control systems and considering touchless authorization technologies, like voice recognition, as the main form of entry into their workspaces, as opposed to fingerprint scanners or keypads that increase the risk of spreading germs or viruses. 

But while the ease and efficiency of voice recognition technology are clear, the privacy and security obligations associated with this technology cannot be overlooked. Voice recognition is generally classified as a biometric technology, one that identifies a person by a unique human characteristic (e.g., voice, speech, gait, fingerprints, or iris or retina patterns). As a result, voice-related data qualifies as biometric information, and in turn as personal information, under various privacy and security laws. For businesses that want to deploy voice recognition technology, whether for use by their employees to access systems or when manufacturing a smart device for consumers or patients, there are a number of privacy and security compliance obligations to consider. Here are just a few: 

  • EU’s General Data Protection Regulation (GDPR)  
    • The GDPR, effective since May 2018, classifies “voice” as “personal data”. While GDPR Article 4(1), which defines “personal data”, does not specifically refer to “voice” but rather to “one or several properties unique to their physical, physiological identity…”, the European Data Protection Board has taken the position that “voice recognition” is an example of a physical or physiological biometric identification technique. Businesses that process the personal data of data subjects in the EU must honor an array of rights granted to those data subjects (e.g., the right to access and the right to delete), and the GDPR imposes significant privacy and security obligations on the controllers and processors of that data. 
  • California Consumer Privacy Act (CCPA)  
    • The recently enacted California Consumer Privacy Act (CCPA) may apply to a business that collects the personal data of a California resident, regardless of whether the organization is located in California. Under the Act, a covered business must provide a resident with information about its data collection practices, including the personal information it collects, discloses, and sells, as well as the right to delete this data and to object to its sale. Notably, the Act prohibits an individual from waiving these rights. The CCPA includes “biometric information” as an enumerated category of “personal information,” and the Act’s definition states that “[b]iometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted”. 
  • Biometric Information Privacy Act (BIPA)  
    • The BIPA sets forth a comprehensive set of rules for companies doing business in Illinois when collecting biometric identifiers or information of state residents. The BIPA has several key features: 
      • Informed consent prior to collection 
      • Limited right of disclosure of biometric information 
      • Written policy requirement addressing retention and data destruction guidelines 
      • Prohibition on profiting from biometric data 
      The definition of “biometric identifiers” under the BIPA includes a “voiceprint” (using voice to verify an individual’s identity). Voiceprinting has been the subject of significant BIPA litigation of late, particularly in the context of virtual assistants. While these cases have been dismissed for reasons unrelated to voiceprinting itself (e.g., lack of personal jurisdiction), as plaintiffs continue to expand the scope of BIPA targets, companies utilizing voiceprinting will increasingly face exposure to BIPA litigation. 
  • Children’s Online Privacy Protection Act (COPPA)  
    • Under COPPA there are strict consent requirements for the collection and storage of data of children under 13. That said, in 2017 the Federal Trade Commission issued guidance on COPPA in the context of voice recordings, relaxing the rule somewhat: “The Commission recognizes the value of using voice as a replacement for written words in performing search and other functions on internet-connected devices. Verbal commands may be a necessity for certain consumers, including children who have not yet learned to write or the disabled… as such when a covered operator collects an audio file containing a child’s voice solely as a replacement for written words, such as to perform a search or fulfill a verbal instruction or request, but only maintains the file for the brief time necessary for that purpose, the FTC would not take an enforcement action against the operator on the basis that the operator collected the audio file without first obtaining verifiable parental consent. Such an operator, however, must provide the notice required by the COPPA Rule, including clear notice of its collection and use of audio files and its deletion policy, in its privacy policy.” While the FTC has, to date, not brought any COPPA enforcement actions in the context of voice recordings, the Rule’s requirements should not be ignored. 
  • State Statutory and Common Law Mandates to Safeguard Personal Data  
    • Multiple states impose an affirmative duty to use reasonable measures to safeguard personal data that an organization collects or owns, which increasingly includes biometric information. The applicability of these laws may depend on the location of the organization’s facilities and the consumer’s, employee’s, or patient’s state of residency. Many of these safeguarding laws provide a general framework for compliance, without mandating specific measures. However, “reasonable” generally implies safeguards appropriate to the sensitivity of the data, and one need only look to more robust data security frameworks, such as HIPAA and the Massachusetts data security regulations, to get a sense of what safeguards may be appropriate. These statutory duties to safeguard are also driving increased contractual data protection obligations between businesses that exchange personal information in carrying out the terms of an agreement. At the same time, some courts have identified common law duties to safeguard personal data. 
  • State Mandates Regarding Data Destruction and Disposal  
    • Currently, more than thirty states have data destruction and disposal laws. These laws require taking reasonable steps to securely dispose of records containing personal information by shredding, erasing, or other methods. States such as Massachusetts include biometric information as a category of personal information subject to these requirements. As part of meaningful data destruction practices, organizations should also implement a data retention schedule that ensures the destruction of biometric information, including voiceprints, once it is no longer needed. 
  • State Data Breach Notification Laws  
    • All fifty U.S. states have data breach notification laws. In general, these laws require an entity that owns or licenses personal information about a state resident to report a data breach to individuals whose personal information is affected and, in some cases, to the state attorney general or other agencies, the media, and credit reporting agencies. Each state has its own definition of personal information, and states such as California, Texas, Florida, and Arizona include health, medical, and/or biometric information. Unauthorized acquisition of or access to such personal information, whether by hackers or employee error, can require notifications to individuals, creating significant exposure and reputational harm for the organization. Perhaps a greater concern from such a compromise is the exfiltration of voiceprint data that hackers could use as credentials to access other user accounts. 
  • Vendor Contract Statutes  
    • An increasing number of states, including California, Massachusetts, New York, and Oregon, statutorily require a business to conduct due diligence before sharing or disclosing certain categories of personal information, which likely include biometric information, with a third-party service provider. Many of these statutes also require the business to contractually obligate the vendor to maintain safeguards appropriate to the sensitivity of the data, which is a good practice even if a written agreement is not mandated by the statute. 


Voice recognition technology is booming, and it continues to reach facets of life that were hard to contemplate even a few years ago. The technology brings innumerable potential benefits as well as significant data privacy and cybersecurity risks. Organizations that collect, use, and store voice data increasingly face compliance obligations as the law attempts to keep pace with technology, cybersecurity crime, and public awareness of data privacy and security. Creating a robust data protection program, or regularly reviewing an existing one, is a critical risk management and legal compliance step. 

This article is republished with permission of the firm’s Workplace Privacy, Data Management & Security Report.