
Last week, Q5id was part of an International Association of Privacy Professionals (IAPP) panel discussing biometrics and privacy from a global perspective. Panelists included French and German lawyers versed in GDPR regulations, as well as US lawyers accustomed to CCPA, BIPA, and other state-specific types of privacy law.

The benefit of this varied panel was a robust discussion around privacy protections and incorporating biometrics as both a protected set of data as well as a method of protecting data.

What Situations Are Biometrics Unsuited For?

The majority of regulations and public discourse around biometrics tends to center on facial recognition, for valid reasons. Common concerns with facial recognition include surveillance, misidentification, and other abuses that can cause significant hardship or ruin the lives of those incorrectly identified.

Part of the problem is that many algorithms used for facial recognition have been documented to be biased against people of color and/or women. While top-tier algorithms have improved significantly in this regard in recent years, with the best reaching a 99% accuracy rate regardless of race or gender, the question remains whether law enforcement and government agencies are actually using those best-in-class algorithms. How would we know?

Algorithms aside, there are also concerns over how ethical it is to use facial recognition technology on the general public without their knowledge or consent in the first place. China’s use of facial recognition to monitor the general populace is widely considered invasive. Avoiding a similar invasive creep into our personal lives and privacy tends to be at the heart of many laws and restrictions around biometric technology. Public opinion, in general, tends to favor protecting privacy first, and in the EU, this is backed up by GDPR and similar laws. In the US, laws such as CCPA and BIPA have been passed in specific states, and more states each year are passing new laws protecting privacy.

As a result, one way to navigate the issue is to avoid using biometrics altogether as a method for tracking or searching for individuals. Mass surveillance using biometrics is exactly the kind of intense scrutiny of our day-to-day activities that we want to avoid.

Gray Areas for Biometric Use

While mass surveillance and monitoring of the general populace are widely undesirable, what about finding someone in a crowd? Are there situations where biometrics can or should be used to identify one person among many?

The danger arises when this type of identification produces a false positive and an innocent person is arrested. When law enforcement or government agencies use it to enforce laws, the negative impact of a false positive is far greater than in a typical enterprise application: being arrested has severe, long-lasting consequences, while being denied access to a business system is, at worst, inconvenient.

False Positive: When the system declares a positive match incorrectly, i.e., the system declares a face matches an enrolled image, but they are not the same person.

False Negative: When the system declares no match exists, but there is one; i.e., the system says a face does not match an enrolled image, but the image and face are the same person.

While falsely identifying someone as a suspect for arrest is unacceptable, the opposite is likely true in the case of a missing person. If your child were missing, which scenario seems more appealing: multiple possible children who match yours are identified, including your child, or none are identified, including your child?

That is the crux of false positive/false negative calibration, and of why context matters. When the repercussions of identifying someone incorrectly are severe or life-altering, matches should be calibrated to favor false negatives over false positives. Conversely, when the repercussions are a mild inconvenience, such as checking with families at a mall whose children resemble a missing child, favoring false positives is more likely to produce the best outcome.
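This calibration trade-off can be sketched with a small hypothetical example. The scores and the two thresholds below are invented for illustration; real systems tune thresholds against measured score distributions:

```python
# Hypothetical sketch: how a match-score threshold trades false positives
# against false negatives. All scores below are invented for illustration.

def error_rates(genuine_scores, impostor_scores, threshold):
    """Return (false_positive_rate, false_negative_rate) at a threshold.

    A comparison is declared a match when its score >= threshold.
    """
    # False positive: a different-person comparison scores above the threshold.
    fp = sum(1 for s in impostor_scores if s >= threshold)
    # False negative: a same-person comparison scores below the threshold.
    fn = sum(1 for s in genuine_scores if s < threshold)
    return fp / len(impostor_scores), fn / len(genuine_scores)

genuine = [0.91, 0.88, 0.95, 0.79, 0.85, 0.93]   # same-person comparisons
impostor = [0.40, 0.62, 0.55, 0.71, 0.33, 0.48]  # different-person comparisons

# A strict threshold favors false negatives, appropriate for arrests:
print(error_rates(genuine, impostor, 0.80))
# A lenient threshold favors false positives, appropriate for a
# missing-child search where missing the real match is the worse outcome:
print(error_rates(genuine, impostor, 0.60))
```

Raising the threshold drives the false positive rate toward zero at the cost of more missed genuine matches, and vice versa; which side to favor depends entirely on the cost of each kind of error.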

Ideally, the search for a missing child or loved one would rely on local community efforts for a speedy, positive resolution, rather than on combing security footage for a potentially matching face. Utilizing a network of proven identities to help search for someone missing both improves the intelligence of the search and is far less intrusive than mass scanning of security footage.

When is it Ideal to Use Biometrics?

While biometrics as a technology has potential issues, it can also solve significant challenges in technology and cybersecurity. Identity and access management is an entire field within the technology industry, centered on ensuring that only the people who should have access actually get it.

In situations where access needs to be gated, such as to protect sensitive health data, personally identifiable information, or critical infrastructure elements, the use of biometrics as a highly accurate identity verification tool makes sense.

Secure identity proofing that relies on biometrics, particularly a variety of them, adds a level of security to access control systems that can prevent or significantly reduce risks from phishing and credential theft. By removing usernames and passwords, as Q5id does, organizations can guard their systems against the biggest risk in any digital system: the human factor. Even when users are trained and on guard, phishing scams continue to grow more sophisticated. By requiring at least one method of biometric identification (or better yet, two), organizations can improve their cybersecurity posture with little to no additional friction in the user's login flow.
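As a rough sketch, a passwordless policy of this kind might look like the following. The factor names and the policy shape are illustrative assumptions, not Q5id's actual API:

```python
# Illustrative passwordless access policy: grant access only when enough
# distinct biometric factors have been verified. Factor names are invented.
BIOMETRIC_FACTORS = {"face", "voice", "palm", "fingerprint"}

def access_granted(verified_factors, required_biometrics=2):
    """True only if at least `required_biometrics` distinct biometric
    factors were verified; knowledge factors like passwords never count."""
    biometrics = set(verified_factors) & BIOMETRIC_FACTORS
    return len(biometrics) >= required_biometrics

print(access_granted({"password"}))       # a password alone is never enough
print(access_granted({"face"}))           # one biometric: denied under a two-factor policy
print(access_granted({"face", "voice"}))  # two distinct biometrics: granted
```

The point of the design is that a stolen or phished credential is useless on its own, since nothing a user can type satisfies the policy.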

When a proven identity is centrally managed by its owner, it can become a useful way for individuals to prove themselves in high-stakes transactions. Biometrics is also a secure, accurate way to identify people joining a service or community, such as the network of searchers in the missing-child example above. If you were a parent relying on a local network of good Samaritans to help find your child, wouldn't you feel more confident if everyone helping in the search had their identity proven and verified before they could see any information about your child?

Where biometrics serve to improve security, rather than being collected or processed simply for their own sake, they make an ideal addition to your business's technology toolkit. Using biometrics to guard sensitive data and information can become a significant benefit, demonstrating how seriously your company takes data protection.

Biometrics and Privacy Can Coexist

There are ways to utilize the security benefits of biometric authentication while also protecting the privacy of the users in your system.

As we’ve discussed before, there are a handful of key best practices that help protect data.

You’ll want to detach any personally identifiable information from the biometrics being gathered, which means storing the PII separately from any biometric data required to be kept on hand.

If you’re legally permitted, delete the original images taken at enrollment for calculating the algorithmic template. This ensures you’re not keeping any potentially identifiable information unnecessarily.

Have a robust internal cybersecurity policy, and ensure that any technology partners you work with have at least SOC 2 compliance or better.

And last but not least, to ensure that you are utilizing biometrics with privacy by design, use biometrics only to authenticate or verify identities. It should go without saying that you can't use biometrics for surveillance and expect to also protect privacy.
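The first two practices above, separating PII from biometric data and discarding enrollment images once a template exists, can be sketched as follows. The store names, the token scheme, and `compute_template` are hypothetical stand-ins, not any real product's design:

```python
import secrets

# Illustrative sketch: PII and biometric templates live in separate stores,
# linked only by a random, opaque token, so a breach of either store alone
# never pairs a person's name with their biometric data. The original
# enrollment image is discarded once the template has been computed.
pii_store = {}       # token -> personally identifiable information
template_store = {}  # token -> biometric template only, no PII

def compute_template(image_bytes):
    # Stand-in for a real template-extraction algorithm.
    return b"template:" + bytes([sum(image_bytes) % 256])

def enroll(name, email, image_bytes):
    """Store PII and the biometric template separately; return the link token."""
    token = secrets.token_hex(16)  # random and meaningless on its own
    pii_store[token] = {"name": name, "email": email}
    template_store[token] = compute_template(image_bytes)
    del image_bytes  # where legally permitted, drop the original image
    return token

token = enroll("Jane Doe", "jane@example.com", b"\x10\x2f\x33")
```

Because the token carries no meaning by itself, re-associating a template with a person requires access to both stores, which can be secured and audited independently.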

If you’d like to discuss how to use biometrics in your organization in a privacy-minded fashion, set up a time to talk with our team by emailing us at

