Face/Off is a 1997 Academy Award-nominated movie about cybersecurity. Well, technically, it is about an FBI agent (played by John Travolta) and a terrorist (played by Nicolas Cage) swapping faces. Cage’s character plants an explosive and Travolta’s character needs to find it, so the FBI unceremoniously removes Cage’s face while he is unconscious and transplants it onto Travolta’s body. Travolta (with Cage’s face) then attempts to fool Cage’s accomplice into disclosing the bomb’s location. Cage, in the meantime, wakes up and steals Travolta’s unguarded face. Spoiler alert: confusion ensues. It is everything you want in a late ’90s blockbuster.
The action in Face/Off is exactly like a cyberattack. Well, not exactly, but it is comparable. Many cyberattacks involve bad guys stealing user credentials (e.g., username and password) to fool applications into allowing unauthorized access to sensitive information. Faces, though, are tougher to steal than a username and password.
Following Face/Off’s logic, companies are relying more on biometric indicators, like a fingerprint or face, to authenticate users for critical applications. That heightened security, however, comes with a host of legal complexities that Face/Off neglected to address. This article will introduce methods of authenticating with biometrics, sketch the relevant legal landscape, and ultimately suggest a risk-averse approach to using these tools.
I. What are Biometrics?
Biometrics are measurements of an individual’s physical and/or behavioral characteristics. They can be used for many purposes. For example: fingerprint scans may initiate time clocks to prevent workers from prematurely clocking in their friends; smart speakers can recognize individual voices to respond accurately; fitness apps monitor heart rate to track user progress; and thermal imaging can gauge body temperature to detect symptoms and help avert a health crisis. This Section provides a brief summary of three methods of authenticating with biometrics.
A. Fingerprint Scanners
Fingerprint analysis is an age-old method of authentication. People leave fingerprints everywhere, and police have used fingerprint matching systems for decades. Scanners introduced automation to this process. For authentication, scanners match a user’s fingerprint with an image already registered on a particular device, thereby validating the user before granting access to his or her device or application.
“Touch ID” for Apple’s iPhone, iPad, and MacBook Pro utilizes fingerprint scanners for this purpose. The result is heightened security and increased efficiency compared to a password alone. According to Apple, sensors take high-resolution images of each user’s fingerprint, the device creates a mathematical representation of that biometric indicator, and the device then matches that representation against the user’s registered fingerprint data. Devices are then unlocked for use.
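Apple does not publish its matching algorithm, but the enroll-then-compare flow described above can be sketched in a few lines of Python. This is a simplified illustration under stated assumptions: the feature extractor, threshold value, and function names below are hypothetical stand-ins, not Apple’s actual implementation.

```python
import math

def extract_features(scan: list[float]) -> list[float]:
    """Stand-in for the sensor pipeline: reduce a raw scan to a
    fixed-length mathematical representation (a feature vector).
    Real systems use minutiae maps or learned embeddings."""
    norm = math.sqrt(sum(x * x for x in scan)) or 1.0
    return [x / norm for x in scan]  # normalize so comparison is scale-free

def matches(template: list[float], candidate: list[float],
            threshold: float = 0.95) -> bool:
    """Authenticate by *similarity* to the enrolled template, never by
    exact equality -- two scans of the same finger are never identical."""
    similarity = sum(t * c for t, c in zip(template, candidate))
    return similarity >= threshold

# Enrollment: store only the derived representation, not the raw image.
enrolled = extract_features([0.2, 0.8, 0.5, 0.1])
# Later unlock attempt: a slightly different scan of the same finger.
attempt = extract_features([0.21, 0.79, 0.5, 0.12])
print(matches(enrolled, attempt))  # similar scans clear the threshold
```

The design point worth noticing is that matching is probabilistic: the threshold trades convenience (accepting noisy rescans of the right finger) against security (rejecting similar-looking impostors).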
Fingerprint scanners are not perfect. Optical scanners take two-dimensional images of a fingerprint, which can be spoofed by simply printing an image of the valid fingerprint. Advanced scanners relying on electrical conductivity and ultrasonic waves are more secure. However, accurate three-dimensional printed fingerprint replicas might beat ultrasonic sensors, and dirty surfaces reduce their overall effectiveness. Additional concerns arise because law enforcement in some jurisdictions can force criminal defendants to open a locked smartphone protected by a fingerprint scanner.
B. Facial Recognition
Facial recognition appears to be more secure than fingerprint scanning. While both use images to match biometric indicators to an enrolled user, studies show that the False Acceptance Rate for facial recognition is around 1/1,000,000, as opposed to 1/50,000 for fingerprint scanners. One reason is that facial recognition generally extracts more data points. Facial geometry (the relative positioning, size, and shape of the eyes, nose, and mouth, and the distances measured between those features) allows for greater accuracy when comparing subjects.
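The cited rates can be made concrete with back-of-the-envelope arithmetic. A minimal sketch, using the rates quoted above and an arbitrary, purely illustrative number of impostor attempts:

```python
FACE_FAR = 1 / 1_000_000   # False Acceptance Rate cited for facial recognition
FINGER_FAR = 1 / 50_000    # False Acceptance Rate cited for fingerprint scanners

attempts = 1_000_000  # hypothetical impostor unlock attempts

# Expected number of impostor attempts wrongly accepted at each rate.
expected_face = FACE_FAR * attempts      # 1 expected false accept
expected_finger = FINGER_FAR * attempts  # 20 expected false accepts

print(expected_face, expected_finger)
```

In other words, at these published rates a fingerprint scanner would be expected to wrongly admit an impostor roughly twenty times as often as facial recognition over the same number of attempts.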
Again, this method of authentication is not flawless. Facial recognition has been defeated by a high-resolution photo, a three-dimensional printed face cast, and, as shown in Face/Off, a highly experimental surgical procedure. Obtaining such precise data may be a significant practical barrier for a would-be attacker. Additional “liveness” checks, which verify that a user’s eyes are open and directed toward the device, increase security against photos or casts. Plus, the science behind face transplantation is not as advanced as the creators of Face/Off would have you believe.
Facial recognition has its own unique issues. For one, a data breach could mean each user’s face is released to criminals. That happened when the Department of Homeland Security was recently hacked, compromising approximately 184,000 images. That breach highlights an important contrast among services handling biometric data. DHS aggregated its data in a central database vulnerable to attack, but best practice is to store sensitive data on the user’s device, similar to Apple’s Touch ID (discussed above) and “Face ID” features.
Racial bias is another major concern. Exam proctoring software widely used during the COVID-19 pandemic has been criticized for facial recognition technology that allegedly fails to recognize people of color. Indeed, independent research found that leading facial recognition software had a 34% higher error rate when used on darker-skinned females as opposed to lighter-skinned males. Companies cannot rely on software that simply fails a vital segment of the workforce.
C. Vein Scanners
Vein scanners are a relatively new form of biometric authentication. This technology harnesses infrared light to see beneath a user’s skin and verify unique vein patterns. Wax molds might work to fool a vein scanner. Once more, though, the practical hacker-hurdle is high given limited real-world access to an individual’s veins (unless he or she is unconscious like Cage in Face/Off).
Amazon is raising the bar by combining fingerprint and vein scan technologies. Adding a “liveness” check for pulse, electric conductivity, or eye movement would provide even stronger protection. Writers for the impending Face/Off reboot should take this technology into consideration.
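Layering factors this way pays off because an impostor must defeat every check at once. A rough sketch of the math, assuming the factors fail independently (a strong, simplifying assumption) and using one rate cited earlier plus a purely hypothetical vein-scan rate:

```python
def combined_far(*rates: float) -> float:
    """False Acceptance Rate of an AND-combined system: an impostor must
    defeat every factor, so (assuming the factors fail independently)
    the individual rates multiply."""
    result = 1.0
    for r in rates:
        result *= r
    return result

finger_far = 1 / 50_000   # fingerprint FAR cited earlier in this article
vein_far = 1 / 100_000    # hypothetical vein-scan FAR, for illustration only

# Roughly 2e-10: about one false accept in five billion impostor attempts.
print(combined_far(finger_far, vein_far))
```

The independence assumption is the weak point in practice: if one stolen artifact (say, a detailed hand mold) defeats both sensors at once, the combined rate is far worse than the product suggests, which is why an orthogonal liveness check adds real value.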
II. The Law that Face/Off Overlooked
Various laws worldwide attempt to protect against unauthorized access to and use of biometrics. Odds are that only a subset apply to any particular use case. Nevertheless, it is worth reviewing some core legislation to provide a holistic overview no matter where collection and processing occur.
A. Would Face/Off have violated GDPR?
The European Union’s General Data Protection Regulation (“GDPR”) applies generally to the processing of personal data, with higher standards for “special categories,” including biometrics. Its detailed provisions are treated by many as the baseline for protecting personal information. Any company doing business in the EU is well aware of that law’s costly sanctions. A single transgression may result in a fine equaling the greater of €20 million or four percent of the offender’s global annual revenue.
The GDPR generally prohibits collecting and using biometrics without the subject’s explicit consent. Consent is defined by the GDPR as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes.” Explicit consent is not defined in the GDPR, but published guidance deduces that explicit consent requires the subject to “give an express statement of consent,” such as a writing, a click-to-accept form, an email, or an electronic signature. It is safe to say that the antics in Face/Off would have violated the GDPR, given that neither Travolta nor Cage consented to the face swap.
B. Would Face/Off Have Fared Better Under U.S. Law?
Readers may be thinking, “Good thing Face/Off took place in the United States, given the GDPR’s strict requirements.” Not so fast. Face/Off’s slipshod approach to face-swapping would have violated U.S. law too. Below are a few notable U.S. laws governing biometrics:
- The Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) does not specifically address biometrics. Rather, biometrics generally fall under the definition of “Individually Identifiable Health Information” when collected for healthcare purposes by a “covered entity” (health plans, health care providers, and their “business associates”). Such information must be protected and may only be disclosed for limited purposes, including with written consent for any use unrelated to treatment, payment, or health care operations.
- The California Consumer Privacy Act (“CCPA”) is broader than HIPAA because it expressly lists biometrics as “personal information” and impacts more than just “covered entities.” It also gives California residents rights similar to those Europeans enjoy under the GDPR, except without added restrictions around using biometrics.
- California recently enacted the California Privacy Rights Act (“CPRA”), in part to enhance protections for biometrics classified as “sensitive personal information.” The CPRA differs from the GDPR in two material aspects related to biometrics: (A) the CPRA only requires informed consent, not explicit consent, and (B) the CPRA has more avenues for consumers to limit unauthorized use and disclosure.
- Illinois’s Biometric Information Privacy Act (“BIPA”) is the foundation of most current biometric-related litigation. The BIPA requires informed consent by written release prior to the collection of biometric information, mandates written policies to protect such data, prohibits profiting from it, and, arguably most importantly, subjects violators to statutory damages. That last feature sets the BIPA apart because individuals do not need to prove they suffered actual damages if a company is found to misuse their biometric data. Instead, individuals may collect $1,000 per negligent violation and $5,000 per intentional or reckless violation, plus reasonable attorney fees and costs. Potential exposure can be massive: Facebook recently settled a BIPA case for $650 million.
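BIPA’s statutory-damages math explains why exposure balloons so quickly in class actions. A rough calculator using the per-violation figures quoted above; note that courts differ on how violations are counted, so this sketch assumes, for illustration only, one violation per affected individual:

```python
NEGLIGENT_DAMAGES = 1_000  # statutory damages per negligent violation
RECKLESS_DAMAGES = 5_000   # per intentional or reckless violation

def bipa_exposure(individuals: int, reckless: bool = False) -> int:
    """Floor on potential statutory damages under the one-violation-per-person
    assumption, before adding attorney fees and costs."""
    per_violation = RECKLESS_DAMAGES if reckless else NEGLIGENT_DAMAGES
    return individuals * per_violation

# Even a modest 10,000-person class produces eight-figure exposure:
print(f"${bipa_exposure(10_000):,}")        # $10,000,000 (negligent tier)
print(f"${bipa_exposure(10_000, True):,}")  # $50,000,000 (reckless tier)
```

Scale that arithmetic to a user base the size of Facebook’s and the $650 million settlement figure starts to look conservative.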
III. What Can Face/Off Teach Us About Cybersecurity?
Face/Off is full of cybersecurity lessons. One is the importance of protecting user credentials. Travolta’s face was left unprotected, so anyone, including his arch nemesis, could steal it. The resulting mayhem taught another lesson. Biometric authenticators are strong, but not without weakness. Travolta’s wife eventually figured out Cage was not her husband by recognizing that he lacked Travolta’s mannerisms, habits, and detailed personal memories (think mother’s maiden name).
The film further teaches a masterclass in misconduct. Failing to secure the faces was an obvious violation. Nevertheless, Face/Off’s fundamental legal problem is lack of consent. Cage’s face was taken while unconscious. Travolta’s face was stolen. Neither had a meaningful opportunity to give consent at all, much less informed or explicit consent.
Consent is key to compliant biometric authentication. For example: the CPRA requires informed consent, the BIPA requires a written release, and the GDPR requires explicit consent. Prudent companies looking to secure an enterprise with biometric authentication tools must understand Face/Off’s fatal flaw and strive for informed consent evidenced in writing. That would satisfy the consent requirements found in each law recited above. This raises the question: what constitutes informed consent?
In the EU, published guidance dictates that, at a minimum, informed consent requires identifying the company requesting biometric authentication, the reason why consent is sought, details on the data being collected, the existence of the right to withdraw consent, and potential risks. Providing such detailed information is critical. Vigilant EU regulators fined Google €50 million because users were not adequately informed about processing their personal data.
U.S. biometric law is not as strict as the EU guidance. Vigil v. Take-Two Interactive Software, Inc., 235 F. Supp. 3d 499 (S.D.N.Y. 2017) is instructive. There, plaintiffs sued a videogame maker alleging improper consent under the BIPA when the product scanned their faces to create an in-game avatar with matching features. Before initiating the scan, users were required to click-accept the following statement with a link to the applicable terms and conditions: “Your face scan will be visible to you and others you play with and may be recorded or screen captured during gameplay. By proceeding you agree and consent to such uses and other uses pursuant to the End User License Agreement.” The court dismissed the case partly because it was implausible for plaintiffs to click-accept the terms without “understand[ing] that their faces would be scanned, and that those face scans would be used to create personalized avatars.”
Readers can now imagine Face/Off if it were written by lawyers. Travolta and Cage are lying on hospital beds next to each other when a dapper attorney approaches, handing each a clipboard and pen. They wait patiently as each character carefully studies the release. Then, almost simultaneously, Travolta and Cage respectfully decline. The movie ends without confusion or mayhem. Pretty boring. So are most things that lawyers write. And that is fine. Companies looking to utilize biometrics for authentication should want boring. Boring can be good. Boring is certainly preferred over a cyberattack, government investigation, or lawsuit.
Mike Serra is product counsel at Cisco Systems, Inc. focusing his practice on intellectual property, data privacy, and general corporate matters for Cisco’s cloud-based cyber security offerings. The opinions expressed in this blog are his own views and not those of Cisco. Mike would like to thank his colleagues Raj Dhaliwal and Jeremy Erickson for providing subject matter expertise, intern Christy Bonner for cite checking, and the YJOLT’s Editor-in-Chief, Ben Rashkovich, for his support with this project.
 For Sound Effects Editing. Did you think it was Best Picture?
 Joe Lazzarotti and Nadine C. Abrahams, Illinois Biometric Information Privacy Act FAQs, Jackson Lewis, https://www.jacksonlewis.com/sites/default/files/docs/IllinoisBiometrics… (last visited March 12, 2021).
 About Touch ID Advanced Security Technology, Apple Support (Sept. 11, 2017), https://support.apple.com/en-us/HT204587#:~:text=On%20iPhone%20and%20iPa….
 Some readers may argue that converting biometric data to a cryptographic key may avoid biometric regulations. The author asserts that such arguments are misplaced in circumstances when those keys can be traced back to an identifiable individual. See Frank Ready, Despite Patchwork Regulatory Landscape Companies Aren’t Backing Away From Biometric Identifiers, Law.com: Corporate Counsel (Sept. 17, 2020, 3:09 PM), https://www.law.com/corpcounsel/2020/09/17/despite-patchwork-regulatory-…
 Jeremy Erickson, The Good and Bad of Biometrics, Duo Security (Mar. 11, 2020), https://duo.com/labs/research/the-good-and-bad-of-biometrics.
 See State v. Diamond, 905 N.W.2d 870 (Minn. 2018).
 Erickson, supra note 6.
 False Acceptance Rate (FAR) is the percentage of identification attempts in which an unauthorized person is incorrectly accepted.
 Brendon Wilson, Worried about Face ID?, brendonwilson.com (Oct. 3, 2017), https://www.brendonwilson.com/blog/2017/10/03/worried-about-face-id/.
 Erickson, supra note 6.
 Tomas Foltyn, Face Unlock on Many Android Smartphones Falls for a Photo, welivesecurity.com (Jan. 10, 2019), https://www.welivesecurity.com/2019/01/10/face-unlock-many-android-smart….
 Andy Greenberg, Hackers just broke the iPhone X’s Face ID using a 3D-printed mask, Wired (Nov. 13, 2017), https://www.wired.co.uk/article/hackers-trick-apple-iphone-x-face-id-3d-….
 Erickson, supra note 6.
 Review of CBP’s Major Cybersecurity Incident during a 2019 Biometric Pilot, Dept. of Homeland Sec. Office of Inspector General, 1, 6 (Sept. 21, 2020), https://www.oig.dhs.gov/sites/default/files/assets/2020-09/OIG-20-71-Sep….
 See About Face ID Advanced Technology, Apple Support (Feb. 26, 2020), https://support.apple.com/en-us/HT208108 (“Face ID data doesn’t leave your device and is never backed up to iCloud or anywhere else.”).
 See generally Irina Ivanova, Why Face-Recognition Technology has a Bias Problem, CBS News (June 12, 2020, 7:57 AM), https://www.cbsnews.com/news/facial-recognition-systems-racism-protests-….
 Monica Chin, ExamSoft’s Proctoring Software Has a Face-Detection Problem, The Verge (Jan. 5, 2021, 9:21 PM), https://www.theverge.com/2021/1/5/22215727/examsoft-online-exams-testing….
 Joy Buolamwini and Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, Proceedings of Machine Learning Research 81:1, 8, 2018.
 Erickson, supra note 6.
 Bill Toulas, Amazon Files Patent for Non-Contact Biometric ID System, TechNadu (Jan. 2, 2020), https://www.technadu.com/amazon-patent-non-contact-biometric-id-system/8….
 See Erickson, supra note 6.
 It is unclear at the time of writing whether the next Face/Off movie will be a remake or sequel. Compare Kyle Anderson, Paramount Remaking FACE/OFF For Some Reason, Nerdist (Sept. 9, 2019, 3:50 PM), https://nerdist.com/article/face-off-remake-paramount-nicolas-cage-john-… with Nicholas Rice, John Travolta and Nicolas Cage’s 1997 Action Classic Face/Off to Get a ‘Direct Sequel’, People (Feb. 12, 2021, 11:25 AM), https://people.com/movies/face-off-sequel-in-the-works/.
 See Regulation (EU) 2016/679 of the European Parliament and of the Council on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, art. 9, 2016 O.J. (L 119) (General Data Protection Regulation).
 Id. at art. 83(5).
 Id. at art. 9(2)(a).
 Id. at art. 4(11).
 Such guidance is provided by the European Data Protection Board (“EDPB”), an EU body charged with applying the GDPR. The EDPB is made up of the head of each national Data Protection Authority and the European Data Protection Supervisor, or their representatives. See What is the European Data Protection Board (EDPB)?, European Commission, https://ec.europa.eu/info/law/law-topic/data-protection/reform/rules-bus… (last visited Mar. 12, 2021).
 Eur. Data Prot. Bd., Guidelines 05/2020 on Consent Under Regulation 2016/679 ver. 1.1, at 20-21 (May 4, 2020), https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_2020….
 Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, 110 Stat. 1936.
 See 42 U.S.C. § 1320d(6).
 45 C.F.R. §§ 164.502, 164.506.
 See California Consumer Privacy Act of 2018, Cal. Civ. Code §§ 1798.100 to 1798.199 (West 2018).
 See id. § 1798.140(15), incorporating “Biometric information” under the definition of “Personal information.”
 Anna Daniels, Top-10 Operational Impacts of the CPRA: Part 3 – Right to Correct and Treatment of Sensitive Personal Data, International Association of Privacy Professionals (Jan. 13, 2021), https://iapp.org/news/a/top-10-operational-impacts-of-the-cpra-part-3-th…; see also Cal. Civ. Code §§ 1798.100, 1798.121, 1798.135.
 740 Ill. Comp. Stat. 14/1 et seq. (2008).
 See Lazzarotti & Abrahams, supra note 2.
 See Rosenbach v. Six Flags Ent. Corp., 2019 IL 123186, 129 N.E.3d 1197, 1207 (2019) (holding that the “unambiguous language of the law” requires redress regardless of whether plaintiff can prove “they sustained some actual injury or damage beyond infringement of the rights afforded them under the law”).
 See Jessica Robles, Patel v. Facebook, Inc.: The Collection, Storage, and Use of Biometric Data as a Concrete Injury under BIPA, 50 Golden Gate U. L. Rev. 61, 62-63 (2020); see also 740 Ill. Comp. Stat. § 14/20 (2008).
 David Oberly, Impact of Facebook $650 Million Patel BIPA Settlement, BiometricUpdate.com (Aug. 20, 2020), https://www.biometricupdate.com/202008/impact-of-facebook-650-million-pa….
 She also happened to be a medical doctor and knew that the main characters had different blood types. Testing Cage’s blood (while he had Travolta’s face) was another way of uncovering the truth.
 740 Ill. Comp. Stat. § 14/15(b) (2008); supra note 29.
 Supra note 32, at 15.
 The CNIL’s Restricted Committee Imposes A Financial Penalty of 50 Million Euros Against Google LLC, Commission Nationale de l’Informatique et des Libertes (Jan. 21, 2019), https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-pena…
 Vigil at 505, aff’d in part, vacated in part, remanded sub nom. Santana v. Take-Two Interactive Software, Inc., 717 F. App’x 12 (2d Cir. 2017). The basis for remand was for the lower court to amend its judgment and enter dismissal without prejudice. Santana, 717 F. App’x at 17.
 Id. at 514.