Face/Off: The Battle To Authenticate With Biometrics

By: 

Mike Serra

Authored on: 
Thursday, April 15, 2021

Introduction

Face/Off is a 1997 Academy Award-nominated movie[1] about cybersecurity. Well, technically, it is about an FBI agent (played by John Travolta) and a terrorist (played by Nicolas Cage) swapping faces. Cage’s character plants an explosive and Travolta’s character needs to find it, so the FBI unceremoniously removes Cage’s face while he is unconscious and transplants it onto Travolta’s body. Travolta (with Cage’s face) then attempts to fool Cage’s accomplice into disclosing the bomb’s location. Cage, in the meantime, wakes up to steal Travolta’s unguarded face. Spoiler alert: confusion ensues. It is everything you want in a late-’90s blockbuster.

The action in Face/Off is exactly like a cyberattack. Well, not exactly, but it is comparable. Many cyberattacks involve bad guys stealing user credentials (e.g., username and password) to fool applications into allowing unauthorized access to sensitive information. Faces, though, are tougher to steal than a username and password.

Following Face/Off’s logic, companies are relying more on biometric indicators, like a fingerprint or face, to authenticate users for critical applications. That heightened security, however, comes with a host of legal issues: complexities that Face/Off neglected to address. This article will introduce methods of authenticating with biometrics, roughly describe the relevant legal landscape, and ultimately suggest a risk-averse approach to using these tools.

I. What are Biometrics?

Biometrics are measurements of an individual’s physical and/or behavioral characteristics.[2] They can be used for many purposes. For example: fingerprint scans can initiate time clocks to prevent workers from prematurely clocking in their friends;[3] smart speakers can recognize individual voices so commands are attributed to the right user; fitness apps monitor heart rate to track user progress; and thermal imaging can gauge body temperature to detect symptoms and help head off a health crisis. This Section provides a brief summary of three methods of authenticating with biometrics.

A. Fingerprint Scanners

Fingerprint analysis is an age-old method of authentication. People leave fingerprints everywhere, and police have used fingerprint matching systems for decades. Scanners introduced automation to this process. For authentication, scanners match a user’s fingerprint with an image already registered on a particular device, thereby validating the user before granting access to his or her device or application.

“Touch ID” for Apple’s iPhone, iPad, and MacBook Pro utilizes fingerprint scanners for this purpose. The result is heightened security and increased efficiency compared to a password alone. According to Apple, sensors take high-resolution images of each user’s fingerprint, the device creates a mathematical representation of that biometric indicator, and that representation is then matched against the user’s registered fingerprint data.[4] Devices are then unlocked for use.[5]
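
Apple does not publish the internals of Touch ID, but the general pattern the passage describes (enroll a mathematical representation of the fingerprint, then compare each fresh scan against it under a similarity threshold) can be sketched in a few lines of Python. This is an illustration only; the feature extraction, cosine-similarity metric, and threshold below are assumptions, not Apple’s actual algorithm.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.95  # hypothetical cutoff; real systems tune this against error rates


def enroll(scan: np.ndarray) -> np.ndarray:
    """Turn a raw scan into the stored 'mathematical representation' (a normalized feature vector)."""
    features = scan.astype(float).ravel()
    return features / np.linalg.norm(features)


def matches(template: np.ndarray, new_scan: np.ndarray) -> bool:
    """Compare a fresh scan against the enrolled template using cosine similarity."""
    similarity = float(np.dot(template, enroll(new_scan)))
    return similarity >= SIMILARITY_THRESHOLD


# Illustration: a slightly noisy re-scan of the enrolled finger should still match.
enrolled = enroll(np.random.rand(64, 64))
rescan = enrolled + np.random.normal(scale=0.001, size=enrolled.shape)
print(matches(enrolled, rescan))  # True
```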

Fingerprint scanners are not perfect. Optical scanners take two-dimensional images of a fingerprint, which can be spoofed by simply printing an image of the valid fingerprint.[6] Advanced scanners relying on electrical conductivity and ultrasonic waves are more secure. However, accurate three-dimensional printed fingerprint replicas might beat ultrasonic sensors, and dirty surfaces reduce their overall effectiveness.[7] Additional concerns arise because law enforcement in some jurisdictions can compel criminal defendants to unlock a smartphone protected by a fingerprint scanner.[8]

B. Facial Recognition

Facial recognition appears to be more secure than fingerprint scanners. While both use images to match biometric indicators to an enrolled user,[9] studies show that the False Acceptance Rate[10] for facial recognition is around 1/1,000,000 as opposed to 1/50,000 for fingerprint scanners.[11] One reason is that facial recognition generally extracts more data points. Facial geometry like the relative positioning, size and shape of the eyes, nose and mouth—and the distance measured between those features—allows for greater accuracy when comparing subjects.[12]
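
Those raw rates are easier to appreciate with a little arithmetic. Assuming independent attempts at a fixed False Acceptance Rate (a simplification made only for illustration), the chance that at least one impostor attempt slips through over N tries is 1 - (1 - FAR)^N. A quick sketch:

```python
def prob_false_accept(far: float, attempts: int) -> float:
    """Chance that at least one of `attempts` impostor tries is wrongly accepted,
    assuming independent attempts at a fixed False Acceptance Rate (FAR)."""
    return 1 - (1 - far) ** attempts


for label, far in [("fingerprint (~1/50,000)", 1 / 50_000),
                   ("facial recognition (~1/1,000,000)", 1 / 1_000_000)]:
    print(f"{label}: {prob_false_accept(far, 100):.4%} chance over 100 impostor attempts")
# fingerprint (~1/50,000): 0.1998% ...  facial recognition (~1/1,000,000): 0.0100% ...
```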

Again, this method of authentication is not flawless. Facial recognition has been defeated by a high-resolution photo,[13] by a three-dimensional printed face cast,[14] and, as shown in Face/Off, by a highly experimental surgical procedure. Obtaining such precise data may be a significant practical barrier for a would-be attacker.[15] Additional “liveness” checks, which verify that a user’s eyes are open and directed toward the device, increase security against photos and casts.[16] Plus, the science behind face transplantation is not as advanced as the creators of Face/Off would have you believe.

Facial recognition has its own unique issues. For one, a data breach could mean each user’s face is released to criminals. That happened when the Department of Homeland Security was recently hacked, a breach that compromised approximately 184,000 images.[17] The incident highlights an important contrast in how services handle biometric data. DHS aggregated its data in a central database vulnerable to attack; best practice is to store sensitive data on the user’s device, as Apple does with its Touch ID (discussed above) and “Face ID” features.[18]
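
How does a service get the “stays on the device” property just described? One common approach, sketched below loosely in the spirit of challenge-response schemes such as FIDO2/WebAuthn rather than as a description of Apple’s or DHS’s actual systems, is to keep the biometric template local: a successful match merely unlocks a device-bound private key, the server stores only the public key, and a server breach therefore exposes no biometric data. The sketch uses the third-party cryptography package, and every name in it is illustrative.

```python
# Sketch of the "biometrics never leave the device" pattern. Loosely modeled on
# challenge-response schemes such as FIDO2/WebAuthn; illustration only.
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# On the device: key pair generated at enrollment; the private key never leaves.
device_private_key = Ed25519PrivateKey.generate()

# On the server: only the public key is stored, never a face or fingerprint.
server_public_key = device_private_key.public_key()


def local_biometric_match() -> bool:
    """Placeholder for the on-device template comparison (e.g., Touch ID / Face ID)."""
    return True  # assume the user's scan matched the locally stored template


def authenticate(challenge: bytes) -> bool:
    """The device signs the server's challenge only if the local biometric check passes."""
    if not local_biometric_match():
        return False
    signature = device_private_key.sign(challenge)
    try:
        server_public_key.verify(signature, challenge)  # raises InvalidSignature on failure
        return True
    except InvalidSignature:
        return False


print(authenticate(os.urandom(32)))  # True; a breached server yields no biometric data
```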

Racial bias is another major concern.[19] Exam proctoring software widely used during the COVID-19 pandemic has been criticized because its facial recognition technology allegedly fails to recognize people of color.[20] Indeed, independent research found that leading facial recognition software had a 34% higher error rate when used on darker-skinned females as opposed to lighter-skinned males.[21] Companies cannot rely on software that simply fails a significant segment of the workforce.

C. Vein Scanners

Vein scanners are a relatively new form of biometric authentication. This technology harnesses infrared light to see beneath a user’s skin and verify unique vein patterns.[22] Wax molds might work to fool a vein scanner.[23] Even so, the practical hurdle for a hacker is high given limited real-world access to an individual’s veins (unless he or she is unconscious, like Cage in Face/Off).

Amazon is raising the bar by combining fingerprint and vein scan technologies.[24] Adding a “liveness” check for pulse, electrical conductivity, or eye movement would provide even stronger protection.[25] Writers for the impending Face/Off reboot should take this technology into consideration.[26]
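
The details of Amazon’s system are not public, but the general idea of combining modalities with a liveness signal is straightforward: require agreement from independent matchers before granting access. The sketch below is a hypothetical illustration; the score names and thresholds are assumptions, not any vendor’s implementation.

```python
from dataclasses import dataclass

# Hypothetical thresholds; real systems tune these against false accept/reject rates.
FINGERPRINT_THRESHOLD = 0.90
VEIN_THRESHOLD = 0.90


@dataclass
class ScanResult:
    fingerprint_score: float  # similarity to the enrolled fingerprint template (0 to 1)
    vein_score: float         # similarity to the enrolled vein-pattern template (0 to 1)
    liveness_detected: bool   # e.g., pulse, electrical conductivity, or eye movement


def authenticate(scan: ScanResult) -> bool:
    """Grant access only if BOTH modalities match AND the sample appears to be live."""
    return (scan.fingerprint_score >= FINGERPRINT_THRESHOLD
            and scan.vein_score >= VEIN_THRESHOLD
            and scan.liveness_detected)


# A printed fingerprint replica might score well on one factor but fail the others.
print(authenticate(ScanResult(0.97, 0.95, True)))   # True: legitimate, live user
print(authenticate(ScanResult(0.96, 0.20, False)))  # False: spoof fails vein and liveness checks
```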

II. The Law that Face/Off Overlooked

Various laws worldwide attempt to protect against unauthorized access to and use of biometrics. Odds are that only a subset applies to any particular use case. Nevertheless, it is worth reviewing some core legislation to provide a holistic overview, no matter where collection and processing occur.

A. Would Face/Off have violated GDPR?

The European Union’s General Data Protection Regulation (“GDPR”) applies generally to the processing of personal data, with higher standards for “special categories,” including biometrics.[27] Its detailed provisions are treated by many as the baseline for protecting personal information. Any company doing business in the EU is well aware of that law’s costly sanctions. A single transgression may result in a fine equaling the greater of €20 million or four percent of the offender’s global annual revenue.[28]

The GDPR generally prohibits collecting and using biometrics without the subject’s explicit consent.[29] Consent is defined by the GDPR as “any freely given, specific, informed and unambiguous indication of the [] subject’s wishes.”[30] Explicit consent is not defined in the GDPR, but published guidance[31] concludes that explicit consent requires the subject to “give an express statement of consent,” such as a writing, a click-to-accept form, an email, or an electronic signature.[32] It is safe to say that the antics in Face/Off would have violated the GDPR, given that neither Travolta nor Cage consented to the face swap.

B. Would Face/Off Have Fared Better Under U.S. Law?

Readers may be thinking, “Good thing Face/Off took place in the United States, given the GDPR’s strict requirements.” Not so fast. Face/Off’s slipshod approach to face-swapping would have violated U.S. law too. Below are a few notable U.S. laws governing biometrics:

  • The Health Insurance Portability and Accountability Act of 1996 (“HIPAA”)[33] does not specifically address biometrics. Rather, biometrics generally fall under the definition of “Individually Identifiable Health Information” when collected for healthcare purposes by a “covered entity” (health plans, health care providers, and their “business associates”).[34] Such information must be protected and may only be disclosed for limited purposes, including with written consent for any use unrelated to treatment, payment, or health care operations.[35]
  • The California Consumer Privacy Act (“CCPA”) is broader than HIPAA because it expressly lists biometrics as “personal information” and reaches more than just “covered entities.”[36] It also gives California residents rights similar to those Europeans enjoy under the GDPR, though without the GDPR’s added restrictions on using biometrics.[37]
  • California recently enacted the California Privacy Rights Act (“CPRA”), in part to enhance protections for biometrics classified as “sensitive personal information.”[38] The CPRA differs from the GDPR in two material respects related to biometrics: (A) the CPRA requires only informed consent, not explicit consent, and (B) the CPRA provides more avenues for consumers to limit unauthorized use and disclosure.[39]
  • Illinois’s Biometric Information Privacy Act (“BIPA”)[40] is the foundation of most current biometric-related litigation. The BIPA requires informed consent by written release prior to the collection of biometric information, mandates written policies to protect such data, prohibits profiting from it, and, arguably most importantly, subjects violators to statutory damages.[41] That last feature sets the BIPA apart because individuals do not need to prove they suffered actual damages if a company is found to have misused their biometric data.[42] Instead, individuals may collect $1,000 per negligent violation and $5,000 per intentional or reckless violation, plus reasonable attorney fees and costs.[43] Potential exposure can be massive, as the rough sketch after this list illustrates: Facebook recently settled a BIPA case for $650 million.[44]
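
To see why BIPA’s statutory damages drive so much litigation, run a back-of-the-envelope calculation. In the sketch below, only the $1,000 and $5,000 figures come from the statute; the class size and mix of violations are invented for illustration.

```python
# Rough BIPA exposure estimate. Only the per-violation amounts come from the statute
# (740 Ill. Comp. Stat. 14/20); the class size and violation mix are hypothetical.
NEGLIGENT_DAMAGES = 1_000    # per negligent violation
INTENTIONAL_DAMAGES = 5_000  # per intentional or reckless violation


def estimated_exposure(class_size: int, negligent_per_person: int = 1,
                       intentional_per_person: int = 0) -> int:
    """Statutory-damages exposure before reasonable attorney fees and costs."""
    per_person = (negligent_per_person * NEGLIGENT_DAMAGES
                  + intentional_per_person * INTENTIONAL_DAMAGES)
    return class_size * per_person


# 10,000 employees scanned without a written release, one negligent violation each:
print(f"${estimated_exposure(10_000):,}")  # $10,000,000
# The same class if the violations were found intentional or reckless:
print(f"${estimated_exposure(10_000, negligent_per_person=0, intentional_per_person=1):,}")  # $50,000,000
```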

III. What Can Face/Off Teach Us About Cybersecurity?

Face/Off is full of cybersecurity lessons. One is the importance of protecting user credentials. Travolta’s face was left unprotected, so anyone, including his arch nemesis, could steal it. The resulting mayhem taught another lesson. Biometric authenticators are strong, but not without weakness. Travolta’s wife eventually figured out Cage was not her husband by recognizing that he lacked Travolta’s mannerisms, habits, and detailed personal memories (think mother’s maiden name).[45]

The film further offers a masterclass in misconduct. Failing to secure the faces was an obvious violation. Nevertheless, Face/Off’s fundamental legal problem is lack of consent. Cage’s face was taken while he was unconscious. Travolta’s face was stolen outright. Neither had a meaningful opportunity to give consent at all, much less informed or explicit consent.

Consent is key to compliant biometric authentication. For example, the CPRA requires informed consent, the BIPA requires a written release, and the GDPR requires explicit consent.[46] Prudent companies looking to secure an enterprise with biometric authentication tools must understand Face/Off’s fatal flaw and strive for informed consent evidenced in writing. That would satisfy the consent requirements found in each law recited above. This raises the question: what constitutes informed consent?

In the EU, published guidance dictates that, at a minimum, informed consent requires identifying the company requesting biometric authentication, explaining why consent is sought, detailing the data being collected, disclosing the right to withdraw consent, and describing potential risks.[47] Providing such detailed information is critical. Vigilant EU regulators fined Google €50 million because users were not adequately informed about how their personal data would be processed.[48]
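
For companies building biometric authentication into a product, those disclosure elements translate naturally into a structured consent record kept alongside the biometric template. The sketch below is one hypothetical way to capture them in code; the field names are illustrative and no particular record format is mandated by the guidance.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class BiometricConsentRecord:
    """Minimal record of the disclosures the EDPB guidance calls for (illustrative only)."""
    controller: str                # who is requesting biometric authentication
    purpose: str                   # why consent is sought
    data_collected: list[str]      # details on the data being collected
    withdrawal_notice: str         # the right to withdraw consent
    risks_disclosed: list[str]     # potential risks explained to the user
    user_id: str
    consented_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Withdrawing consent should be as easy as giving it."""
        self.withdrawn_at = datetime.now(timezone.utc)


record = BiometricConsentRecord(
    controller="Example Corp.",
    purpose="Fingerprint authentication for access to the corporate VPN",
    data_collected=["fingerprint template (mathematical representation only)"],
    withdrawal_notice="You may withdraw consent at any time in account settings.",
    risks_disclosed=["theft of the template could allow impersonation on this service"],
    user_id="employee-1234",
)
print(record.consented_at.isoformat())
```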

U.S. biometric law is not as strict as the EU guidance. Vigil v. Take-Two Interactive Software, Inc., 235 F. Supp. 3d 499 (S.D.N.Y. 2017) is instructive. There, plaintiffs sued a videogame maker alleging improper consent under the BIPA when the product scanned their faces to create an in-game avatar with matching features. Before initiating the scan, users were required to click-accept the following statement with a link to the applicable terms and conditions: “Your face scan will be visible to you and others you play with and may be recorded or screen captured during gameplay. By proceeding you agree and consent to such uses and other uses pursuant to the End User License Agreement.”[49] The court dismissed the case partly because it was implausible for plaintiffs to click-accept the terms without “understand[ing] that their faces would be scanned, and that those face scans would be used to create personalized [] avatars.”[50]

Readers can now imagine Face/Off if it were written by lawyers. Travolta and Cage are lying on hospital beds next to each other when a dapper attorney approaches, handing each a clipboard and pen. They wait patiently as each character carefully studies the release. Then, almost simultaneously, Travolta and Cage respectfully decline. The movie ends without confusion or mayhem. Pretty boring. So are most things that lawyers write. And that is fine. Companies looking to utilize biometrics for authentication should want boring. Boring can be good. Boring is certainly preferred over a cyberattack, government investigation, or lawsuit.

Mike Serra is product counsel at Cisco Systems, Inc. focusing his practice on intellectual property, data privacy, and general corporate matters for Cisco’s cloud-based cyber security offerings. The opinions expressed in this blog are his own views and not those of Cisco.  Mike would like to thank his colleagues Raj Dhaliwal and Jeremy Erickson for providing subject matter expertise, intern Christy Bonner for cite checking, and the YJOLT’s Editor-in-Chief, Ben Rashkovich, for his support with this project. 

***

[1] For Sound Effects Editing.  Did you think it was Best Picture?

[2] Joe Lazzarotti and Nadine C. Abrahams, Illinois Biometric Information Privacy Act FAQs, Jackson Lewis, https://www.jacksonlewis.com/sites/default/files/docs/IllinoisBiometrics… (last visited Mar. 12, 2021).

[3] Id.

[4] About Touch ID Advanced Security Technology, Apple Support (Sept. 11, 2017), https://support.apple.com/en-us/HT204587#:~:text=On%20iPhone%20and%20iPa….

[5] Some readers may argue that converting biometric data to a cryptographic key may avoid biometric regulations. The author asserts that such arguments are misplaced in circumstances when those keys can be traced back to an identifiable individual. See Frank Ready, Despite Patchwork Regulatory Landscape Companies Aren’t Backing Away From Biometric Identifiers, Law.com: Corporate Counsel (Sept. 17, 2020, 3:09 PM), https://www.law.com/corpcounsel/2020/09/17/despite-patchwork-regulatory-…

[6] Jeremy Erickson, The Good and Bad of Biometrics, Duo Security (Mar. 11, 2020), https://duo.com/labs/research/the-good-and-bad-of-biometrics.

[7] Id.

[8] See State v. Diamond, 905 N.W.2d 870 (Minn. 2018).

[9] Erickson, supra note 6.

[10] False Acceptance Rate (FAR) is the percentage of identification instances in which unauthorized persons are incorrectly accepted.

[11] Brendon Wilson, Worried about Face ID?, brendonwilson.com (Oct. 3, 2017), https://www.brendonwilson.com/blog/2017/10/03/worried-about-face-id/.

[12] Erickson, supra note 6.

[13] Tomas Foltyn, Face Unlock on Many Android Smartphones Falls for a Photo, welivesecurity.com (Jan. 10, 2019), https://www.welivesecurity.com/2019/01/10/face-unlock-many-android-smart….

[14] Andy Greenberg, Hackers just broke the iPhone X’s Face ID using a 3D-printed mask, Wired (Nov. 13, 2017), https://www.wired.co.uk/article/hackers-trick-apple-iphone-x-face-id-3d-….

[15] Erickson, supra note 6.

[16] Id.

[17] Review of CBP’s Major Cybersecurity Incident during a 2019 Biometric Pilot, Dept. of Homeland Sec. Office of Inspector General, 1, 6 (Sept. 21, 2020),  https://www.oig.dhs.gov/sites/default/files/assets/2020-09/OIG-20-71-Sep….

[18] See About Face ID Advanced Technology, Apple Support (Feb. 26, 2020), https://support.apple.com/en-us/HT208108 (“Face ID data doesn’t leave your device and is never backed up to iCloud or anywhere else.”).

[19] See generally Irina Ivanova, Why Face-Recognition Technology has a Bias Problem, CBS News (June 12, 2020, 7:57 AM), https://www.cbsnews.com/news/facial-recognition-systems-racism-protests-….

[20] Monica Chin, ExamSoft’s Proctoring Software Has a Face-Detection Problem, The Verge (Jan. 5, 2021, 9:21 PM), https://www.theverge.com/2021/1/5/22215727/examsoft-online-exams-testing….

[21] Joy Buolamwini and Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, Proceedings of Machine Learning Research 81:1, 8, 2018.

[22] Erickson, supra note 6.

[23] Id.

[24] Bill Toulas, Amazon Files Patent for Non-Contact Biometric ID System, TechNadu (Jan. 2, 2020), https://www.technadu.com/amazon-patent-non-contact-biometric-id-system/8….

[25] See Erickson, supra note 6.

[26] It is unclear at the time of writing whether the next Face/Off movie will be a remake or sequel. Compare Kyle Anderson, Paramount Remaking FACE/OFF For Some Reason, Nerdist (Sept. 9, 2019, 3:50 PM), https://nerdist.com/article/face-off-remake-paramount-nicolas-cage-john-… with Nicholas Rice, John Travolta and Nicolas Cage’s 1997 Action Classic Face/Off to Get a ‘Direct Sequel’, People (Feb. 12, 2021, 11:25 AM), https://people.com/movies/face-off-sequel-in-the-works/.

[27] See Directive 2002/58/EC and Regulation 2016/679 (EU) of the European Parliament and of the Council on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, art. 9, 2016 O.J. (L 119) (General Data Protection Regulation).

[28] Id. at art. 83(5).

[29] Id. at art. 9(2)(a).

[30] Id. at art. 4(11).

[31] Such guidance is provided by the European Data Protection Board (“EDPB”), an EU body charged with applying the GDPR. The EDPB is made up of the head of each national Data Protection Authority and the European Data Protection Supervisor, or their representatives. See What is the European Data Protection Board (EDPB)?, European Commission, https://ec.europa.eu/info/law/law-topic/data-protection/reform/rules-bus… (last visited Mar. 12, 2021).

[32] Eur. Data Prot. Bd., Guidelines 05/2020 on Consent Under Regulation 2016/679 ver. 1.1, at 20-21 (May 4, 2020), https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_2020….

[33] Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, 110 Stat. 1936.

[34] See 42 U.S.C. § 1320d(6).

[35] 45 C.F.R. §§ 164.502, 164.506.

[36] See California Consumer Privacy Act of 2018, Cal. Civ. Code §§ 1798.100 to 1798.199 (West 2018).

[37] See id. § 1798.140(15), incorporating “Biometric information” under the definition of “Personal information.”

[38] Anna Daniels, Top-10 Operational Impacts of the CPRA: Part 3 – Right to Correct and Treatment of Sensitive Personal Data, International Association of Privacy Professionals (Jan. 13, 2021), https://iapp.org/news/a/top-10-operational-impacts-of-the-cpra-part-3-th… see also Cal. Civ. Code §§ 1798.100; 1798.121; 1798.135.

[39] Id.

[40] 740 Ill. Comp. Stat. §§ 14/1 et seq. (2008).

[41] See Lazzarotti & Abrahams, supra note 2.

[42] See Rosenbach v. Six Flags Ent. Corp., 2019 IL 123186, 129 N.E.3d 1197, 1207 (2019) (holding that the “unambiguous language of the law” requires redress regardless of whether plaintiff can prove “they sustained some actual injury or damage beyond infringement of the rights afforded them under the law”).

[43] See Jessica Robles, Patel v. Facebook, Inc.: The Collection, Storage, and Use of Biometric Data as a Concrete Injury under BIPA, 50 Golden Gate U. L. Rev. 61, 62-3 (2020); see also 740 Ill. Comp. Stat. § 14/20 (2008).

[44] David Oberly, Impact of Facebook $650 Million Patel BIPA Settlement, BiometricUpdate.com (Aug. 20, 2020), https://www.biometricupdate.com/202008/impact-of-facebook-650-million-pa….

[45] She also happened to be a medical doctor and knew that the main characters had different blood types. Testing Cage’s blood (while he had Travolta’s face) was another way of uncovering the truth.

[46] 740 Ill. Comp. Stat. § 14/15(b) (2008); supra note 29.

[47] Supra note 32, at 15.

[48] The CNIL’s Restricted Committee Imposes A Financial Penalty of 50 Million Euros Against Google LLC, Commission Nationale de l’Informatique et des Libertes (Jan. 21, 2019), https://www.cnil.fr/en/cnils-restricted-committee-imposes-financial-pena…

[49] Vigil, 235 F. Supp. 3d at 505, aff’d in part, vacated in part, and remanded sub nom. Santana v. Take-Two Interactive Software, Inc., 717 F. App’x 12 (2d Cir. 2017). The basis for remand was for the lower court to amend its judgment and enter dismissal without prejudice. Santana, 717 F. App’x at 17.

[50] Id. at 514.