Facial recognition technology (FRT) is a biometric technology that compares two or more images of faces to determine whether they represent the same individual. Automated FRT is increasingly used by law enforcement to help identify criminal suspects and other persons of interest. Law enforcement may use FRT and associated image databases to compare and match face images taken from a diverse range of sources, including mugshots, driver’s licenses, images from police body cameras, and video stills taken from public surveillance footage. Images might also be compared to nongovernment sources, such as those posted on social media.
Currently, there is no overarching federal framework regulating the use of FRT, though a number of federal statutes addressing privacy or data collection and storage may be relevant. Some federal statutes also address or encourage the use of biometrics more specifically, including those calling for the collection of biometric data from foreign travelers entering or exiting the United States. At the state level, most regulation has been focused on the collection and storage of biometric information by private industry. Regulation of law enforcement use of FRT varies among states and localities. While FRT is used by many state and municipal law enforcement agencies, some states and localities have placed restrictions on its use.
The Constitution provides baseline parameters governing FRT’s use by government actors. For example, law enforcement’s use of FRT, in combination with photographic or video surveillance, may raise Fourth Amendment considerations. The Fourth Amendment protects against unreasonable searches and seizures. Government observation of individuals in public generally is not a “search” under the Fourth Amendment. But the Supreme Court recently indicated in Carpenter v. United States that the use of advanced technologies to engage in the prolonged and sustained surveillance of a person’s public activities may prompt Fourth Amendment concerns, when such surveillance becomes so pervasive as to provide “an intimate window into a person’s life.” Carpenter suggests some constraints on the ability of the government to engage in continuous and prolonged FRT-enhanced surveillance of a person’s public movements, even while more limited use of FRT may be permitted. There also may be Fourth Amendment implications if an FRT system is unreliable and leads to the mistaken arrest of misidentified persons. To date, few courts appear to have considered probable cause challenges to purportedly unreliable FRT. But other situations involving potentially unreliable sources, such as informants and canine alerts, suggest that the reliability of a specific FRT system may be subject to scrutiny by a reviewing court when assessing the basis for a law enforcement search or arrest. For example, a court may consider whether the system’s accuracy was meaningfully affected by factors that could result in misidentification.
Some commentators have suggested that FRT-enhanced public surveillance may impermissibly chill the exercise of free speech and other rights protected by the First Amendment, if, for example, such surveillance enables the government to easily identify those participating in public demonstrations. The Supreme Court has held that government surveillance of speech, without more, may not provide a plaintiff with standing to bring suit alleging a First Amendment violation. A plaintiff claiming that surveillance infringed his or her First Amendment rights would thus need to show that the surveillance was connected to additional government action causing injury.
Equal protection concerns under the Fifth and Fourteenth Amendments might also be implicated. While FRT has the potential to reduce the likelihood that human error leads to mistaken arrest, some contend that algorithmic biases or other factors may lead to the erroneous matching of images of persons belonging to certain racial and ethnic groups. This misidentification, critics contend, may lead law enforcement to wrongfully target those persons for investigation or arrest. Under current case law, a claim of racially selective law enforcement requires a showing that law enforcement action had a discriminatory effect and was taken with a discriminatory purpose. This framework does not translate easily to automated, algorithm-based systems like those frequently employed in FRT, which make independent determinations without close human involvement.
Several bills addressing FRT have been introduced in the 116th Congress, most focused on constraining its use by law enforcement or private entities.
“Facial Recognition Technology and Law Enforcement: Select Constitutional Considerations,” CRS Report R46541, September 24, 2020 (35-page PDF)
For more than 40 years, TheCapitol.Net and its predecessor, Congressional Quarterly Executive Conferences, have been teaching professionals from government, military, business, and NGOs about the dynamics and operations of the legislative and executive branches and how to work with them.
Our custom on-site and online training, publications, and audio courses include congressional operations, legislative and budget process, communication and advocacy, media and public relations, testifying before Congress, research skills, legislative drafting, critical thinking and writing, and more.
TheCapitol.Net is on the GSA Schedule, MAS, for custom on-site and online training. GSA Contract GS02F0192X
TheCapitol.Net is a non-partisan small business.
Teaching how Washington and Congress work ™