Facial Recognition Technology: Federal Law Enforcement Agencies Should Better Assess Privacy and Other Risks

GAO-21-518 Published: Jun 03, 2021. Publicly Released: Jun 29, 2021.
Fast Facts

We surveyed 42 federal agencies that employ law enforcement officers about their use of facial recognition technology.

  • 20 reported owning such systems or using systems owned by others
  • 6 reported using the technology to help identify people suspected of violating the law during the civil unrest, riots, or protests following the death of George Floyd in May 2020
  • 3 acknowledged using it on images of the U.S. Capitol attack on Jan. 6, 2021
  • 15 reported using non-federal systems

We recommended that 13 agencies track employee use of non-federal systems and assess the risks these systems can pose regarding privacy, accuracy, and more.

Facial recognition technology

Highlights

What GAO Found

GAO surveyed 42 federal agencies that employ law enforcement officers about their use of facial recognition technology. Twenty reported owning systems with facial recognition technology or using systems owned by other entities, such as other federal, state, local, and non-government entities (see figure).

Ownership and Use of Facial Recognition Technology Reported by Federal Agencies that Employ Law Enforcement Officers


Note: For more details, see figure 2 in GAO-21-518.

Agencies reported using the technology to support several activities (e.g., criminal investigations) and in response to COVID-19 (e.g., to verify an individual's identity remotely). Six agencies reported using the technology on images of the unrest, riots, or protests following the death of George Floyd in May 2020. Three agencies reported using it on images of the events at the U.S. Capitol on January 6, 2021. Agencies said the searches used images of suspected criminal activity.

All fourteen agencies that reported using the technology to support criminal investigations also reported using systems owned by non-federal entities. However, only one of them is aware of which non-federal systems its employees use. By having a mechanism to track what non-federal systems are used by employees and assessing related risks (e.g., privacy and accuracy-related risks), agencies can better mitigate risks to themselves and the public.

Why GAO Did This Study

Federal agencies that employ law enforcement officers can use facial recognition technology to assist criminal investigations, among other activities. For example, the technology can help identify an unknown individual in a photo or in video surveillance footage.

GAO was asked to review federal law enforcement use of facial recognition technology. This report examines (1) the ownership and use of facial recognition technology by federal agencies that employ law enforcement officers, (2) the types of activities these agencies use the technology to support, and (3) the extent to which these agencies track employee use of facial recognition technology owned by non-federal entities.

GAO administered a survey questionnaire to 42 federal agencies that employ law enforcement officers regarding their use of the technology. GAO also reviewed documents (e.g., system descriptions) and interviewed officials from selected agencies (e.g., agencies that owned facial recognition technology). This is a public version of a sensitive report that GAO issued in April 2021. Information that agencies deemed sensitive has been omitted.


Recommendations

GAO is making two recommendations to each of 13 federal agencies: implement a mechanism to track what non-federal systems are used by employees, and assess the risks of using these systems. Twelve agencies concurred with both recommendations. The U.S. Postal Service concurred with one and partially concurred with the other. GAO continues to believe the recommendation is valid, as described in the report.

 

Recommendations for Executive Action

Bureau of Alcohol, Tobacco, Firearms and Explosives: The Director of the Bureau of Alcohol, Tobacco, Firearms and Explosives should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 1)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address this recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
Bureau of Alcohol, Tobacco, Firearms and Explosives: The Director of the Bureau of Alcohol, Tobacco, Firearms and Explosives should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 2)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
Drug Enforcement Administration: The Administrator for the Drug Enforcement Administration should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 3)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
Drug Enforcement Administration: The Administrator for the Drug Enforcement Administration should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 4)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
Federal Bureau of Investigation: The Director of the FBI should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 5)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
Federal Bureau of Investigation: The Director of the FBI should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 6)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
United States Marshals Service: The Director of the U.S. Marshals Service should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 7)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
United States Marshals Service: The Director of the U.S. Marshals Service should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 8)
Status: Open
The Department of Justice concurred with this recommendation. As of April 2022, DOJ has not provided an update on the actions taken to address the recommendation. When we confirm what actions the agency has taken in response to this recommendation, we will provide updated information.
United States Customs and Border Protection: The Commissioner of CBP should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 9)
Status: Open
In December 2021, U.S. Customs and Border Protection (CBP) reported having a privacy directive stating that personnel are responsible for notifying the CBP Privacy Office about the implementation, or proposed implementation, of technologies that may impact individuals' privacy, including personally identifiable information (PII). According to the agency, the directive, which broadly defines PII, does not specifically identify facial recognition technologies as privacy-invasive but requires all programs using tools, systems, and technologies to complete a privacy threshold analysis to determine whether PII is involved. In response to GAO's recommendation, in December 2021, CBP said its privacy officer began circulating an updated version of this privacy directive to agency leadership for review and concurrence. The revised directive more explicitly identifies the applicability of requirements for facial recognition technologies. As of December 2021, CBP reported that the document was still under review, but once approved by the agency Commissioner, it would be distributed to the workforce. The agency also reported that the CBP Privacy Office would create and circulate outreach messaging specific to using and implementing facial recognition technologies across the CBP workforce. As of April 2022, this recommendation remains open, and CBP reported an estimated completion date of March 31, 2023.
United States Customs and Border Protection: The Commissioner of CBP should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 10)
Status: Open
In December 2021, CBP reported that its privacy compliance process includes reviews of IT systems and technologies to determine what, if any, PII is collected, used, or shared. According to the agency, these efforts allow it to determine whether additional safeguards are necessary to ensure that data is protected from unauthorized use or disclosure. The agency also reported that the increased awareness and understanding of requirements brought on by the updated privacy directive would present new opportunities to review use cases and tools associated with facial recognition technologies that the Privacy Office may not have previously been aware of. In addition, the Privacy Office has begun privacy training sessions for individuals who can procure access to technologies via a government purchase card outside of the typical IT acquisition review process. As of April 2022, this recommendation remains open. The action plan reported by the agency has an estimated closure date of March 31, 2023.
United States Secret Service: The Director of the Secret Service should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 11)
Status: Closed – Implemented
In June 2021, we reported on the use of facial recognition technology by agencies that employ federal law enforcement officers (GAO-21-518). We found that between April 2018 and March 2020, the U.S. Secret Service used non-federal facial recognition systems to support investigative activities, without a mechanism to track which systems employees were using. Consequently, we recommended that the Secret Service implement a mechanism to track what non-federal systems employees use to support investigative activities. In January 2022, the Secret Service issued a requirement to its Office of Investigations and field offices, among other divisions. The agency said that the purpose of the requirement was to capture the use of facial recognition systems owned, contracted, and/or operated by partnering law enforcement agencies during an investigation. For example, the requirement states that all personnel who access a partnering law enforcement agency's system must capture that usage within the Secret Service's Incident Based Reporting (IBR) application. In March 2022, the agency provided us evidence of its IBR application updates, including new fields to capture the facial recognition system's name and owner, and the date the system was used. Furthermore, the agency provided a copy of the IBR user guide instructing staff on how to record the use of facial recognition systems. As a result, this recommendation is closed as implemented.
United States Secret Service: The Director of the Secret Service should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 12)
Status: Open
According to the agency, once a tracking mechanism is implemented for recommendation 11, the Investigative Support Division and the Criminal Investigative Division would work with the Privacy Office to determine the appropriate frequency of internal case management system audits to review the usage of facial recognition technology by Secret Service employees. In addition to these audits, the agency reported that its Investigative Support Division would collaborate with the Privacy Office to draft a mandatory privacy compliance document. According to the agency, the compliance document would include Privacy Threshold Analyses and Privacy Impact Assessments, allowing privacy risks associated with facial recognition technologies and mitigation strategies to be assessed and documented. The agency reported an estimated completion date of April 29, 2022.
United States Fish and Wildlife Service: The Director of the U.S. Fish and Wildlife Service should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 13)
Status: Closed – Implemented
In June 2021, we reported on the use of facial recognition technology by agencies that employ federal law enforcement officers (GAO-21-518). We found that between April 2018 and March 2020, the U.S. Fish and Wildlife Service used non-federal facial recognition technology to support investigative activities, without a mechanism to track which systems employees were using. Consequently, we recommended that the U.S. Fish and Wildlife Service implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. In July 2021, the agency reported the use of a single non-government technology for all of its facial recognition searches, with access limited to two licensed users. In January 2022, the Assistant Director for Law Enforcement issued a directive providing guidance to the Office of Law Enforcement (OLE) on the use of facial recognition technology for investigative purposes. The directive states that the non-government provider is the only approved system for OLE officers. The directive also requires OLE officers to submit requests to the Wildlife Intelligence Unit for facial recognition searches via an internal case management system. The agency also provided documentation showing the dissemination of this directive to OLE officers. As a result, this recommendation is closed as implemented.
United States Fish and Wildlife Service: The Director of the U.S. Fish and Wildlife Service should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 14)
Status: Open
Following the implementation of a mechanism to track non-federal systems, the agency said it would issue a Chief's Directive requiring a risk assessment of any facial recognition technology that employees may use, to include an assessment of privacy and accuracy risks. As of April 2022, this recommendation remains open.
United States Park Police: The Chief of the U.S. Park Police should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 15)
Status: Open
As of April 2022, this recommendation remains open. In June 2021, the U.S. Park Police reported that it would establish a policy outlining a tracking mechanism for the use of non-federal systems with facial recognition technology used by its employees to support investigative activities. The agency reported that this action has a target implementation date of November 1, 2022.
United States Park Police: The Chief of the U.S. Park Police should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 16)
Status: Open
As of April 2022, this recommendation remains open. In December 2021, the Chief of the U.S. Park Police reported that, after implementing a mechanism to track non-federal systems, the agency plans to issue a policy requiring a risk assessment (including privacy and accuracy risks) of any facial recognition technology the agency may use. According to the agency, this action has a target implementation date of November 1, 2022.
Bureau of Diplomatic Security: The Assistant Secretary of the Bureau of Diplomatic Security should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 17)
Status: Open
In November 2021, the Assistant Secretary for Diplomatic Security reported that the Bureau is developing internal controls and standard operating procedures to ensure that agents' and analysts' access to non-federal systems with facial recognition technology is adequately vetted. Furthermore, the bureau said these controls and procedures would ensure agent and analyst accounts are managed through a centralized account management process. As of April 2022, this recommendation remains open, and the agency estimated it would complete the actions to address this recommendation by the end of 2022.
Bureau of Diplomatic Security: The Assistant Secretary of the Bureau of Diplomatic Security should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 18)
Status: Open
In November 2021, the Assistant Secretary for Diplomatic Security reported that the Bureau has ongoing efforts to vet agents' access to non-federal facial recognition technology. In addition to these efforts, the Department said it intends to establish an internal review panel to evaluate and review any non-federal systems with facial recognition technology that employees might use. According to the Bureau, the panel would, for example, assess the provider's privacy assessments and practices and the internal processes for data collection to evaluate the risks of using a system. The Bureau said it also intends to centralize control over contracts and funding to ensure users are not inadvertently encouraged by providers to utilize non-federal systems with facial recognition technology that lack approval and oversight. As of April 2022, this recommendation remains open.
Food and Drug Administration: The Assistant Commissioner of the Food and Drug Administration's Office of Criminal Investigations should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 19)
Status: Closed – Implemented
In June 2021, we reported on the use of facial recognition technology by agencies that employ federal law enforcement officers (GAO-21-518). We found that between April 2018 and March 2020, the Food and Drug Administration's (FDA) Office of Criminal Investigations (OCI) used non-federal facial recognition technology to support investigative activities, but did not have a mechanism to track which systems employees were using. Consequently, we recommended that FDA implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. In December 2021, the U.S. Department of Health and Human Services reported that OCI had deployed a module in its primary system of records for investigations to track employee use of facial recognition technology. The agency also said it notified OCI employees of the requirement to use the module to document information about their use of facial recognition technology in an investigation. The notice informed employees that they would be required to include the system name, owner of the system, and how the system supported OCI's mission. As a result, this recommendation is closed as implemented.
Food and Drug Administration: The Assistant Commissioner of the Food and Drug Administration's Office of Criminal Investigations should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 20)
Status: Open
In December 2021, the U.S. Department of Health and Human Services reported that OCI had limited use of facial recognition technology over the previous two years. The department said that although OCI's use of this technology has been infrequent, it would review previous and potential future use to identify potential risks. According to the department, if the review results warrant it, OCI will issue guidance to investigators to minimize those risks. Moreover, the agency reported plans to review relevant policies and procedures from other federal law enforcement agencies on their use of facial recognition technology to leverage any existing risk mitigation practices. The agency did not report a timeframe for its reviews and did not provide an estimated closure date for this recommendation. As of April 2022, this recommendation remains open. To fully address this recommendation, FDA should review the facial recognition tracking module implemented in response to recommendation 19 to identify systems used by employees, and assess the risks of using the systems.
Internal Revenue Service: The Chief of the Internal Revenue Service's Criminal Investigation Division should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 21)
Status: Open
In March 2022, the agency reported that its Laboratory Information Management System (LIMS) could be configured to track non-federal facial recognition technology used by employees. According to the agency, LIMS is used to create, collect, and store case-related data. However, the agency has not yet taken steps to configure the system because employees are not using facial recognition technology at this time. As of April 2022, this recommendation remains open, and we will continue monitoring the agency's progress.
Internal Revenue Service: The Chief of the Internal Revenue Service's Criminal Investigation Division should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 22)
Status: Open
As of April 2022, this recommendation remains open. In October 2021, the Chief of the Internal Revenue Service's Criminal Investigation Division reported that the agency's subject matter experts serve as members of professional organizations and working groups. According to the agency, participation in these groups, which are responsible for disseminating accurate information regarding the proper application of facial identification and facial recognition technologies, allows it to continually assess risks and best practices for using non-federal facial recognition technology systems. The agency also reported that its Technology Operations Section is currently conducting a review to monitor and evaluate risks associated with all IT systems used by employees to support investigative activities. The agency also reported that its market research on facial recognition technology systems would include an assessment of security and other risks associated with the systems. The agency said it also plans to adopt best practices learned from working groups to successfully address and mitigate facial recognition technology's potential risks. The agency reported that its estimated implementation date for this action is October 15, 2022.
United States Postal Inspection Service: The Chief Postal Inspector of the U.S. Postal Inspection Service should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 23)
Status: Open
In response to this recommendation, the U.S. Postal Inspection Service reported implementing a mechanism to track non-federal systems used by the agency. The agency also reported implementing a policy, effective October 31, 2021, to enhance an existing agency platform that tracks facial recognition service records. According to the agency, all employees would be required to report which non-federal systems they use to conduct facial recognition searches, the date of the searches, their name, and the case number for the associated investigation. As of April 2022, this recommendation remains open, and GAO is working with the agency to confirm that the stated actions have been implemented.
United States Postal Inspection Service: The Chief Postal Inspector of the U.S. Postal Inspection Service should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 24)
Status: Open
As of April 2022, this recommendation remains open. Having implemented a mechanism to track employee use of non-federal systems on October 31, 2021, the agency plans to analyze the systems employees use for investigative activities. Specifically, after one year, the agency plans to review and analyze data generated by its tracking mechanism and assess the risks associated with those systems. The agency reported a planned completion date of December 31, 2022.
U.S. Capitol Police: The Chief of Police, U.S. Capitol Police, should implement a mechanism to track what non-federal systems with facial recognition technology are used by employees to support investigative activities. (Recommendation 25)
Status: Open
In December 2021, the U.S. Capitol Police reported that it neither possesses facial recognition software nor has access to similar software through a third-party vendor; however, the agency said that personnel may use the technology to increase the solvability of investigations. In March 2022, the agency said it issued interim guidance outlining protocols employees must adhere to when requesting the use of facial recognition technology. These protocols include, for example, submitting an email to a direct supervisor explaining the need for facial recognition and a copy of the image in question, according to the agency. Furthermore, the agency said the agent responsible for the investigation would be required to document the use of facial recognition in the records management system, which captures the information for tracking purposes. As of April 2022, this recommendation remains open, and GAO is working with the agency to confirm that the stated actions have been implemented and that the guidance will be finalized (i.e., not temporary).
U.S. Capitol Police: The Chief of Police, U.S. Capitol Police, should, after implementing a mechanism to track non-federal systems, assess the risks of using such systems, including privacy and accuracy-related risks. (Recommendation 26)
Status: Open
In March 2022, U.S. Capitol Police reported that it issued interim guidance outlining protocols employees must adhere to when requesting the use of facial recognition technology. These protocols require agents responsible for investigations to document the use of facial recognition in its records management system, which according to the agency, captures the information for tracking purposes. The agency also said that since the interim guidance was issued, it has not used facial recognition technology, and therefore has not yet been able to assess the associated risks. As of April 2022, this recommendation remains open.
