Face Recognition Technologies
Designing Systems that Protect Privacy and Prevent Bias
Published May 14, 2020
Face recognition technologies (FRTs) have many practical security-related purposes, but advocacy groups and individuals have expressed apprehensions about their use. This report highlights the high-level privacy and bias implications of FRT systems. The authors propose a heuristic with two dimensions (consent status and comparison type) to help determine a proposed FRT's level of privacy and accuracy, and they identify red flags that could indicate privacy and bias concerns.
The objective of face recognition technologies (FRTs) is to efficiently detect and recognize people captured on camera. Although these technologies have many practical security-related purposes, advocacy groups and individuals have expressed apprehensions about their use. The research reported here was intended to highlight for policymakers the high-level privacy and bias implications of FRT systems.

In the report, the authors describe privacy as a person's ability to control information about themselves. Undesirable bias consists of the inaccurate representation of a group of people based on characteristics, such as demographic attributes. Informed by a literature review, the authors propose a heuristic with two dimensions: consent status (with or without consent) and comparison type (one-to-one or some-to-many). This heuristic can help determine a proposed FRT's level of privacy and accuracy.

The authors then use more in-depth case studies to identify "red flags" that could indicate privacy and bias concerns:

- complex FRTs with unexpected or secondary use of personal or identifying information
- use cases in which the subject does not consent to image capture
- lack of accessible redress when errors occur in image matching
- poor training data that can perpetuate human bias
- human interpretation of results that can introduce bias and require additional storage of full-face images or video.

This report is based on an exploratory project and is not intended as a comprehensive introduction to privacy, bias, or FRTs. Future work in this area could include examinations of existing systems, reviews of their accuracy rates, and surveys of people's expectations of privacy in government use of FRTs.
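To make the two-dimensional heuristic concrete, the sketch below encodes consent status and comparison type as a simple lookup over the four possible combinations. This is an illustrative assumption, not code from the report: the concern labels and the example use cases in the comments are inferred from the abstract's framing (consensual one-to-one matching, such as unlocking one's own device, generally raises fewer concerns than nonconsensual some-to-many searching against a gallery).

```python
from enum import Enum


class Consent(Enum):
    """Consent status: whether the subject agreed to image capture and matching."""
    WITH_CONSENT = "with consent"
    WITHOUT_CONSENT = "without consent"


class Comparison(Enum):
    """Comparison type: verify one claimed identity, or search many identities."""
    ONE_TO_ONE = "one-to-one"
    SOME_TO_MANY = "some-to-many"


# Hypothetical concern levels for each cell of the heuristic. The ordering
# (consensual one-to-one lowest, nonconsensual some-to-many highest) follows
# the abstract's framing; the labels themselves are assumptions.
CONCERN = {
    (Consent.WITH_CONSENT, Comparison.ONE_TO_ONE): "lower",
    (Consent.WITH_CONSENT, Comparison.SOME_TO_MANY): "moderate",
    (Consent.WITHOUT_CONSENT, Comparison.ONE_TO_ONE): "elevated",
    (Consent.WITHOUT_CONSENT, Comparison.SOME_TO_MANY): "higher",
}


def screen(consent: Consent, comparison: Comparison) -> str:
    """Return the heuristic concern level for a proposed FRT use case."""
    return CONCERN[(consent, comparison)]


if __name__ == "__main__":
    # Consensual one-to-one check, e.g., unlocking one's own phone.
    print(screen(Consent.WITH_CONSENT, Comparison.ONE_TO_ONE))       # lower
    # Nonconsensual some-to-many search, e.g., scanning a crowd feed.
    print(screen(Consent.WITHOUT_CONSENT, Comparison.SOME_TO_MANY))  # higher
```

An explicit lookup table, rather than a scoring formula, keeps each cell of the heuristic visible and easy to revise as policy guidance evolves.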
This independent research was conducted using internal funding generated from operations of the Homeland Security Research Division (HSRD) and within the HSRD Acquisition and Development Program.