Law enforcement has been transformed drastically by advances in technology. Law enforcement bodies around the world have adopted facial recognition capabilities powered by artificial intelligence, contending that facial recognition technology is an effective tool for preventing, disrupting, investigating, and responding to crime. As the practice has grown, so have criticisms of its use and of policing outcomes. These criticisms relate to the violation of civil liberties, namely the technology's potential for abuse, its propensity for inaccuracy, and its improper use. To assess the validity of these criticisms, this paper examines the link between facial recognition technology and racial bias through an analysis of existing research and a case study of an American municipality that has banned police use of facial recognition technology. Studies to date demonstrate a propensity for algorithms to mirror the biases of the datasets on which they are trained, including racial and gender biases; higher rates of match inaccuracy were consistently observed in relation to black persons, particularly black females. In addition to academic research, multiple examples of misidentifications of black citizens in the United States, along with related commentary from human rights and civil liberties groups, suggest that these concerns are translating into real-world injustices. This paper validates concerns with the use of facial recognition technology for law enforcement purposes in the absence of adequate governance mechanisms.
Published in | American Journal of Artificial Intelligence (Volume 7, Issue 1)
DOI | 10.11648/j.ajai.20230701.13
Page(s) | 17-23
Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.
Copyright | Copyright © The Author(s), 2023. Published by Science Publishing Group
Keywords | Artificial Intelligence, Policing, Facial Recognition, Crime, Law Enforcement, Justice, Race
[1] | Bajkowski, Julian, ‘Human Rights Commission wants moratorium on expanding facial recognition’, itnews (online, 17 December 2019).
[2] | Brown, Stephen Rex, ‘A $50 crack sale in Florida could yield significant ruling on facial recognition’, NY Daily News (online, 22 March 2018).
[3] | Brown, Stephen Rex, ‘NYPD claws back documents on facial recognition it accidentally disclosed to privacy researchers’, NY Daily News (online, 14 April 2019).
[4] | Chan, Janet, ‘The technological game: How information technology is transforming police practice’ (2001) 1 (2) Criminal Justice 139.
[5] | Crehan, A Corbo, ‘“Appropriate” Police Discretion and Indigenous Over-Representation in the Criminal Justice System’ (2010) 11 (1) Australian Journal of Professional and Applied Ethics 1.
[6] | Conger, Kate, Fausset, Richard and Kovaleski, Serge F, ‘San Francisco Bans Facial Recognition Technology’, The New York Times (online, 14 May 2019).
[7] | Conarck, Ben, ‘Court denies facial recognition evidence appeal’, The Florida Times-Union (online, 23 January 2019).
[8] | Ericson, Richard V and Haggerty, Kevin D, Policing the Risk Society (University of Toronto Press, 1997).
[9] | Fagan, Jeffrey, Braga, Anthony A, Brunson, Rod K and Pattavina, April, ‘Stops and Stares: Street Stops, Surveillance, and Race in the New Policing’ (2016) 43 Fordham Urban Law Journal 539.
[10] | Ferguson, Andrew, The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (NYU Press, 2017).
[11] | Ford, Trenton W, ‘It’s time to address facial recognition, the most troubling law enforcement AI tool’, Bulletin of the Atomic Scientists (online, 10 November 2021).
[12] | Garvie, Clare and Frankle, Jonathan, ‘Facial-Recognition Software Might Have a Racial Bias Problem’, The Atlantic (online, 8 April 2016).
[13] | General, John and Sarlin, Jon, ‘A false facial recognition match sent this innocent Black man to jail’, CNN (online, 29 April 2021).
[14] | Grother, Patrick, Ngan, Mei and Hanaoka, Kayee, ‘Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects’, National Institute of Standards and Technology (Report, 19 December 2019).
[15] | Hamilton, Brent and Berry, Kate, ‘Portland Becomes First Jurisdiction to Ban Certain Uses of Facial Recognition by Private Businesses’, Davis Wright Tremaine LLP (Web Page, 21 January 2021).
[16] | Hendry, Justin, ‘Human Rights Commission calls for temporary ban on “high-risk” govt facial recognition’, itnews (online, 28 May 2021).
[17] | Hill, Kashmir, ‘Wrongfully Accused by an Algorithm’, The New York Times (online, 24 June 2020).
[18] | Kidd, Jessica, ‘NSW pubs and clubs to install facial recognition technology to help stop self-excluded gamblers’, ABC News (online, 19 October 2022).
[19] | Klare, Brendan, Burge, Mark J, Klontz, Joshua C, Vorder Bruegge, Richard W and Jain, Anil K, ‘Face Recognition Performance: Role of Demographic Information’ (2012) 7 (6) IEEE Transactions on Information Forensics and Security 1789.
[20] | Lundman, Richard J and Kaufman, Robert L, ‘Driving While Black: Effects of Race, Ethnicity, and Gender on Citizen Self-Reports of Traffic Stops and Police Actions’ (2003) 41 (1) Criminology 195.
[21] | Mitchell, Ojmarrh and Caudy, Michael, ‘Examining Racial Disparities in Drug Arrests’ (2015) 32 (2) Justice Quarterly 288.
[22] | Nieva, Richard, ‘Google vows not to sell its facial recognition technology for now’, CNET (online, 13 December 2018).
[23] | Parmar, Divyarajsinh N and Mehta, Brijesh B, ‘Face Recognition Methods & Applications’ (2014) 4 (1) International Journal of Computer Applications in Technology 84.
[24] | Phillips, P Jonathon, Jiang, Fang, Narvekar, Abhijit, Ayyad, Julianne and O'Toole, Alice J, ‘An Other-Race Effect for Face Recognition Algorithms’ (2011) 8 (2) ACM Transactions on Applied Perception 1.
[25] | Phillips, P Jonathon and O’Toole, Alice J, ‘Comparison of human and computer performance across face recognition experiments’ (2014) 32 Image and Vision Computing 74.
[26] | Phillips, P Jonathon, Yates, Amy N, Hu, Ying and O’Toole, Alice J, ‘Face recognition accuracy of forensic examiners, superrecognizers, and face recognition algorithms’ (2018) 115 (24) Psychological and Cognitive Sciences 6171.
[27] | Rainie, Lee, Funk, Cary, Anderson, Monica and Tyson, Alec, ‘Public more likely to see facial recognition use by police as good, rather than bad for society’, Pew Research Center (online, 17 March 2022).
[28] | Roscigno, Vincent J and Preito-Hodge, Kayla, ‘Racist Cops, Vested “Blue” Interests, or Both? Evidence from Four Decades of the General Social Survey’ (2021) 7 Socius: Sociological Research for a Dynamic World 1.
[29] | Rossler, Michael, ‘The Impact of Police Technology Adoption on Social Control, Police Accountability, and Police Legitimacy’ in Cara Rabe-Hemp and Nancy Lind (eds), Political Authority, Social Control, and Public Policy (Emerald Publishing, 2019) 209.
[30] | Simmons, Amy, ‘“Over-policing to blame” for Indigenous prison rates’, ABC News (online, 25 June 2009).
[31] | Simonite, Tom, ‘When It Comes to Gorillas, Google Photos Remains Blind’, Wired (online, 11 January 2018).
[32] | Smith, Brad, ‘Facial recognition technology: The need for public regulation and corporate responsibility’, Microsoft On the Issues (Web Page, 13 July 2018).
[33] | Smith, Brad, ‘Facial recognition: It’s time for action’, Microsoft On the Issues (Web Page, 6 December 2018).
[34] | Smyton, Robin, ‘How Racial Segregation and Policing Intersect in America’, TuftsNow (online, 17 June 2020).
[35] | Stokes, Elaisha, ‘Wrongful arrest exposes racial bias in facial recognition technology’, CBS News (online, 19 November 2020).
[36] | Valentino-DeVries, Jennifer, ‘How the Police Use Facial Recognition, and Where It Falls Short’, The New York Times (online, 12 January 2020).
[37] | Wood, Matt, ‘Thoughts on Recent Research Paper and Associated Article on Amazon Rekognition’, Amazon Web Services (Web Page, 26 January 2019).
[38] | ‘Acquisition of Surveillance Technology’ (City Ordinance, 5 June 2019).
[39] | Buolamwini, Joy, ‘Response: Racial and Gender bias in Amazon Rekognition — Commercial AI System for Analyzing Faces’, Hackernoon (Web Page, 26 January 2019).
[40] | Buolamwini, Joy and Gebru, Timnit, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ (Conference Paper, Conference on Fairness, Accountability, and Transparency, 23 February 2018).
[41] | Buolamwini, Joy, Gebru, Timnit, Raynham, Helen, Raji, Deborah and Zuckerman, Ethan, ‘Gender Shades’, MIT Media Lab (Web Page).
[42] | Buolamwini, Joy and Raji, Inioluwa Deborah, ‘Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products’ (Conference Paper, AAAI/ACM Conference on AI, Ethics, and Society, January 2019).
[43] | Burton-Harris, Victoria and Mayor, Philip, ‘Wrongfully Arrested Because Face Recognition Can’t Tell Black People Apart’, ACLU (Web Page, 24 June 2020).
[44] | ‘Face Challenges’, National Institute of Standards and Technology (Web Page, 27 March 2019).
[45] | ‘Facial Recognition’, NSW Police Force (Web Page).
[46] | ‘Face Recognition Vendor Test (FRVT)’, National Institute of Standards and Technology (Web Page, 30 November 2020).
[47] | Phillips, P Jonathon, Flynn, Patrick J, Bowyer, Kevin W, Vorder Bruegge, Richard W, Grother, Patrick J, Quinn, George W and Pruitt, Matthew, ‘Distinguishing Identical Twins by Face Recognition’ (Conference Paper, International Conference on Automatic Face and Gesture Recognition, 21 March 2011).
[48] | ‘Woodrow Bledsoe Originates Automated Facial Recognition’, History of Information (Web Page).
APA Style
Pour, S. (2023). Police Use of Facial Recognition Technology and Racial Bias – An Assessment of Criticisms of Its Current Use. American Journal of Artificial Intelligence, 7(1), 17-23. https://doi.org/10.11648/j.ajai.20230701.13