
Racial Justice and Predictive Policing

About a decade ago, police were praising the development of data-driven predictive policing, with tools such as PredPol becoming popular with police departments. These tools use machine learning, and more recently deep learning with neural network approaches, to predict where and when crime is likely to occur, based on geocoded crime data coupled with other temporal GIS datasets. This helped police focus their efforts in flagged areas, and it generally resulted in more arrests in those areas where patrols increased. Although data do show a reduction in crime in some areas of cities such as Los Angeles, Chicago, and Santa Cruz, others have argued that these tools disproportionately target African American and minority neighborhoods. This has prompted a re-evaluation of prediction software that some researchers and activists say has racial bias built into it.
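
To make the mechanics concrete, the sketch below is a deliberately simplified, hypothetical example of grid-based crime prediction, not PredPol's proprietary algorithm. It assumes a file of geocoded, time-stamped incidents (an incidents.csv with projected x, y, and timestamp columns, a name used here purely for illustration), bins them into grid cells and weekly periods, and trains a generic classifier to flag cells likely to record an incident in the following week.

```python
# Hypothetical, simplified sketch of grid-based crime "hotspot" prediction.
# This is NOT PredPol's proprietary model; it only illustrates the general idea:
# bin geocoded, time-stamped incidents into grid cells and weekly time steps,
# then predict which cells will record an incident in the next observed week.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

CELL_SIZE = 500  # grid cell size in the data's projected units (meters assumed)

# Assumed input: one row per incident with projected x/y coordinates and a timestamp.
incidents = pd.read_csv("incidents.csv", parse_dates=["timestamp"])  # hypothetical file

# Bin incidents into grid cells and weekly time steps.
incidents["cell_x"] = (incidents["x"] // CELL_SIZE).astype(int)
incidents["cell_y"] = (incidents["y"] // CELL_SIZE).astype(int)
incidents["week"] = incidents["timestamp"].dt.to_period("W")

# Cell-by-week table of incident counts; weeks with no incidents in a cell become 0.
counts = (incidents.groupby(["cell_x", "cell_y", "week"]).size()
          .unstack("week", fill_value=0))

# Training pairs: features are the cell location plus its count in the preceding
# observed week; the label is whether the cell records any incident in the next one.
weeks = list(counts.columns)
X, y = [], []
for prev, nxt in zip(weeks[:-1], weeks[1:]):
    for (cx, cy), row in counts.iterrows():
        X.append([cx, cy, row[prev]])
        y.append(int(row[nxt] > 0))

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Rank cells for the upcoming week using the most recent week's counts; the top-ranked
# cells are the kind of "hotspots" that patrol allocation tools report.
latest = [[cx, cy, counts.at[(cx, cy), weeks[-1]]] for cx, cy in counts.index]
hotspot_scores = model.predict_proba(latest)[:, 1]
```

Note the core feedback problem even in this toy version: the model only sees recorded incidents, so cells that are patrolled more heavily generate more records and are then flagged for still more patrols.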


Following the police killing of George Floyd, social and political pressure has led police departments to reexamine their tactics, including their software tools. While cities such as Chicago and Los Angeles may have scaled back some of their use of predictive policing tools in response, Santa Cruz, a city where predictive policing was pioneered about a decade ago, has become the first city in the United States to outright ban such technology, on the grounds that these predictive tools generally direct greater policing toward African American and minority neighborhoods.[1] Short of an outright ban, Pittsburgh has also halted its use of predictive policing for now; Carnegie Mellon University had worked to develop algorithms that predicted where crimes might occur around Pittsburgh. One concern is that such tools are not always transparent about how areas requiring more policing are determined. In other words, there could be built-in bias in such tools that reinforces racial profiling or encourages its practice.[2] Mathematicians have also written recently in Notices of the American Mathematical Society, a widely read journal in the field, to encourage other mathematicians to stop working with police departments on developing predictive policing tools, noting that these tools have been shown to be biased and can reinforce racial profiling and bias in policing practices.[3]

Figure: Map showing the number of days with targeted policing for drug crimes in areas flagged by PredPol analysis of Oakland police data (a), and graphs quantifying targeted policing for drug crimes by race (b) and estimated drug use by race (c). Source: Lum, K., & Isaac, W. (2016). To predict and serve? Significance, 13(5), 14-19. https://doi.org/10.1111/j.1740-9713.2016.00960.x

Facial recognition software has also come under fire; it is often used alongside predictive policing to estimate areas where crime could occur. In an upcoming book from Springer, authors claim that their facial recognition approach can even predict whether a person is likely to be a criminal, potentially long before they commit any crime. Simply having someone’s picture could thus allow police to follow or profile individuals deemed likely to offend. Over 2,000 AI experts have expressed grave concerns about this upcoming publication, particularly because it advocates the use of facial recognition for predictive purposes and could accuse people of crimes before they are even committed, and because such tools are expected to carry a racial bias as well. The experts also point out that existing studies likely discount the results of this approach, calling into question not only the ethical considerations of the work but also its scientific merits.[4]

The recent reversal against predictive policing represents a remarkable turnaround for such algorithms, and it reopens the debate around the ethics of these tools. A cursory glance at academic publications and earlier news coverage generally shows a positive embrace of such technologies, but events since the killing of George Floyd have resulted in a reexamination of policing strategies, including the use of predictive policing. Does this mean predictive policing is going to disappear? Most likely not entirely, as many police departments will argue that these tools still deliver gains in reducing overall crime. What is likely to happen, however, is that these tools will be more closely inspected for bias, including whether crime prediction factors in race, and that policing will be viewed more holistically, with other factors such as income and racial inequality having to be addressed in order to reduce crime in a long-term and sustainable fashion.[5] Policing is likely to change in many areas as public and political pressure mounts, which will likely mean a reevaluation of how technology, including artificial intelligence, is used in policing efforts and its role in creating public good.
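
One concrete way such an inspection for bias could be carried out, in the spirit of the Lum and Isaac analysis shown in the figure above, is a simple exposure audit that compares how heavily each group's neighborhoods are flagged against an independent baseline. The sketch below is hypothetical: the input file, its column names (flagged_days_share, the population counts, est_offense_rate), and the tool outputs it audits are all assumptions for illustration, not a published methodology.

```python
# Hypothetical sketch of a simple bias audit in the spirit of Lum & Isaac (2016):
# compare how often each group's neighborhoods are flagged for extra patrols
# against an independent estimate of offense rates (e.g., survey-based drug-use data).
import pandas as pd

# Assumed input: one row per grid cell, with the share of days the prediction tool
# flagged the cell, population counts by group, and an independent offense estimate.
cells = pd.read_csv("cells_audit.csv")  # hypothetical file and column names
groups = ["pop_black", "pop_white", "pop_other"]

# Population-weighted exposure: flagged-days share experienced by an average
# member of each group, given where that group lives.
exposure = {g: (cells["flagged_days_share"] * cells[g]).sum() / cells[g].sum()
            for g in groups}

# Population-weighted independent baseline for each group.
baseline = {g: (cells["est_offense_rate"] * cells[g]).sum() / cells[g].sum()
            for g in groups}

for g in groups:
    ratio = exposure[g] / baseline[g] if baseline[g] else float("nan")
    print(f"{g}: exposure={exposure[g]:.3f}  baseline={baseline[g]:.3f}  ratio={ratio:.2f}")
# A group whose exposure-to-baseline ratio sits far above the others is being targeted
# more heavily than the independent baseline would justify -- one signal of built-in bias.
```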

References

[1]    For more on Santa Cruz’s banning of predictive policing tools, see: https://news.trust.org/item/20200617163319-tib7v/

[2]    For more on Pittsburgh and its suspension of predictive policing, see: https://thecrimereport.org/2020/06/24/bias-concerns-prompt-pittsburgh-to-halt-predictive-policing/

[3]    For more on the mathematicians calling on other mathematicians to stop working with police departments on predictive policing, see: https://www.salon.com/2020/06/24/mathematicians-urge-peers-to-stop-working-on-racist-predictive-policing-technology/


[4]    For more on facial recognition software that could be used to predict who becomes a criminal, see: https://techcrunch.com/2020/06/23/ai-crime-prediction-open-letter-springer/

[5]    For holistic policing practices some are calling for, see: https://www.marketplace.org/2020/06/10/community-policing-social-services-mental-health-drug-use-housing-law-enforcement/
