An Extended Abstract for "On the Ethics of Predictive Policing Algorithms"
- Samuel Checkal
- Sep 2
Author’s Note:
This is an extended abstract for an Ethics Brief titled "On the Ethics of Predictive Policing Algorithms." The brief is under consideration for publication elsewhere and is therefore not available in full in The Rambler. If you are interested in reading the brief in its entirety, please contact scheck88@hotmail.com. This page will be updated once a publication venue is confirmed.
This Ethics Brief investigates Geolitica (formerly known as PredPol) and SoundThinking and their predictive policing algorithms (PPAs). It focuses on SoundThinking's trajectory toward becoming the "Google" of PPAs: taking over the space by acquiring competitors (e.g., Geolitica and Forensic Logic) and supplying police with ever-growing resources and data. We will explore the ethical effects of having our data and policing practices increasingly influenced, and potentially controlled, by a single private company.
Geolitica was originally developed as PredPol in 2011 by a team of UCLA researchers and the LAPD, with the goal of making crime prediction more data-driven and objective. It emerged as one of the first major commercial tools in the field. In general, PPAs analyze massive amounts of data, including the times, locations, and nature of past crimes, to predict where and when a crime could take place and to provide that insight to police. We will also explore how PPAs are being used in other countries, what kinds of data they track, and how that data is used to inform policing.
Police argue that these algorithms make policing more effective and faster by directing officers to the place of a potential crime at a particular time. Critics argue that the models promote racial profiling, because the data fed into them comes from existing police records, which disproportionately target certain minority groups. There are also concerns about the lack of transparency around how these systems operate and how their predictions are generated. Daniel Susser raises an interesting point about using this technology differently: putting it in the hands of social workers rather than police. He argues that this approach might create more productive change in communities than increased policing would.
Pushback has not stopped governments from implementing these systems, however. Some authoritarian regimes have openly declared their intention to use predictive policing to eliminate crime altogether. Investigating this technology raises urgent ethical questions about bias, accountability, and how far we are willing to go in the name of safety, especially as private companies like SoundThinking reshape law enforcement.
This brief applies a deontological framework, arguing that PPAs, when built on racially biased data, violate individuals' rights to fair and equal treatment under the law and undermine the moral duties that law enforcement owes to its communities. It also draws on virtue ethics to critique how these tools may cultivate institutional values of secrecy, control, and detachment rather than the virtues of justice, care, and humility that should underpin ethical policing.