
Legal software to predict crimes or enhance police bias?

New software can forecast criminal activity and, while it may not yet be accurate enough to lay charges in advance, it takes this area of law “to the next level”. Even so, it could also entrench existing problems like police bias and racism.

Naomi Neilson | 11 June 2020 | Big Law

Forecasting criminal activity, or “predictive policing”, is no longer based on intuition alone, according to UNSW Law Professor Lyria Bennett Moses. The models are designed to project historic data into the future, making the resulting predictions look quantifiable.

“Looking at crimes like burglary, one can create quite a useful predictive model, because some areas have higher rates of burglary than others and there are patterns. [However], it works really badly for kidnapping or domestic violence – in the latter case, because so much of the crime is unreported,” Professor Bennett Moses explained.

Unreported cases aside, predictive policing software is attractive to police management because it promises greater efficiency with fewer staff: it can reduce the need for in-house intelligence experts to predict crimes and limit spending on resources.


However, while Professor Bennett Moses said she expects the technology will eventually be adopted, it is not likely to be well implemented or to successfully reduce crime.

“What you do have is predictive models making probabilistic predictions, which can still be useful, but that’s not the sales pitch,” Professor Bennett Moses said. “There is a lack of demonstrated effectiveness [showing] that, when put into practice in a real-world department, there was any real impact on crime.”

Moreover, Professor Bennett Moses said it is more likely that police, made anxious by a prediction that a crime is about to happen in a particular spot, will respond with measures such as increased strip searches.

While most software simply identifies crime geographically, some takes a more targeted approach, identifying at-risk individuals by building profiles from historic data. A further issue lies in the feedback loops created when that historic data is already biased.

“If you go to a police database in Australia and look at offensive language crimes, it looks like it is only Indigenous people who swear, because there isn’t anyone else who gets charged for it,” Professor Bennett Moses explained. “So, you have a bias there to start with in the data. Any predictive system is going to be based on historic data, and then that feeds back into the system.”
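That feedback loop can be illustrated with a toy simulation (a minimal sketch in Python under assumed numbers, not the software discussed in this article): if patrols are allocated in proportion to previously recorded incidents, and offences are only recorded where patrols go, an initial disparity in the data compounds over time even when the underlying offending rates are identical.

    import random

    # Toy model of the feedback loop described above (illustrative only:
    # the districts, rates and counts here are assumptions, not real data).
    # Two districts have the SAME true rate of offending, but district A
    # starts with more recorded incidents due to historic over-policing.
    true_rate = {"A": 0.1, "B": 0.1}   # identical underlying offending
    recorded = {"A": 20, "B": 5}       # biased historic record
    patrols_per_day = 10

    random.seed(1)
    for day in range(200):
        total = recorded["A"] + recorded["B"]
        for district in ("A", "B"):
            # Patrols are allocated in proportion to past recorded incidents...
            patrols = round(patrols_per_day * recorded[district] / total)
            # ...and offences are only recorded where patrols are present,
            # so the prediction shapes the very data that will retrain it.
            for _ in range(patrols):
                if random.random() < true_rate[district]:
                    recorded[district] += 1

    # District A's recorded count pulls further ahead of B's, even though
    # the true offending rates were equal all along.
    print(recorded)

Under these assumptions the gap widens steadily, which is the “feeds back into the system” effect Professor Bennett Moses describes.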

Professor Bennett Moses said there also need to be significant improvements in oversight of the law enforcement agencies that might use this software.

“There’s nothing specific in the law that says the police can use software to make predictions, but there’s also no law saying they can’t. As for the idea of a program running in the background that takes in diverse data on us – the rules on data sharing vary jurisdiction by jurisdiction, and some don’t even have proper privacy legislation,” Professor Bennett Moses said.

“So, while there is a lot of mystique around it, I don’t think it’s understood as a fully implemented system. At worst, you have the capacity to create more problems.”
