In a move straight out of the sci-fi anime series Psycho-Pass, police in the UK have been developing an artificial intelligence (AI) that will predict how likely a person is to commit or be the victim of violent crimes. New Scientist, a London-based publication, reported just last month that the police intend to flag individuals with the system and intervene before crimes happen by offering preemptive counseling.
The system, called the National Data Analytics Solution (NDAS), uses a combination of AI and statistics from local and national police databases. Ian Donnelly, the police lead on the project, said that they have collected over a terabyte’s worth of data from the early phases of the project, including logs of committed crimes and about 5 million identifiable people. Whoa.
The software found nearly 1,400 indicators in the data that could help predict crime. The number of crimes committed by people in an individual's social group turned out to be a strong factor in that person's likelihood of committing crime. Donnelly claims that the intent behind flagging individuals with a high risk indicator is not to arrest them before they commit crimes, but rather to provide them with support and counseling from local health or social workers.
The aim of NDAS is to channel limited police resources more effectively. Donnelly said that because police funding has been slashed in recent years, there is a need for a system that can help the police prioritize those who require interventions most urgently. The project has until the end of March 2019 to produce a functioning prototype, with the end goal being that every police force in the UK could eventually utilize it.
Although the police will work with the UK’s data watchdog, the Information Commissioner’s Office, to ensure that the NDAS meets privacy regulations, the project has already drawn criticism from a team of scientists at the Alan Turing Institute in London, who cite “serious ethical issues” with the foundation of the project and question whether it is in the public good for the police to intervene with individuals who have not yet committed crimes.
Other researchers have highlighted concerns that the system will reinforce pre-existing biases against poor communities and people of color. Another issue is that it could funnel resources toward areas where police already collect extensive data. Andrew Ferguson at the University of the District of Columbia said that arrests correlate with where police are deployed and are not representative of crime numbers overall. This tends to disproportionately affect groups that are already marginalized.
Around the world, police are increasingly using data to predict crime before it happens. PredPol, developed at Santa Clara University in California, is software that identifies future crime hot spots. Earlier this year, Human Rights Watch criticized the Chinese authorities for preemptively detaining people in the province of Xinjiang using their own form of predictive policing. The future predicted by Psycho-Pass may not be so far off after all.