Governments and entities around the world are starting to venture deep into Minority Report-land, giving serious attention to the use of artificial intelligence and scoring databases to assign risk in a manner that reshapes the lives of ordinary people in profound, game-changing, and even slightly creepy ways.

We reported just last week on how the government in China is rolling out a “social credit” scoring system that central authorities are using to keep closer watch on the country’s 1.3 billion citizens and to limit the activities (like booking flights) of people deemed “untrustworthy” and assigned low scores under the system.
Over in Britain, meanwhile, police are running a pilot project to assess how effectively AI can predict whether someone is likely to commit a crime or become the victim of one. That’s according to New Scientist, which reports that the system being tested, called the National Data Analytics Solution (NDAS), draws on local and national police databases. The police lead for the project told the magazine it has already hoovered up more than a terabyte of data, including details of crimes that have been committed and records on about 5 million people.
“Looking at this data,” the magazine reported, “the software found nearly 1,400 indicators that could help predict crime, including around 30 that were particularly powerful. These included the number of crimes an individual had committed with the help of others and the number of crimes committed by people in that individual’s social group.”
New Scientist’s reporting goes on to note that people in the database are tagged by an algorithm that assigns each of them a risk score indicating how likely they are to commit a serious crime.
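To make the mechanics concrete, a system like this ultimately boils down to scoring individuals against weighted indicators and flagging anyone above a threshold. The sketch below is purely illustrative, not NDAS’s actual model: the indicator names, weights, and cut-off are invented for the example, echoing the two predictors New Scientist mentions.

```python
# Purely illustrative sketch of indicator-based risk scoring.
# The indicators, weights, and threshold below are invented for this
# example; they are NOT the real NDAS model or its actual predictors.

# Hypothetical weights for two indicators of the kind New Scientist
# describes: crimes committed with accomplices, and crimes committed
# by others in the person's social group.
WEIGHTS = {
    "crimes_with_others": 0.8,
    "crimes_in_social_group": 0.5,
}
FLAG_THRESHOLD = 5.0  # invented cut-off for flagging a record


def risk_score(record: dict) -> float:
    """Weighted sum over whichever indicators are present on the record."""
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)


# Two made-up records standing in for entries in a police database.
people = [
    {"id": "A", "crimes_with_others": 6, "crimes_in_social_group": 3},
    {"id": "B", "crimes_with_others": 1, "crimes_in_social_group": 0},
]

for person in people:
    score = risk_score(person)
    flagged = score >= FLAG_THRESHOLD
    print(f"{person['id']}: score={score:.1f} flagged={flagged}")
```

In a real deployment the weights would be learned from historical police data rather than set by hand, which is exactly where the fairness and bias concerns discussed below enter the picture: the model can only reflect the patterns, and the skews, in the records it was trained on.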
West Midlands Police is leading the project between now and March, by which point the force hopes to have a prototype system ready for actual use. Eight other police forces are also involved, and the idea is to eventually roll the system out to every force in the country.
For now, it seems the law enforcement officials involved don’t plan to make arrests based on this data before someone has committed a crime. Per the magazine, the idea is instead to provide something like “counseling to any individual with a history of mental health issues that had been flagged by NDAS as being likely to commit a violent crime. Potential victims could be contacted by social services.”
The magazine goes on to note that this is the first project of its kind in the world, pooling multiple data sets from a number of police forces for crime prediction. The intentions here may well be good, but it should go without saying that a project like this raises myriad ethical concerns. A Gizmodo piece summed them up effectively: “This system effectively is sending mental health professionals to people’s homes because an algorithm suggested that, in the future, there’s a chance they may commit or fall victim to a crime. To enact that type of intervention across an entire country paints a picture of an eerily intrusive future.”