[Asia Economy Reporter Hwang Sumi] "Predicting crimes before they happen to protect citizens."
So goes the premise of 'PreCrime', the cutting-edge security system in the science fiction (SF) movie Minority Report. In the film, PreCrime predicts the time and place of crimes, as well as who will commit them, and on that basis a special police unit arrests the future criminals. Recently, such a system appears to be edging toward reality.
According to the science magazine New Scientist on the 2nd, a recent study showed that artificial intelligence (AI) predicted with high accuracy where and what crimes would occur in the future based on past crime rate data.
According to the report, a research team led by Professor Ishanu Chattopadhyay at the University of Chicago trained an AI model of its own design on regional crime data for Chicago, Illinois, from 2014 to 2016, then had it predict crime rates for the period immediately following the training window.
The AI's predictions were about 90% accurate. The team divided Chicago into blocks roughly 300 meters on a side, and the AI predicted, a week in advance and with near precision, in which blocks crimes such as murder and robbery would occur. Similar accuracy was observed in seven other major U.S. cities.
The research team suggested that police could use the model to issue area-level caution alerts. "Security resources are not unlimited, so they need to be allocated as effectively as possible," Professor Chattopadhyay said. "It would be helpful to know in advance where a murder might occur."
However, concerns have been raised that racial bias could seep into the AI's crime predictions, since the crime data the AI learns from is produced by humans. In the U.S., some police forces disproportionately patrol neighborhoods inhabited by Black or minority populations, or punish Black offenders more harshly, and such patterns can contaminate the AI's judgments. In fact, one AI-generated list classified 56% of Black men aged 20-29 in Chicago as potential criminals.
Professor Chattopadhyay responded that the study took steps to keep racial bias from influencing the results, adding that the AI does not identify suspects but only flags areas at high risk of crime.
A scene from the SF movie 'Minority Report'. [Photo: still from 'Minority Report']
Meanwhile, similar technology is reportedly being introduced in China. According to the New York Times (NYT) on the 25th of last month (local time), the Chinese government is using advanced predictive technology to monitor residents extensively. Surveillance cameras track the daily lives of individuals deemed likely to cause trouble, and the police are notified automatically when unusual or suspicious behavior is detected.
In one 2020 case, authorities in southern China denied a woman's request to move to Hong Kong, where her husband lived, because a program had flagged the couple's marriage as suspicious: the two spent little time together, not even the Lunar New Year holiday. Police concluded the woman had entered a sham marriage to obtain a migration permit. In another case, police caught a man involved in a pyramid scheme after a surveillance program detected him entering homes with different companions each time.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.