[The Editors' Verdict] Homicide Prediction System and Memories of the "Samcheong Education Camp"

Controversy Over Algorithm-Based Crime Prediction in the UK
Concerns Over Data Misuse and Discrimination Against Minorities and the Poor
Shadows of Past State Violence Resurface

Is this a movie plot, or fake news? At first, it was hard to believe. A system that predicts criminals? It is as if Philip K. Dick's science-fiction short story "The Minority Report" has become reality. Recently, news emerged that the UK government is developing a "homicide prediction system" that uses algorithms to identify people deemed highly likely to commit murder. The fact came to light through a freedom-of-information request by Statewatch, a civil liberties group that monitors government surveillance.


According to the UK newspaper The Guardian, the project was planned during the tenure of former Prime Minister Rishi Sunak (2022–2024). It attempts to identify potential criminals from police records and probation data, and is an upgraded version of the Offender Assessment System (OASys) already used in the UK criminal justice system. Initially called the "Homicide Prediction Project," it has since been renamed "sharing data to improve risk assessment."


Up to 500,000 data entries will be used to develop the tool. The analysis includes basic information such as name, date of birth, gender, and race, as well as sensitive personal data like mental health, history of self-harm, and domestic violence. Statewatch claims that the data even includes people with no criminal records, such as victims, witnesses, and missing persons, but the government denies this, stating that only data from individuals with at least one conviction is used.


Statewatch criticizes the project as likely to reinforce the structural discrimination that ethnic minorities and the poor already face in the criminal justice system, describing it as nothing short of "dystopian." The UK Ministry of Justice, for its part, has sought to downplay the controversy, stating that "this project is being conducted for research purposes only, and a report will be published at an appropriate time."


Algorithm-based crime prediction systems are presented with the positive goal of preventing crime, but their ethical problems and technical limitations are clear. First, there is the question of accuracy. Algorithms find patterns in past data, and their outputs are only probabilities; a wrong prediction can harm an innocent person. Moreover, if the training data contains biases, certain groups may be unfairly classified as high-risk. This undermines the presumption of innocence and fundamental human rights.


At the same time, there is a sense of déjà vu. Do you remember the Samcheong Education Camp under the Chun Doo-hwan dictatorship in the 1980s? It was a representative case of state violence: under the pretext of "social purification," citizens deemed "likely to commit crimes" were rounded up and subjected to forced labor and military-style training. Although the camp was justified as a crackdown on gangsters, about 60,000 people, including ordinary citizens, were detained without due process. Many of them were innocent.


The homicide prediction system and the Samcheong Education Camp are alike in that both claim to strengthen public safety. What has changed is that the authority to identify and control "potential threats" has shifted from humans loyal to the ruling power to algorithms. Technological advances open new possibilities, but they also carry the risk of reproducing past methods of violent control in more sophisticated forms. Can we be sure such attempts will not occur in countries with far stricter government control than the UK? It is a moment that demands constant public vigilance and debate.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
