"AI Algorithms with Opaque Decision Processes May Be Difficult to Subject to Judicial Control"

Professor Heo Seong-wook Proposes Judicial Review Standards for AI-Based Administrative Dispositions
Calls for Enhanced Transparency, Algorithm Verification, and Due Process Safeguards

#. Defendant Eric Loomis was charged with driving, without the owner's consent, a vehicle involved in a 2013 shooting incident, and with fleeing from the police. Citing a report produced by 'COMPAS', the prosecution sought a heavy sentence, arguing that the defendant, who had a prior sex-offense record, was highly likely to reoffend. The court accepted the recidivism assessment and sentenced him to six years in prison. Loomis appealed, claiming the ruling violated the principle of due process, but the Wisconsin Supreme Court rejected his challenge.

"AI Algorithms with Opaque Decision Processes May Be Difficult to Subject to Judicial Control" Photo to aid understanding of the above article. Pixabay


‘COMPAS,’ a recidivism prediction program developed by a private company in the United States, evaluates the risk of reoffending on a 10-level scale based on 137 factors, including criminal history, offender tendencies, and attitudes. In 2016, ProPublica, a U.S. nonprofit investigative journalism organization, reported that “COMPAS exhibits biased judgments by predicting higher recidivism risk for Black individuals than for White individuals,” sparking a full-fledged debate over how, and to what extent, AI technology should be used in the judicial field.


In Korea, although AI technology is not yet applied in the judicial field, its use in administrative dispositions is gradually increasing. A representative example is the Ministry of Food and Drug Safety’s ‘Electronic Review 24.’ This system automatically reviews import declaration documents based on the Special Act on Imported Food Safety Control and issues a confirmation certificate immediately if the documents are appropriate. The process, which previously took 48 hours, has been shortened to within 5 minutes, significantly reducing administrative costs.


As in the U.S. COMPAS case, how should a court rule when a party objects to or disputes an administrative disposition made by AI? Professor Heo Seong-wook of Seoul National University Law School expressed the opinion that “technical and legal standards are necessary, such as dataset verification, enhanced algorithm explainability, and transparency of results.”


Professor Heo published a report titled ‘A Study on Judicial Review Theory of Administrative Dispositions Based on Artificial Intelligence’ in December 2024, pointing out that “complex issues may arise when parties subject to administrative dispositions based on AI analysis raise objections or find it difficult to understand the operating principles.” A representative problem is AI's 'black box' phenomenon: when an AI system cannot provide a sufficient explanation for an administrative decision, the legitimacy of that decision may be undermined.


He also pointed out that AI-made dispositions raise due process concerns, since it is difficult to hear the opinions of interested parties. To address this, Professor Heo proposed institutional supplements: (1) establishing procedures to monitor and supervise the appropriateness of AI algorithms and input data; (2) granting interested parties affected by administrative acts the right to demand that accurate data be input into AI algorithms; and (3) establishing objection and rejection procedures.


Regarding judicial review standards, Professor Heo distinguished several scenarios. When an administrative agency uses AI on the basis of clear statutory authority in an area within its expertise, and the algorithm's structure and the disposition's results are transparently disclosed, courts should respect the agency's decision while reviewing for legal violations and predictability. When the statutory basis is ambiguous, or the algorithm's structure and results are opaque, courts should apply the principle of proportionality and examine rationality and acceptability just as they would for decisions made by human officials. When an agency uses AI in an area outside its expertise, courts should directly review the rationality and legality of the disposition.

He further emphasized that where an AI-based administrative disposition is punitive or restrictive and significantly affects citizens' fundamental rights (including dispositions equivalent to criminal punishment), it is desirable for a human to make the final judgment.


Reporter An Jaemyung, Legal Newspaper

※This article is based on content supplied by Law Times.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
