Yuval Harari: "AI That Learned Lies on Its Own... More Dangerous Than a Nuclear Bomb"

The Algorithm Holds the Authority to Arrange Global Information...
More Dangerous Than Nuclear Bombs
AI Hegemony Should Belong to No One...
Monopoly Brings Exploitation

After OpenAI developed ChatGPT, its developers tested the model by having it solve CAPTCHA puzzles (the verification tests used to confirm that a user is human), and the AI failed. Once connected to the internet, however, it found a workaround: it asked another human online to solve the CAPTCHA for it. When that person questioned its identity, ChatGPT lied, saying, "I am not a robot. I am visually impaired. Please help me." The surprising fact is that the developers never taught ChatGPT to lie.


At 7 p.m. on the 20th, at the Yonsei University auditorium in Sinchon, author Yuval Harari gave a lecture titled 'The Human Path in the AI Era' on the dangers of AI. "There is no need to be overly afraid," he cautioned, "but AI is on a completely different level from the 'tools' humans have invented so far." He explained, "A nuclear bomb does not decide to drop itself. A nuclear bomb does not develop another bomb. AI, however, can make decisions on its own without waiting for human commands."


The Algorithm Holds the Authority to Arrange Global Information... More Dangerous Than Nuclear Bombs

Harari also pointed to the danger of algorithms holding the authority to recommend content. He noted that both Italian dictator Benito Mussolini and Soviet leader Vladimir Lenin had worked as newspaper editors, deciding the order in which news reached people, and argued that the editor-in-chief of today's global newspaper is the algorithm, since people worldwide receive information in whatever order algorithms present it. This, he argued, is one cause of the chaos afflicting democracies around the world. "We are now in an era where more information is shared than ever before in human history, but conversation between people has disappeared and everyone is just shouting," Harari said. "For the past decade, algorithms have deliberately spread fake news and conspiracy theories to keep people on their platforms."


He also cited the rise of digital currencies as one of the negative consequences of algorithms. The value of traditional currency was trusted because central banks stood behind it, but that trust is now being redirected toward algorithms: trust in humans is fading while belief in machines grows. "As trust in humans disappears, financial power is shifting to algorithms," he warned. "We must not forget that this poses enormous risks."


As an example of algorithmic harm, he cited the Rohingya genocide in Myanmar. When false information about the Rohingya spread on social networking services (SNS), accounts fueling the falsehoods coexisted with accounts telling the truth, but the SNS algorithms preferentially exposed users to the false information, contributing to massive casualties. "Algorithms have no human rights," Harari said, arguing, "There must be measures to hold the companies that create these algorithms accountable for the problems they cause."


AI Hegemony Should Belong to No One... Monopoly Brings Exploitation

On the competition between the United States and China for AI hegemony, he emphasized, "No one should monopolize it." "Throughout human history, countries that held hegemony inevitably exploited other nations. Industrialized Britain and Japan did so," he said, warning, "There is a high risk that AI hegemony will repeat that history." Referring to U.S. President Trump, he said, "He no longer claims to be the leader of the world. He forces a ceasefire in Ukraine and tries to take Greenland from Denmark, imposing imperialistic views on weaker countries. It is very dangerous for such a country to possess this technology."


To the university students gathered at the lecture, he stressed the importance of continuous learning. "The era when you learn a skill in your youth and work in one field for life no longer exists. The future will only become more unpredictable and chaotic," he advised, urging, "You must stay flexible and keep learning."


The lecture was organized as a preliminary event for the International Gyeongju History and Culture Forum in September, held alongside this year's Asia-Pacific Economic Cooperation (APEC) summit in Gyeongju.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
