⑧ Jang Kang-myeong's Thoughts on Machines, Humans, and Stories
Novelist Jang Kang-myeong in conversation with Professor Kim Dae-sik and choreographer Kim Hye-yeon at a studio in Seocho-gu, Seoul, on the 1st.
Can machines learn humanity? Novelist Jang Kang-myeong believes that what we call humanity, that is, human frailty and the narratives woven from the experiences and insights born of it, is a uniquely human realm that machines cannot approach. Perhaps the paradox of this year's Nobel Prize list, otherwise dominated by AI, is that Han Kang's work shines so brightly precisely because of the painful narratives born of human frailty and the effort to confront them. That makes Jang Kang-myeong's story about humans and humanity, human language, and the coexistence of humans and AI all the more intriguing.
-Nowadays, thanks to technologies like the metaverse or Google Earth, we can experience worlds or eras we've never been to with just a few clicks. Yet, novels remain a powerful medium that allows us to experience not only the world we stand on but also new worlds. What do you think is the power of novels?
▲I understand every human being, myself included, as a single narrative. Even psychiatrists, when counseling patients, unfold the patient's life as a story during treatment. The process of advising, "This is going in the wrong direction; if you do this, it will get better," is ultimately storytelling. This is how humans everywhere understand the world. Humans instinctively knew the power of narrative long before the modern novel genre emerged. Narrative is not merely a list of facts. Even when mixed with fiction, it holds a power as strong as true stories.
Even in the era of epic poetry, poets used various techniques to convey emotion. With the invention of the modern novel, language became a powerful medium. New technologies like the metaverse and virtual reality have emerged, but they are limited in their ability to let humans grasp narratives through all the senses. For example, even if one takes in audiovisual elements through the metaverse, it is difficult to receive an entire narrative through that alone. Lacking elements like smell or touch, the metaverse experience inevitably feels different from reality.
Every human is a narrative machine; sensory information has its limits
Audiovisual information alone is not enough to express specific human emotions. Can the metaverse make someone feel 'the frustration of a dream being crushed' or 'the pain of heartbreak'? Simply providing sensory information has limits. Ultimately, to convey such emotions, a story is necessary. Only by providing the context of the story and the character's experience through a scenario can one feel those emotions. For example, if I want to convey in the metaverse the situation where 'a man I had a crush on ten years ago suddenly stopped contacting me, and I heard he died,' a story that sufficiently explains the surrounding context is needed. Without that, merely hearing the line "He died" would produce no emotional response.
Ultimately, narrative is an important element that provides context and emotion beyond simple information transmission. This is the power of language as a medium. No matter how advanced the metaverse or other technologies become, language and narrative will still be necessary to fully convey the complex experiences and emotions of humans. I believe this is why technology cannot completely replace the power of language.
-The development speed of language models like ChatGPT is remarkable. What impression do you have about this situation?
▲At first, I was truly amazed. There had been conversational AI programs before, but ChatGPT seemed to converse much more naturally. Especially when it appeared to understand the context of writing to some extent, I thought, 'Now AI is really entering the human creative domain.' Of course, it is still different from human-written text, but the gap is narrowing, which was shocking. Seeing this, I became concerned as a writer about the impact AI might have on my work.
Writing is not simply combining data; it involves conveying human emotions and experiences. I wondered whether AI could enter that realm. However, I do not think AI can have consciousness like humans. What seems like AI's 'consciousness' is merely an illusion. Even if AI had consciousness, I believe it would not understand human finiteness or experiences through the body. We are physically limited, feel pain, and experience the passage of time. These are important elements that form humanity, and AI would find it hard to empathize with them.
Another point is that future intelligent beings might not need to understand humanity. They would not comprehend the physical body or the human experiences we value, nor why they are important. We often assume 'humanity' is inherently important, but I do not think AI necessarily has to acquire it. To explain this, I recall a Haruki Murakami novel. In it, a being called 'Sheep' offers superhuman powers, but 'Rat' refuses and chooses human frailty. The frailty Rat speaks of includes small but precious experiences like a glass of beer in summer or the sound of insects at night. I love that frailty too. Compared to beings like AI or transhumans, I cannot say it is superior, but I want to preserve it. It is the essence of human life.
AI unlikely to have human-like consciousness; data learning struggles to grasp human insight
-Where do you think the differences or limitations between humans and AI originate?
▲AI fundamentally writes based on information learned from data. So I think it is difficult for AI to fully understand the emotions or deep insights humans gain through experience. Humans are born with finite time and understand the world through physically limited experiences. In that process, we feel joy, sadness, and pain, and we grow. These emotional and physical experiences are very important elements in writing. An AI or a machine might ask, "Why is that important? Why is it necessary?" But having lived in the human world, I feel what matters in those experiences. This may be too human-centered, even selfish, but I want to protect this world.
Questions about AI are often posed as "What if this happens?" Such questions treat the future as a given. But we also need to discuss how to create that future, and how our children should be educated if it comes. Because these questions proceed on the assumption that 'such a future will come,' I think we must first seriously consider what kind of future we want to create.
-If AI asks why human 'frailty' is important and necessary, how would you answer?
▲I would ask in return, "Why should a machine that asks such questions exist?" When I think about a desirable future for humanity, I hope no children starve or die of disease, and that wealth gaps shrink. The goal is to protect what we call human dignity. If technology is needed to achieve these goals, we can develop the appropriate technology. But the current logic of technology development runs backward. Developing technology first and then looking for ways to use it to help hungry children is the wrong approach. Technology developers tend to build technology mainly for defense or profit rather than for human needs; sometimes they develop technology just for fun. I believe the essence and goals of technology must be clarified so that it can advance in a direction better for humanity.
-You mentioned narrative and stories are important human modes of thinking. How far do you think AI can understand human narratives? What is your view on AI's ability to handle narratives?
▲I think AI's way of understanding narratives is ultimately based on analyzing data patterns. But narratives contain more than patterns. Humans understand the world through stories. From birth, we interpret the world through various experiences and emotions and create stories within it. Narratives carry human emotions, conflicts, and growth, but since AI does not directly experience these, its stories inevitably have limits in depth.
For example, AI can write novels, but whether those novels can deeply move readers is questionable. A story is not just a sequence of events; the emotions and inner changes humans feel within it are important. Even if AI understands story structure well, that does not necessarily lead to excellent storytelling. AI is merely a tool that analyzes and reconstructs data; I think it is difficult for it to convey human experiences by itself.
-Then, do you see AI more likely to coexist as a tool rather than completely replacing human creators? Given social and ethical issues arising from AI development, what should we protect when technology threatens humanity?
▲Exactly. AI will not be able to completely replace human creative activities. But it is quite possible that AI will play a supportive role in the creative process. Attempts are already underway to use AI to increase creative efficiency or generate new ideas. I also view positively the possibility of AI developing as a tool that assists human creativity.
AI may not directly provide creative inspiration like humans, but its ability to quickly analyze data and suggest various ideas will certainly be useful. Ideally, AI should remain a simple assistant and develop in a way that expands the emotional and creative abilities of human creators. Ultimately, the important thing is how AI is used as a tool by human creators.
AI can assist human creative activities; must protect 'humanity' amid human-technology coexistence
Humans are imperfect, and we understand and empathize with each other within that imperfection. Technology can enhance human abilities, but in the process our humanity must not disappear. Our limitations, and emotions like pain and joy, are part of why we live as humans. However advanced technology becomes, its purpose is ultimately to help us live better lives. If we lose our humanity along the way, the meaning of technological advancement fades. Technology must therefore develop while protecting human dignity, and striking that balance is very important. I believe our task is to protect human dignity and frailty even as technology advances.
-Regarding the creative process, how should we understand the keyword 'human frailty'?
▲I think human frailty is one of the important sources of creativity. We are all imperfect beings who feel pain and grow within it. The experiences and emotions gained in this process greatly influence creativity. For example, the feelings of frustration, hope, and love experienced by characters in my novels often come from my own experiences and emotions. Recognizing frailty and finding creative inspiration within it is important. Human creativity is not simply made by technique or logic but is a work formed by emotions and thoughts coming from deep inside us. As long as AI does not understand that part, the meaning of frailty in the human creative process will remain important.
Kim Dae-sik & Kim Hye-yeon's AHA Dialogue. On the 1st, novelist Jang Kang-myeong met with Professor Kim Dae-sik and choreographer Kim Hye-yeon at a studio in Seoul for a dialogue. Photo by Kim Hyun-min kimhyun81@
-You are preparing a nonfiction work about AI. What kind of story will it be?
▲In my view, there is no way to convey others' experiences except through language, that is, narrative. This makes me consider the role of narrative, which is not always positive. For example, apocalyptic narratives are effective in alerting people and prompting action. Humanity sometimes needs to be terrified. Works like George Orwell's 1984 show well the dangers of the combination of political power and surveillance technology.
Paradoxically, I think apocalyptic narratives play a powerful role, and I want to create a new one warning about the negative aspects of artificial intelligence. I want to address the theme of controlling that technology and show the various problems that arise, not just AI eliminating jobs but other consequences as well. Utopian narratives, for their part, can also play a bad role: stories depicting utopias have often led to hells in the real world. I want to show the strange changes that may occur when AI is introduced. For example, through the various incidents in the Go community after AI's arrival, I want to say that similar changes will happen in other fields. I am preparing a book called 'AlphaGo Part 2' containing such stories.
Only narrative conveys others' experiences; digital generation lacks 'leisure for reflection'
This problem is something we are already experiencing. Intellectuals of the past had afternoons for reflection; such leisure has disappeared today. The digital generation has grown up without ever experiencing that reflective time, which is regrettable. Frailty is an important element of humanity, and so is the attitude of accepting human decisions and their uncertainties.
Today, many people outsource decisions to others instead of thinking things through themselves. They consult friends about romantic problems, or post their situations on internet forums and decide based on the comments. This shows individuals entrusting decisions to others. As AI replaces human abilities, certain skills in our lives will become meaningless. We are already losing many abilities, and predicting which will disappear is difficult. Our mental capacities, too, are changing as AI develops.
Who is novelist Jang Kang-myeong?
He studied urban engineering at Yonsei University and worked as a journalist for over ten years in the political and industrial departments of the Dong-A Ilbo. His works include 'Because I Hate Korea,' 'Comment Army,' and 'Our Wish is War.' He is praised for boldly addressing the issues of modern society from a deep interest in technology and society, exploring in depth such topics as individual identity, social inequality, and the impact of technological development on humans.
Professor Kim Dae-sik, Department of Electrical Engineering and Computer Science, KAIST
Choreographer Kim Hye-yeon (CEO of Yeonist)
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
![[Kim Daesik & Kim Hyeyeon's AHA] Jang Gangmyeong "AI Will Not Understand Human Narratives and Humanity"](https://cphoto.asiae.co.kr/listimglink/1/2024100121030044921_1727784180.jpg)
![[Kim Daesik & Kim Hyeyeon's AHA] Jang Gangmyeong "AI Will Not Understand Human Narratives and Humanity"](https://cphoto.asiae.co.kr/listimglink/1/2024101808384763683_1729208326.jpg)