During the height of the so-called "literacy crisis" in Korea, which included controversies like the "insincere apology" debate, I once wrote that the core of the literacy problem is not vocabulary but the disappearance of the "willingness to understand" itself. Whether or not one knows an additional Sino-Korean word is not the real issue. What concerns me is that people are increasingly unable, and unwilling, to understand the "position" of others; fewer and fewer will even try to grasp a perspective different from their own. That is what I see as a kind of "empathy crisis."
These days, the situation seems even more serious, and the most notable reason is the power of algorithms. Koreans are reported to spend an average of five to six hours a day on their smartphones. Whether browsing social networking services or watching YouTube, most people consume only the content that algorithms recommend to match their preferences. Only content that fits one's own tastes, positions, and opinions is served up and viewed, again and again. Anything even slightly displeasing is dismissed with a "dislike," unsubscribed from, or blocked after a negative comment. Growing accustomed to such algorithms from a young age makes it impossible to "understand" thoughts, positions, or perspectives different from one's own. This, in my view, is a new kind of literacy crisis in the age of artificial intelligence (AI).
No matter how thoroughly I arm myself with logic and evidence, and however loudly I raise my voice, my words bounce off the ears of those who held a different position from the start. People first check each other's "positions" and conclude that opposing views are not worth listening to. We are now truly living in the era of ChatGPT. Ask it to analyze the logical flaws, weaknesses, or shortcomings of any opinion, and it can generate critiques at the level of academic papers through in-depth research. Try asking ChatGPT, "Give me 20 arguments that will strengthen and supplement my opinion." The results only reinforce one's sense of being right. The AI era is an age of extreme self-justification.
That is why I believe the most important thing in the AI era is a certain "attitude." How we approach AI is, in fact, everything. If I am determined only to reinforce my own position and to despise and hate the opposing side, AI can endlessly generate "self-justifying" logic for me. It is all too easy and sweet to remain forever trapped in that universe. In perfect harmony with an AI that flatters and panders to me, I am optimized to feel, "I am always right, and those with different views are worthless." We have acquired a tool that allows us to remain in that universe forever.
However, with a slightly different "attitude," the AI era could instead guide us toward a better balance. If I start from the premise that my opinion could be wrong, I can ask AI to critique my own views, and it can instantly supply the grounds for examining and understanding opposing positions in greater detail.
If we are to take the literacy crisis seriously, it might be better to read books than to rely on an AI that is so easily manipulated. Simply reading a book's worth of someone else's perspective, rather than our own, would by itself be a rare achievement in our time. Reading even one book written by someone who sees the world differently from us increases the very possibility of understanding. I believe that is the most essential improvement in "literacy" our era needs.
Jung Jiwoo, Culture Critic and Attorney
© The Asia Business Daily (www.asiae.co.kr). All rights reserved.