Is It Okay for Platform Operators to Turn a Blind Eye?
YouTube's Guidelines Are Only Superficial... Creators Find Workarounds
Other Countries Have Regulations... Government and Courts Can Order Content Removal
Domestic Experts Also Say, "Sanctions Are Necessary"
Must fake news and false or manipulated information be left to circulate unchecked on social networking services (SNS) such as YouTube? While individual political YouTubers are part of the problem, there are also growing calls to hold accountable the platform operators who provide the distribution channels.
YouTube is a prime example. Although YouTube has established its own content moderation standards in its guidelines, it is hard to say they are actually enforced. The guidelines state plainly that "channels or accounts may be terminated if they repeatedly post malicious or hate-inciting videos or comments that amount to personal attacks." They also prohibit "links leading to content that promotes violent behavior by others" and "links to websites or applications that spread content causing serious harm, such as obstructing democratic processes."
There are even provisions for removing inappropriate content and imposing monetization penalties. One clause states, "If a video creator repeatedly encourages inappropriate behavior among viewers, or exposes individuals to the risk of physical harm in a region's social or political context, the content may be removed or other penalties imposed."
Notably, YouTube has already established an "Election Misinformation Policy" covering claims of election fraud. The guidelines state, "Content that advances false claims of widespread fraud, errors, or defects in certain past elections of government leaders, or that claims the results of those elections were falsified, must not be uploaded." The policy even names the elections it applies to, including the 2021 German federal election and the 2014, 2018, and 2022 Brazilian presidential elections, all cases in which election fraud claims were raised. In Germany, there were claims that manipulated ballots swayed the vote. In Brazil, conspiracy theories spread mainly among supporters of former President Jair Bolsonaro, and in January 2023 those supporters stormed and occupied the Brazilian Congress, Supreme Court, and presidential offices.
Yet extreme political content still circulates on YouTube, and doxxing of people in opposing political camps continues. Anticipating possible sanctions on YouTube's "Super Chat" donations, some YouTubers display bank account numbers directly in video subtitles to solicit money. Creators keep finding workarounds while the guidelines remain superficial.
On December 19, 2024, in front of the Seoul High Prosecutors' Office, lawyer Seok Dong-hyun, set to defend President Yoon Suk-yeol, met with reporters to state his position on the insurrection investigation and the impeachment trial. YouTubers live-streamed the scene on their phones as real-time comments scrolled across their screens. Photo by Heo Young-han
Above all, people exposed to such content have little way of knowing how or why they were exposed, or how to undo that exposure. On most SNS platforms, including YouTube, recommendation algorithms are treated as corporate trade secrets. Because more watch time and more content mean more profit, YouTube continues to keep its algorithm confidential.
Researchers, however, argue that YouTube's algorithm is already biased and that self-moderation has become impossible. The Institute for Strategic Dialogue (ISD), a UK-based anti-extremism think tank, reported, "Regardless of an account's age or interests, the algorithm ultimately recommends false or manipulated information and extreme or sensational content." It added, "Even when an account was set to the profile of a 14-year-old, YouTube recommended violent and sexual content."
Overseas, Platform Operators Are Held Accountable... Content Removal Can Be Ordered
Abroad, there have been moves to hold platform operators accountable through hearings and to pass laws regulating platforms. In France, after persistent fake news controversies during the 2017 presidential election, the Law Against the Manipulation of Information was enacted. Under the law, those who distribute false or manipulated information and fail to delete it face up to one year in prison and a fine of 75,000 euros (about 100 million KRW). The law applies during the three months before an election, and judges can order the removal of information. Platform operators are also required to disclose information transparently to safeguard the fairness of voting.
In Germany, which is sensitive to hate speech, the Network Enforcement Act requires platform operators to delete posts containing fake news, Holocaust denial, or hate speech within 24 hours of discovery. Content whose illegality is harder to judge must still be dealt with within a week once it is confirmed.
Singapore has enforced the Protection from Online Falsehoods and Manipulation Act since 2019. The government can order corrections or deletions, and global operators such as Google and Facebook face fines of up to 1 million Singapore dollars (about 1 billion KRW) if they fail to comply.
In the United States, controversy erupted recently when Meta CEO Mark Zuckerberg announced the end of the platform's fact-checking program. In 2020, the U.S. Senate Judiciary Committee had summoned Zuckerberg and then-Twitter CEO Jack Dorsey to answer for false and manipulated information circulating on their platforms during the election. Both promised safeguards and strong countermeasures, and Zuckerberg went on to implement fact-checking and hate speech policies on Facebook and Instagram. His recent reversal, announced after Donald Trump's election victory, drew criticism that "he ultimately bowed to Trump's demands."
Domestic Experts: Platforms Also Profit... Responsibility Must Be Imposed
Domestic experts, while differing slightly in approach, emphasized the need to impose responsibility on platform operators like YouTube.
Professor Choi Sang-bong of the Department of Journalism and Broadcasting at Sungkonghoe University said, "Freedom of expression must be respected, so expressing opinions is entirely acceptable. But when a fact is clearly established and provable yet continually denied, it can be sanctioned as clear fake news." He added, "Platform operators like YouTube must also be held accountable. They share profits with channel operators even when fake news circulates, so they have no incentive to impose sanctions. If the government steps in to regulate, backlash will only intensify; platform operators need to take the lead and strengthen self-regulation voluntarily."
Professor Lim Myung-ho of the Department of Psychology at Dankook University also stressed, "Extreme information flows without filtering, so sanctions are necessary. Platforms must actively work on filtering."
However, some argue that such sanctions are impossible from the start. Professor Nam Jae-il of the Department of Journalism and Broadcasting at Kyungpook National University said, "Unless an incident violates current law, it is practically impossible to regulate the videos uploaded to YouTube and similar platforms themselves. Ultimately, what matters is that citizens and the media build up their capabilities to form a healthy public opinion ecosystem."
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.