A Harder Question Than Legislation:
How Much Responsibility Are Platforms Taking?
About a month ago, Australia became the first country in the world to implement a law that bans anyone under the age of 16 from holding or using social networking service (SNS) accounts. Since then, similar legislative debates have spread around the globe. With a growing body of research showing that excessive SNS use harms teenagers' mental health, criticism has erupted in South Korea, with many asking, "Why do we have no regulations at all?" and directing their frustration at the government.
But is legislative restriction of SNS use really the key to solving this problem? A representative from an SNS platform whom I recently met expressed frustration that Korean society ignores the efforts of platforms already responding to teenage SNS overuse and focuses solely on whether Korea has strong bans like Australia's. It remains unproven that forced bans can actually curb excessive SNS use among teenagers, yet the expectation persists that the government will resolve the issue through compulsory measures.
Since the law blocking minors' SNS accounts took effect in Australia, about 4.7 million accounts belonging to users under 16 have been deleted or blocked across the 10 regulated social media platforms. At the same time, downloads of smaller SNS platforms outside the scope of the regulation have surged, and more teenagers are using accounts registered under their parents' names, providing false age information, or accessing platforms via VPNs. In effect, the strict rules have pushed young users toward smaller, less supervised platforms, a kind of "regulatory balloon effect." The regulation has failed to change the digital environment surrounding teenagers.
South Korea went through something similar with its gaming shutdown law, which barred teenagers from playing online games late at night. In practice, account sharing and workarounds through overseas servers became rampant, and the policy was effectively abolished after about a decade, leaving the lesson that forced restrictions do not work.
What our society needs now is not to block teenagers' access to SNS, but to encourage platforms to change so that young people can use them more healthily. Many countries are already moving away from legislative bans on teenage SNS use and are instead addressing the problem by redesigning platform structures.
In the United States, states such as New York now require SNS platforms to display warnings about mental health risks to teenagers, aiming to change platform designs built to be addictive. France, rather than banning teenage SNS use outright, prohibits those under 15 from creating accounts without explicit parental consent, distributing responsibility between families and platforms. The United Kingdom focuses on blocking harmful content by age group and strengthening safety-by-design standards; a notable step was the introduction, in July last year, of strict age verification, such as facial recognition and ID checks, for access to harmful content.
Mandating a ban on teenage SNS use risks undermining platforms' incentives to innovate and invest. It also weakens their motivation to pursue the technological and design improvements that make the environment safer. Platforms should strengthen voluntary measures, recognizing that a business built on teenage users suffering from SNS addiction cannot sustain growth. The priority for policymakers should be to set minimum standards, require transparency, and mandate the measurement and public disclosure of outcomes related to youth protection.