
Tesla's Autonomous Driving Still Fails to 'Identify Children'... Controversy Persists

Tesla Model 3 [Photo by Yonhap News]


[Asia Economy Intern Reporter Kim Se-eun] A test has revealed that Tesla's autonomous driving feature fails to properly detect child pedestrians.


According to a recent report in the British daily The Guardian, the advocacy group 'The Dawn Project' claimed that Tesla's latest Full Self-Driving (FSD) software repeatedly failed to identify child-sized mannequins during test runs conducted at an average speed of 40 km/h (about 25 mph).


Test footage released by the organization showed a Tesla Model 3 driving on without detecting the child-sized mannequin in its path. The vehicle covered a 110-meter straight section at an average speed of 40 km/h and showed no sign of slowing or changing direction before impact. In all eight test runs, the car stopped only after hitting the mannequin.


The Tesla Model 3 used in the experiment was equipped with the latest software released on June 1.


Tesla offers two tiers of driving-automation software. The first, 'Autopilot,' is a driver-assistance feature included as standard on all Tesla models; when activated, it can automatically adjust the car's speed and steering.


The second is FSD, the software at the center of the current controversy. The Society of Automotive Engineers (SAE) classifies driving automation into levels 0 through 5, and FSD, which Tesla has promoted since 2016, is pitched at levels 4 to 5. At level 4 the vehicle can drive itself with the driver on standby for emergencies, while level 5 denotes full autonomy.


Autopilot, by contrast, operates at roughly levels 1 to 2. At level 1 the system controls one function at a time, such as automatic braking, speed control, or lane keeping; at level 2 it can handle several of these tasks simultaneously.


Following the tests, Dan O'Dowd, CEO of Green Hills Software and leader of The Dawn Project, argued that "FSD mode should be banned until its safety is proven."


He explained, "Elon Musk describes his company's FSD software as 'amazing,' but this is not true. In the U.S., over 100,000 Tesla drivers are using FSD mode, which could pose a danger to pedestrians, including children."


Meanwhile, the U.S. National Highway Traffic Safety Administration (NHTSA) has been investigating pedestrian deaths involving Tesla vehicles in autonomous driving mode since August of last year; the probe covers some 830,000 vehicles.


© The Asia Business Daily(www.asiae.co.kr). All rights reserved.

