"Autopilot" Accident on Unsuitable Roads
"Tesla Did Not Implement Usage Restrictions"
"Regulatory Authorities' Oversight Also Lax"
Amid ongoing safety controversies surrounding electric carmaker Tesla's autonomous driving technology, reports have emerged that its 'Autopilot' driver assistance system has been involved in at least eight serious accidents while operating on roads with many unpredictable driving conditions.
On the 10th (local time), the Washington Post (WP) reported that its analysis of federal databases, legal records, and public agency documents confirmed that since 2016, at least eight traffic accidents have occurred while the Autopilot function was active in environments with many driving variables, such as rural roads.
According to WP, in a Tesla accident that occurred in Florida in 2019, the vehicle ignored road warning signals while driving autonomously on a rural road. Black box footage obtained by WP showed the vehicle passing a flashing yellow warning light indicating that the road was ending and a left or right turn was required, and continuing straight. The vehicle was driving autonomously at 70 miles per hour (113 km/h) when it struck a young couple standing by the roadside; one was seriously injured and the other was killed.
In March, a Tesla vehicle driving autonomously in North Carolina struck a teenage student who had just gotten off a school bus, traveling at 45 miles per hour (72 km/h) without slowing down. Additionally, in 2016, a Tesla vehicle driving autonomously in Florida failed to detect a truck crossing ahead and crashed under it.
Tesla’s user manual specifies that the Autopilot’s main feature, ‘Autosteer,’ is "intended for use on highways with a central divider, clearly marked lanes, no cross traffic, and controlled access." It also states that "autonomous driving functions may become unstable on hills or sharp curves." WP pointed out that "Tesla had the technology to restrict Autopilot use based on geographic characteristics but did not take definitive action."
Furthermore, WP noted that "federal regulatory authorities also failed to take necessary measures." It added that after the 2016 accident, the National Transportation Safety Board (NTSB) asked the National Highway Traffic Safety Administration (NHTSA) to limit the areas where Autopilot could be activated, but NHTSA never established safety standards, and the rift between the two agencies widened.
Experts criticized these autonomous driving accidents as examples of what can happen when rapidly advancing technology lacks government oversight.
WP reported that Tesla did not respond to requests for comment on this matter. However, through lawsuits and public statements related to Autopilot accidents, Tesla has denied responsibility, stating that "the ultimate responsibility for driving lies with the driver."
Meanwhile, on the 5th, a former Tesla employee sparked controversy by saying in an interview with the BBC that Tesla's autonomous driving technology is not safe enough for use on public roads. The U.S. Department of Justice is conducting a criminal investigation into Tesla's driver assistance features, and NHTSA is also investigating the safety of the Autopilot system following a series of Tesla accidents in which the function was engaged, the BBC reported.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.