Tesla’s Self-Driving System Slammed After Fatal Crashes

The U.S. government launched an investigation on Friday, October 18, 2024, into Tesla’s “Full Self-Driving” (FSD) system following reports of crashes in low-visibility conditions, including one that resulted in a pedestrian fatality. The National Highway Traffic Safety Administration (NHTSA) is examining the system’s ability to detect and respond to reduced roadway visibility.

The investigation covers approximately 2.4 million Tesla vehicles from model years 2016 through 2024. The agency opened the probe after Tesla reported four crashes in which vehicles entered areas of compromised visibility caused by sun glare, fog, or airborne dust. One of those incidents, in Rimrock, Arizona, in November 2023, ended in a fatal collision.

In the November 2023 Arizona crash, a 2021 Tesla Model Y struck a Toyota 4Runner and then hit a pedestrian who had gotten out of a vehicle to help direct traffic. Sun glare was identified as a contributing factor in both the initial collision and the subsequent pedestrian impact. The Tesla driver was not charged in the incident.

NHTSA’s probe will assess whether similar crashes involving the FSD system have occurred in low-visibility conditions. The agency will also seek information from Tesla regarding any updates that may have affected the system’s performance in these situations.

This is not the first time Tesla’s self-driving technology has faced regulatory scrutiny. The company has previously recalled its FSD system twice under pressure from NHTSA. In one instance, the system was found to be programmed to run stop signs at low speeds and disregard other traffic laws. These issues were addressed through software updates.

Critics argue that relying on cameras alone for hazard detection, without additional sensors such as radar and lidar, may be insufficient for true self-driving capability. NHTSA’s current investigation marks a shift in focus: the agency is now examining what the FSD system can actually do, rather than solely whether drivers remain attentive while using it.

Tesla CEO Elon Musk has been vocal about the company’s self-driving ambitions. Despite the ongoing investigation, Musk recently announced plans for fully autonomous robotaxis without steering wheels or pedals, slated for deployment in Texas and California starting in 2026. The company also aims to have cars operating autonomously, without human drivers, as early as next year.

However, these plans may face regulatory hurdles. NHTSA approval would be required for vehicles without traditional controls, and such approval is unlikely to be granted while the investigation is ongoing. Additionally, because there are currently no federal regulations explicitly addressing autonomous vehicles, oversight falls largely to the states, and state rules may limit Tesla’s ability to deploy autonomous features in its existing models.

NHTSA’s separate investigation into Tesla’s Autopilot system, which ran for three years, identified at least 13 fatal crashes involving Tesla vehicles with Autopilot engaged. The agency found that Tesla’s claims about Autopilot’s capabilities did not align with reality and that the “weak driver engagement system” was inappropriate for the system’s “permissive operating capabilities,” resulting in a “critical safety gap.”

“The ‘Full Self-Driving’ name may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation,” an NHTSA spokesperson stated.

The outcome of the NHTSA investigation, and its potential impact on Tesla’s self-driving ambitions, remains to be seen. As the automotive industry moves toward autonomous driving, regulators and manufacturers must balance public safety against the push for technological innovation.
