Tesla recalls 2 million vehicles in US to fix Autopilot driver monitoring

KEY HIGHLIGHTS
  • Tesla Recalls Over 2 Million U.S. Vehicles: Tesla recalls all vehicles in the U.S. (2M+) for a software update to address Autopilot’s flawed driver attention system.
  • NHTSA Investigation and Findings: A two-year NHTSA investigation reveals Autopilot’s insufficient driver attention verification, leading to foreseeable misuse and crashes.
  • Enhanced Warnings and Controls: The update includes increased driver alerts, limits on Autopilot areas, and additional checks to ensure sustained driving responsibility.
  • Mixed Reactions to Recall: Safety experts applaud the recall but criticize it for not resolving Teslas’ struggle to detect and avoid obstacles in their path.
  • Recalled Models and Software Update: The recall covers Tesla models Y, S, 3, and X, with the software update aiming to address shortcomings in Autopilot’s functionality.
  • Safety Concerns and Ongoing Monitoring: Despite the update, concerns persist about the underlying safety issues. NHTSA keeps the investigation open, monitoring Tesla’s remedies.
Tesla recalls all US vehicles after a 2-year investigation into Autopilot defects by NHTSA. (Image: AFP Pic)

Tesla Recalls Cars After NHTSA Findings

Tesla has issued a recall for over 2 million vehicles in the United States, aiming to address concerns related to the Autopilot system. This recall comes in response to a thorough two-year investigation conducted by the National Highway Traffic Safety Administration (NHTSA) following a series of accidents, some of which resulted in fatalities, involving Tesla vehicles utilizing the Autopilot feature.

The NHTSA’s findings highlighted a defect in the Autopilot’s design, specifically in its ability to ensure that drivers remain attentive while the system is engaged. The inadequacy of the current method to ensure driver attention was identified as a potential factor contributing to the misuse of the system, leading to accidents.

Tesla’s proposed solution involves a comprehensive software update, which will introduce heightened warnings and alerts for drivers. Additionally, the update will restrict the operational areas for basic Autopilot functions. This measure is intended to reinforce the driver’s continuous responsibility while using Autopilot and to mitigate the risks associated with potential misuse.

While some safety experts view the recall as a positive step, they also express concerns that it places the onus on the driver without directly addressing the underlying issue. Critics argue that Tesla’s automated systems still face challenges in effectively detecting and responding to obstacles in their path.

The recall encompasses various Tesla models, including the Y, S, 3, and X, manufactured between October 5, 2012, and December 7, 2023. The software update commenced distribution on Tuesday for certain affected vehicles, with the remainder scheduled to receive it at a later date.

Following the announcement of the recall, Tesla’s stock experienced initial volatility, dropping over 3% in early trading on Wednesday. However, the stock rebounded later in the day, closing with a 1% gain amidst a broader market rally.

For individuals like Dillon Angulo, who suffered serious injuries in a 2019 crash involving a Tesla on Autopilot, the recall appears to be insufficient. Angulo, currently in the process of recovering from brain trauma and broken bones, emphasizes the urgency for stronger governmental intervention, asserting that the technology is not safe and poses risks that need to be addressed promptly.

The Autopilot system under scrutiny consists of two main features: Autosteer and Traffic Aware Cruise Control. Autosteer is designed for use on limited access freeways, with an additional feature called Autosteer on City Streets for more advanced operations.

Autosteer Restrictions and Software Update

As part of the recall, the upcoming software update will impose restrictions on the usage of Autosteer. The system will now generate visual and audible alerts, notifying the driver of its unavailability if engagement conditions are not met. Autosteer will remain inactive in such instances.

The additional controls introduced by the update vary depending on a Tesla vehicle’s hardware. These controls include enhancing the visibility of visual alerts, simplifying the activation and deactivation process of Autosteer, and implementing extra checks to ensure that Autosteer is not used on non-controlled access roads and when approaching traffic control devices. The recall documents emphasize that drivers repeatedly failing to demonstrate “continuous and sustained driving responsibility” may face suspension from using Autosteer.

The recall process was initiated following meetings between agency investigators and Tesla starting in October. Although Tesla did not fully agree with the NHTSA’s analysis, the company opted for the recall on December 5 to resolve the investigation.

Advocates for auto safety have long called for more stringent regulations regarding driver monitoring systems, advocating for the inclusion of cameras to ensure that drivers remain attentive—a feature employed by other automakers with similar systems.

However, some experts express disappointment with the software update, considering it a compromise that fails to address critical issues. Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, points out that the update does not tackle the absence of night vision cameras to monitor drivers’ eyes and the ongoing challenge of Teslas failing to detect and respond to obstacles.

Koopman and Michael Brooks, the executive director of the nonprofit Center for Auto Safety, argue that the safety defect of crashing into emergency vehicles remains unaddressed. According to Brooks, the update does not answer the fundamental question of why Teslas on Autopilot struggle to detect and respond to emergency activity.

Koopman suggests that the NHTSA might have settled for the software change as the most feasible solution, weighing the benefits against the costs of prolonged negotiations with Tesla. The NHTSA, in a statement on Wednesday, emphasized that the investigation is still ongoing and that it will continue to monitor the effectiveness of Tesla’s remedies to ensure the highest level of safety.

Autopilot’s Limitations and Ongoing Concerns

The Autopilot system, despite its name, functions as a driver-assist system rather than a self-driving mechanism. While it can autonomously handle steering, acceleration, and braking within its lane, it requires the driver’s constant attention and intervention. Recent independent tests have revealed vulnerabilities in the monitoring system, making it susceptible to manipulation. Instances have been reported where drivers have been caught driving under the influence or even seated in the back seat while Autopilot was engaged.

Tesla acknowledged these concerns in its defect report submitted to the safety agency, stating that Autopilot’s controls “may not be sufficient to prevent driver misuse.” As of early Wednesday, there has been no additional comment from the Austin-based company.

On its official website, Tesla clarifies that both Autopilot and the more advanced Full Self Driving (FSD) system are designed to assist drivers, who must remain ready to take control at any moment. The FSD system is currently undergoing testing by Tesla owners on public roads.

In a statement posted on X (formerly Twitter) on Monday, Tesla emphasized that safety is enhanced when Autopilot is engaged, underscoring its role as a supplementary tool for drivers.

The NHTSA has been actively investigating 35 Tesla crashes since 2016 in which it suspects the vehicles were operating on an automated system. Tragically, at least 17 people have lost their lives in these crashes. The investigations are part of a broader NHTSA inquiry into numerous instances of Teslas equipped with Autopilot colliding with emergency vehicles. The NHTSA has taken a more assertive stance in addressing safety concerns with Tesla vehicles, including issuing a recall for the Full Self Driving software.

Transportation Secretary Pete Buttigieg, who oversees the department encompassing the NHTSA, expressed concerns in May about Tesla’s use of the term “Autopilot,” emphasizing that the system cannot operate as a self-driving technology. These developments underscore the growing scrutiny and regulatory focus on the safety aspects of Tesla’s Autopilot system.


Source(s): Al Jazeera; CNN; AP News

The information above is curated from reliable sources, modified for clarity. Slash Insider is not responsible for its completeness or accuracy. Please refer to the original source for the full article. Views expressed are solely those of the original authors and not necessarily of Slash Insider. We strive to deliver reliable articles but encourage readers to verify details independently.