DETROIT — Tesla is recalling more than 2 million vehicles sold in the U.S. to update software and fix a flawed system that is supposed to ensure drivers are paying attention while using Autopilot.
According to documents released on Wednesday by U.S. safety regulators, the forthcoming update will enhance warnings and alerts to drivers, and it may also restrict the areas where basic versions of Autopilot can be utilized.
National Highway Traffic Safety Administration
The recall follows a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes that occurred while the Autopilot partially automated driving system was in use, some of which were fatal.
The agency’s investigation found that Autopilot’s method of making sure drivers are paying attention can be inadequate and can lead to foreseeable misuse of the system.
According to the documents, introducing additional controls and alerts reinforces drivers’ ongoing responsibility to adhere to safe driving practices.
Tesla Using Autopilot
The recall covers models Y, S, 3, and X manufactured from October 5, 2012, to December 7 of this year. The update was to be sent to certain affected vehicles on Tuesday, with the rest receiving it later.
During Wednesday’s trading session, Tesla’s shares initially declined by over 3%. However, they later rebounded amidst a widespread stock market rally, ending the day with a 1% increase.
Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla using Autopilot along a rural stretch of Florida highway where the software shouldn’t have been deployed, saw the attempts to fix the flaws in Autopilot as inadequate and belated.
Angulo, who is still recovering from injuries that included brain trauma and broken bones, expressed concerns about the safety of the technology.
He believes immediate action must be taken to address the issue, emphasizing the need for government intervention rather than continued experimentation.
Autopilot has two main features: Autosteer and Traffic-Aware Cruise Control. Autosteer is intended for use on limited-access freeways and operates separately from a more advanced feature, Autosteer on City Streets.
According to the recall documents, the software update will limit where Autosteer can be used. If a driver attempts to engage Autosteer when the required conditions are not met, a visual and audible alert will inform the driver that Autosteer is unavailable, and it will not engage.
Depending on a vehicle’s hardware, the added controls include making visual alerts more prominent, simplifying how Autosteer is engaged and disengaged, and adding checks on the use of Autosteer outside controlled-access roads and when approaching traffic-control devices.
Per the documents, drivers may be suspended from using Autosteer if they repeatedly fail to demonstrate continuous and sustained driving responsibility.
Based on recall documents, Tesla was informed by agency investigators in October about their “tentative conclusions” regarding the monitoring system fix.
While Tesla did not agree with NHTSA’s analysis, it agreed to the recall on December 5 to resolve the investigation.
Auto safety advocates have long urged stronger regulation of the driver monitoring system, which mainly detects whether a driver’s hands are on the steering wheel.
These advocates propose using cameras to ensure drivers are paying attention, a feature already implemented by other automakers with comparable systems.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise. It does not fix the lack of night-vision cameras to monitor drivers’ eyes, he said, nor the problem of Teslas failing to detect and avoid obstacles.
Koopman said the compromise is disappointing because it does not address the problem that older cars lack adequate hardware for driver monitoring.
Koopman and Michael Brooks, the executive director of the nonprofit Center for Auto Safety, argue that the failure of Teslas on Autopilot to detect and respond to emergency vehicles is a safety defect that has not been adequately addressed.
According to Brooks, the fix does not address the core question of why these vehicles cannot detect and react to emergency activity.
According to Koopman, NHTSA likely determined the software modification was the maximum concession it could secure from the company.
The decision is based on the belief that the benefits of implementing the change immediately outweigh the potential costs associated with engaging in further negotiations with Tesla for another year.
The National Highway Traffic Safety Administration (NHTSA)
The Autopilot system can automatically steer, accelerate, and brake within its lane. Despite its name, however, it is a driver-assist system, not a fully autonomous one.
Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught driving drunk or even riding in the back seat.
In its defect report filed with the safety agency, Tesla said Autopilot’s controls may not be sufficient to prevent driver misuse. A message was left seeking further comment from the Austin, Texas-based company.
According to Tesla’s website, Autopilot and the more advanced Full Self-Driving system are designed to assist drivers, who must remain vigilant and ready to intervene at any moment. Full Self-Driving is being tested by Tesla owners on public roads.
Autopilot Engagement Statement
In a recent statement posted on X, formerly known as Twitter, Tesla emphasized that safety is stronger when Autopilot is engaged.
Since 2016, the National Highway Traffic Safety Administration (NHTSA) has investigated 35 crashes in which the agency suspects Teslas were operating on an automated system. At least 17 people have died in those crashes.
The investigations are part of a broader NHTSA probe into numerous incidents of Teslas on Autopilot colliding with emergency vehicles. The agency has stepped up its scrutiny of Tesla safety concerns, including a recall of the Full Self-Driving software.