U.S. auto safety regulators asked Tesla Inc. why the electric-car maker did not issue a recall when it pushed software updates to its Autopilot driver-assistance system to improve the vehicles’ ability to detect emergency vehicles. The National Highway Traffic Safety Administration (NHTSA) opened a formal safety probe into Tesla’s Autopilot system in August, covering 765,000 U.S. vehicles, after a series of crashes involving Tesla models and emergency vehicles. To date, NHTSA has identified 12 crashes involving emergency vehicles and Tesla vehicles using advanced driver-assistance systems; it said most of the incidents took place after dark. NHTSA wants Tesla to disclose its “technical and/or legal basis for declining” to issue a recall.

In a separate letter, NHTSA asked Tesla about “Autosteer on City Streets,” which the company also refers to as “Full Self-Driving” (FSD) and released in beta in October 2020, and raised concerns about limits Tesla places on drivers’ ability to disclose safety issues. “Despite Tesla’s characterization of FSD as ‘beta,’ it is capable of and is being used on public roads,” NHTSA said. Some users have posted social media videos showing apparent issues with the FSD system. NHTSA wants Tesla to disclose its “criteria and timeline for allowing access to customers who have requested consideration in Tesla’s FSD Beta Request.”

NHTSA added it was aware of reports that participants in Tesla’s FSD early-access beta release program “have non-disclosure agreements that allegedly limit the participants from sharing information about FSD that portrays the feature negatively. Even limitations on sharing certain information publicly adversely impact NHTSA’s ability to obtain information relevant to safety.”

NHTSA said its actions “demonstrate its commitment to safety and its ongoing efforts to collect information necessary for the agency to fulfil its role in keeping everyone safe on the roadways, even as technology evolves. … We will act when we detect an unreasonable risk to public safety.”

NHTSA also asked about Tesla’s distribution last month of a software update to certain Tesla vehicles intended to improve detection of emergency vehicle lights in low-light conditions, and about its early-October release of the “Full Self-Driving Beta Request Menu option.”

NHTSA noted that the law says automakers must issue a recall “when they determine vehicles or equipment they produced contain defects related to motor vehicle safety or do not comply with an applicable motor vehicle safety standard.”

The safety regulator said the updates were intended to help detect flashing emergency vehicle lights in low-light conditions “and then responding to said detection with driver alerts and changes to the vehicle speed while Autopilot is engaged.”

Tesla must respond to NHTSA’s letter by Nov. 1. The company did not immediately comment. Its shares were up slightly in mid-morning trading on Wednesday.

In February, after NHTSA sought a recall, Tesla agreed to recall 135,000 vehicles whose touch-screen displays could fail and raise the risk of a crash; the agency warned that such a failure could result in the loss of rearview or backup camera images, exterior turn-signal lighting, and windshield defogging and defrosting systems.

Nikoleta Yanakieva, Editor at DevStyleR International