The U.S. auto safety regulator on Friday opened an investigation into 2.4 million Tesla Inc. TSLA-Q vehicles equipped with the automaker’s Full Self-Driving (FSD) software after four reported collisions, including a 2023 fatal crash.
The National Highway Traffic Safety Administration’s (NHTSA) preliminary evaluation is the first step before the agency could seek a recall of the vehicles if it believes they pose an unreasonable risk to safety.
The new scrutiny of the advanced driver-assistance system comes as Tesla chief executive Elon Musk looks to shift Tesla’s focus to self-driving technology and robotaxis as it faces growing competition and weak demand in its auto business.
Last week, Mr. Musk unveiled Tesla’s two-seater, two-door Cybercab robotaxi concept. It comes without a steering wheel or pedals and would use cameras and artificial intelligence to navigate roads. Tesla would need NHTSA approval to deploy a vehicle without human controls.
NHTSA said it was opening the inquiry after four reports of crashes in which FSD was engaged during reduced roadway visibility conditions such as sun glare, fog or airborne dust. A pedestrian was killed in Rimrock, Ariz., in November, 2023, after being struck by a 2021 Tesla Model Y, NHTSA said. Another crash under investigation involved a reported injury.
The probe covers 2016-24 Model S and X vehicles with the optional system as well as 2017-24 Model 3, 2020-24 Model Y and 2023-24 Cybertruck vehicles.
The company did not respond to requests for comment. Its shares were up 0.1 per cent in early trading.
Tesla says on its website that its FSD software for on-road vehicles requires active driver supervision and does not make vehicles autonomous.
NHTSA is reviewing the ability of FSD’s engineering controls to “detect and respond appropriately to reduced roadway visibility conditions.”
The agency is asking whether other similar FSD crashes have occurred in reduced roadway visibility conditions, and whether Tesla has updated or modified the FSD system in a way that may affect its performance in such conditions.
The “review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the agency said.
Tesla’s FSD technology has been in development for years and aims for high automation, in which the vehicle can handle most driving tasks without human intervention.
Tesla in December recalled more than two million U.S. vehicles to install new safeguards in its Autopilot advanced driver-assistance system. NHTSA is still probing whether that recall is adequate to address concerns that drivers are not paying attention.
Tesla disclosed in October, 2023, that the U.S. Justice Department issued subpoenas related to its FSD and Autopilot systems. Reuters reported in October, 2022, that Tesla was under criminal investigation.
There have been at least two fatal accidents involving the FSD technology, including an incident in April in which a Tesla Model S in FSD mode hit and killed a 28-year-old motorcyclist in the Seattle area.
Some industry experts have said Tesla’s “camera-only” approach to partially and fully autonomous driving systems could cause problems in low-visibility conditions because the vehicles do not have a set of backup sensors.
“Weather conditions can impact the camera’s ability to see things and I think the regulatory environment will certainly weigh in on this,” said Jeff Schuster, vice-president at GlobalData.
“That could be one of the major roadblocks in what I would call a near-term launch of this technology and these products,” he said of Tesla’s robotaxis.
Tesla’s rivals that operate robotaxis rely on expensive sensors such as lidar and radar to detect driving environments.