US Regulators Launch Preliminary Probe Into Tesla FSD Crashes
U.S. safety regulators have opened a preliminary evaluation of Tesla vehicles equipped with Full Self-Driving (FSD) technology after a series of traffic incidents. The National Highway Traffic Safety Administration (NHTSA) said the system, which requires driver supervision and a readiness to intervene, has at times “induced vehicle behavior that violated traffic safety laws.”
The inquiry marks a formal step that could lead to a recall if regulators determine there is a safety risk. The evaluation covers roughly 2.88 million Tesla vehicles; the agency cited reports of FSD-equipped cars driving through red lights or moving against the proper direction of travel during lane changes. In six instances, Teslas with FSD allegedly approached an intersection on a red signal, continued into the intersection, and collided with other vehicles. Four of those crashes resulted in injuries.
NHTSA emphasized that its assessment is ongoing and that the preliminary evaluation will determine whether a more extensive investigation is warranted. If the agency finds a safety defect, or a substantial probability that one exists, it could order a recall to address the problem.
Context: What FSD Is and How It Is Regulated
Tesla’s Full Self-Driving system is described by the company as a driver-assistance feature, not a fully autonomous system. Tesla states that FSD is intended for use with a fully attentive driver who keeps hands on the wheel and can take over at any moment. While the system is marketed as capable of improving over time, the company cautions that the enabled features do not make the vehicle autonomous. Regulators, however, are tasked with evaluating whether such capabilities meet safety standards and whether they inadvertently encourage risky driving behavior.
The current investigation follows an ongoing NHTSA review that began in October 2024, centered on 2.4 million Teslas equipped with FSD, after several crashes in conditions of reduced visibility such as sun glare, fog, or airborne dust. A fatal crash in 2023 intensified scrutiny and sharpened the regulator’s focus on FSD performance across varying weather and lighting scenarios.
What the Investigation Means for Tesla and Drivers
For Tesla, the investigation could result in a recall or other remedies if regulators determine FSD contributes to traffic safety risks. For drivers, the case underscores the importance of remaining alert and ready to intervene even when advanced driver-assistance features are active. NHTSA’s disclosures show that concerns about system limitations, such as failing to stop at red lights or misreading traffic signal states, remain central to its safety evaluations.
Industry observers say the probe could influence how regulators evaluate evolving autonomous systems and how automakers calibrate warnings, interlocks, and driver oversight. It also raises questions about public understanding of FSD’s capabilities, how clearly the user interface communicates ongoing driver responsibility, and how such features should be marketed as they approach higher levels of automation.
Next Steps in the Safety Process
At this stage, NHTSA has opened a preliminary evaluation to determine the scope of the potential issues. If initial findings point to a safety defect or substantial risk, the agency could upgrade the probe to an engineering analysis, a more formal defect investigation that may culminate in a recall. Tesla has not provided a comment to Reuters, and the company’s position on FSD’s safety metrics and driver alerts remains a focal point for investors and customers alike.
As the dialogue among regulators, automakers, and the driving public continues, drivers should stay informed about updates, system limitations, and recommended operating practices for FSD-enabled Teslas. The case will likely shape debate over how far driver-assistance systems should be allowed to rely on automated traffic-signal detection and automated maneuvering before mandatory safeguards are required.