Overview of the Probe
The U.S. government has opened a preliminary evaluation into Tesla’s self-driving software following reports that the company’s autonomous vehicles violated traffic laws, including driving on the wrong side of the road and failing to stop at red lights. The investigation, led by the National Highway Traffic Safety Administration (NHTSA), spans an estimated 2.9 million cars equipped with Tesla’s Full Self-Driving (FSD) technology. The inquiry will assess the scope, frequency, and potential safety consequences of reports related to the FSD “Supervised” mode.
What is Full Self-Driving (Supervised)?
Tesla’s FSD feature is marketed as an advanced driver-assistance system. In the “Supervised” mode, the vehicle can perform lane changes and turns, but the driver must remain alert and ready to take control at any moment. Investigators want to understand how often these modes are used, how reliably the system handles complex traffic situations, and whether there are vulnerabilities that could put drivers and others at risk.
Reported Violations and Safety Implications
According to the NHTSA filing, there were 58 reports of traffic-law violations associated with Tesla vehicles. Six crashes involved situations where the car’s automation appeared to slow for a red light but proceeded through the intersection anyway, with four resulting in injuries. The agency noted that several incidents gave the driver little warning or little opportunity to intervene, raising concerns about how effectively drivers can perform the monitoring that “Supervised” mode requires of them.
Scope of the Investigation
The preliminary evaluation will consider whether these incidents indicate a broader safety defect or a need for design changes in the FSD system. As part of the process, NHTSA will examine the conditions surrounding each event, such as road layout, traffic signals, driver engagement, and potential software updates in effect at the time.
Tesla’s Response and Context
Tesla did not immediately provide public comment on the investigation. The automaker has repeatedly argued that its vehicles are equipped with advanced driver-assistance capabilities and that drivers must supervise and maintain attention. This latest probe adds to existing regulatory scrutiny surrounding Tesla’s door-lock mechanisms, which are also under NHTSA review following reports of occupants becoming trapped in some Model Y vehicles.
Broader Industry Implications
The probe signals ongoing regulatory caution as automakers push toward higher levels of automation. Regulators want to ensure that driver monitoring, system updates, and fail-safe mechanisms adequately protect road users. The outcome could influence how future software updates are rolled out, how driver engagement is quantified, and what kinds of testing are required before broader deployments are approved.
What’s Next?
NHTSA will determine whether the incidents warrant a formal safety recall or further investigations. In the meantime, Tesla owners using FSD are likely to see continued updates and communications from the company regarding how to operate in “Supervised” mode and any safety advisories issued by regulators.
Context in the Market
As Tesla expands its lineup and positions itself against other automakers and newer entrants, its autonomous features remain under intense scrutiny. The company recently introduced lower-priced variants of popular models as part of its strategy to broaden adoption, while political and business dynamics around leadership and corporate strategy continue to influence public perception and investor sentiment.