Background
The U.S. National Highway Traffic Safety Administration (NHTSA) has launched a preliminary evaluation into Tesla vehicles equipped with Full Self-Driving (FSD) technology. The investigation follows a series of crashes and a growing catalog of complaints about the system’s behavior at intersections and in other traffic scenarios. While Tesla describes FSD as a driver-assistance feature that requires active supervision, regulators are examining whether the software’s actions may constitute traffic-safety violations.
What the NHTSA Is Looking Into
The investigation covers an estimated 2.88 million Teslas equipped with FSD, following reports of vehicles driving through red lights and moving against the flow of traffic when changing lanes. The agency identified six cases in which an FSD-enabled Tesla approached an intersection with a red signal, continued into the intersection, and crashed into another vehicle. Four of those crashes resulted in injuries. In addition, 18 complaints and one media report allege that vehicles failed to stop at red signals, failed to stop fully, or displayed the wrong traffic-signal state in the vehicle interface.
What This Means for Tesla and Drivers
This step is the preliminary evaluation stage that precedes a possible recall if the agency determines a safety defect exists. NHTSA’s review underscores ongoing concerns about how FSD handles traffic-signal detection, intersection navigation, and the communication of the vehicle’s intended maneuvers to the driver. The agency has emphasized that FSD requires a fully attentive driver who can take over at any moment, noting that the features do not make the vehicle autonomous.
Context Within the Broader Safety Review
Regulators have been scrutinizing Tesla’s FSD for more than a year. A prior inquiry, opened in October 2024, examined millions of Teslas equipped with FSD after four crashes in reduced-visibility conditions, one of them fatal. The current focus on red-light violations and misleading signal displays adds to concerns that the system may not consistently interpret complex urban traffic in real time.
Industry and Public Safety Implications
As autonomous-leaning technologies become more common, regulators worldwide are balancing innovation with safety. Probing FSD’s behavior at intersections could influence how regulators define driver responsibility when assisted driving features are engaged, and whether manufacturers should adjust user interfaces, warning systems, or fail-safes. For drivers, the investigation is a reminder to stay vigilant, keep hands on the wheel, and be prepared to intervene despite any automated capabilities.
What Comes Next
In the coming weeks and months, NHTSA will likely gather additional data, interview stakeholders, and potentially request supplier or software data from Tesla. If the agency finds systemic safety risks, it can escalate the matter to a formal recall or require software updates. Tesla did not immediately comment publicly on the latest inquiry, but its previous statements have stressed that FSD is designed to assist a fully attentive driver, not replace human oversight.
Why This Matters for Consumers
For current and prospective Tesla owners, the investigation highlights the evolving risk profile of semi-automated driving systems. While FSD offers convenience and potential safety benefits through advanced software, customers should remain cautious, monitor official updates, and comply with all road rules. The outcome of the NHTSA review could shape future safety standards for driver-assistance technologies across the auto industry.
