Safety & Recalls · October 10, 2025 · 4 min read

NHTSA Investigates 2.9 Million Tesla Vehicles for FSD Traffic Violations

Nina Alvarez

Safety editor tracking recalls, crash tests and regulations. Drives a Volvo V90; keeps a few child seats for testing.


The U.S. National Highway Traffic Safety Administration (NHTSA) has opened an investigation into approximately 2.9 million Tesla vehicles, focusing on potential traffic-safety violations associated with the automaker's Full Self-Driving (FSD) system. The probe follows reports that vehicles operating with FSD engaged may have broken traffic laws, raising concerns about the safety of automated driving systems on public roads.

The NHTSA's investigation was prompted by six reports in which Tesla vehicles using FSD allegedly disregarded red traffic signals. In these incidents, the cars reportedly entered intersections against red lights, resulting in collisions, some of which caused significant injuries. The agency has also highlighted 18 complaints and a media report alleging that FSD failed to stop at red lights or to accurately depict traffic signals on the vehicle's display. A further point of contention is that the system gave drivers no warning of its intended actions, particularly when approaching red lights. Tesla has not yet commented on the investigation, but the company's recent FSD update, version 14.1, promises improvements in lane management and intersection handling that could mitigate similar issues in the future.

Despite the concerns surrounding automated systems like Tesla's FSD, the broader picture of road safety is more complicated. Human error remains a significant factor in traffic accidents: the NHTSA attributes 3,275 fatalities in 2023 to distracted driving alone, involving activities such as texting or adjusting controls while driving. That statistic underscores the potential of automated driving technologies, still maturing as they are, to improve road safety by reducing human error. At the same time, many traffic violations go unreported, suggesting that violations by both human drivers and automated systems may be more common than the documented cases indicate.

Tesla's response to the ongoing investigation could focus on the technical aspects of FSD that are under scrutiny. Because the cited incidents probably involved older versions of the software that have since been updated, Tesla may argue that the issues have already been addressed. FSD also remains classified as a supervised system, meaning it requires human oversight. That classification is crucial in legal and regulatory discussions, as it delineates the responsibilities of the driver versus those of the automaker in the event of an accident. The outcome of the investigation may shape future regulatory frameworks for autonomous vehicle technologies, affecting not only Tesla but the wider industry as it navigates the path toward fully autonomous vehicles.

The implications of the NHTSA's investigation extend beyond Tesla, potentially affecting the entire landscape of autonomous driving technology. A potential recall or regulatory action could set precedents for how such systems are monitored and evaluated for safety. This development is of particular interest to stakeholders in the automotive industry, including other manufacturers developing similar technologies. The investigation also raises questions about the balance between innovation and safety, as companies strive to bring cutting-edge technology to market while ensuring compliance with safety standards. The outcome may not only influence consumer trust but also shape future innovations in autonomous driving.

As the investigation unfolds, it is likely to fuel ongoing debates about the viability and safety of autonomous driving systems. Policymakers, industry leaders, and consumers alike will be watching closely to see how Tesla addresses these concerns and whether the improvements in FSD's latest version will prove effective. The resolution of this case could either bolster confidence in autonomous technologies or prompt more stringent regulations, potentially slowing the pace of innovation. Ultimately, the question remains whether autonomous systems can be designed to surpass human capabilities in ensuring road safety, a goal that continues to drive research and development in this rapidly evolving field.

#Tesla · #FSD · #NHTSA · #autonomous vehicles · #road safety

Reader Comments

7 comments

teslaFan99

Oct 10, 2025
wow, can't believe this! hope they figure it out soon. love my tesla! 🚗💨

SkepticalSam

Oct 10, 2025
More hype than substance, I bet.

ReflectiveRay

Oct 10, 2025
This investigation sheds light on the broader debate about autonomous systems and road safety. We need to examine the tech within the wider safety context.

TechGuy42

Oct 10, 2025
The FSD's potential for non-compliance with traffic laws is concerning. The NHTSA's probe should focus on the system's algorithmic parameters and sensor calibration.

CuriousCat235

Oct 10, 2025
Wow, I wonder how this will impact future updates! So interesting to see tech evolve!

balancedview

Oct 10, 2025
well this is interesting, but its important to remember that human errors are still a major cause of accidents so lets see what comes of this

DrivingDaisy

Oct 10, 2025
I had an issue with my Tesla FSD last year. It misjudged a stop sign, but after I updated the software, it has been fine. I hope these improvements are widespread.

More from Safety & Recalls

Tesla Cybertruck Barred from Germany Due to Safety Concerns
Safety & Recalls · Sep 30

The Tesla Cybertruck, known for its futuristic design and robust build, has hit a major roadblock in Germany. The U.S. Army Customs Agency has announced that the vehicle cannot be imported or registered in Germany or the wider European Union due to significant passive safety concerns.

By Nina Alvarez
Jeep Recalls 123,000 Wagoneers and Grand Wagoneers for Safety Concerns
Safety & Recalls · Sep 29

In a significant recall impacting over 123,000 vehicles, Jeep has identified a potential safety hazard involving the rear quarter-window trim on certain Wagoneer and Grand Wagoneer models. The recall affects vehicles manufactured between 2022 and 2024, with concerns that the trim may detach, posing risks to both drivers and other road users.

By Nina Alvarez
Toyota Supras Part of Big BMW Recall for Potential Engine Fires
Safety & Recalls · Sep 26

In a significant safety concern, BMW has announced a recall affecting nearly 200,000 vehicles from the 2019 to 2022 model years due to a potential fire hazard. This recall notably includes the Toyota Supra models from 2020 to 2022, highlighting the intertwined nature of automotive engineering and the shared challenges manufacturers face.

By Nina Alvarez
Ford Recalls 115,000 Super Duty Pickups Due to Steering Column Concern
Safety & Recalls · Sep 25

In a significant recall affecting over 115,000 vehicles, Ford has identified a potential issue with the steering columns of its 2020 and 2021 Super Duty pickups, prompting concerns over driver safety.

By Nina Alvarez