Tesla's Full Self-Driving (FSD) system is facing a critical juncture, with the National Highway Traffic Safety Administration (NHTSA) escalating its investigation into the system's inability to handle reduced-visibility conditions. This development is not only a significant regulatory threat; it also exposes a fundamental flaw in Tesla's camera-only approach to autonomous driving. The NHTSA's Engineering Analysis, covering an estimated 3,203,754 vehicles, has revealed that FSD's degradation detection system fails to warn drivers when cameras are blinded by common road conditions like sun glare and fog. That is a damning finding: it means Tesla's software doesn't adequately compensate for the inherent vulnerability of camera-only systems to environmental interference.
The core problem is that FSD can't tell when it's blind. In the crashes reviewed by NHTSA, the system didn't detect common roadway conditions that impaired camera visibility until immediately before impact. Worse, the vehicles either lost track of or completely missed other cars directly ahead of them. One failure mode in particular doesn't get enough attention: camera fogging inside the housing. Tesla's cameras can develop condensation between the lens and the outer cover, particularly in cold or humid weather, and the system sometimes doesn't detect it. When that happens, FSD keeps operating with impaired vision, and the driver has no idea unless they are looking directly at the camera feeds.
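To make the concept concrete, here is a toy sketch of the kind of self-check a degradation detector might run on each camera frame. This is purely illustrative and is not Tesla's code; the function names, the pixel data, and the variance threshold are all invented for the example. The underlying idea is simply that glare and fog both collapse image contrast, which a system can measure:

```python
# Illustrative only: a naive contrast check for a "blinded" camera frame.
# All names and thresholds here are hypothetical, not Tesla's implementation.

def frame_contrast(pixels):
    """Return the variance of grayscale pixel intensities (0-255).

    Fog pulls every pixel toward a uniform gray; sun glare saturates
    pixels toward white. Either way, the variance collapses.
    """
    n = len(pixels)
    mean = sum(pixels) / n
    return sum((p - mean) ** 2 for p in pixels) / n

def is_degraded(pixels, min_variance=200.0):
    """Flag a frame whose contrast falls below a (hypothetical) floor."""
    return frame_contrast(pixels) < min_variance

# A washed-out frame (glare): nearly every pixel saturated near white.
glare_frame = [250, 252, 255, 251, 253, 254, 250, 255]
# A normal frame: a mix of dark road, lane markings, and sky.
normal_frame = [30, 45, 200, 210, 60, 190, 35, 220]

print(is_degraded(glare_frame))   # True: low variance, likely blinded
print(is_degraded(normal_frame))  # False: healthy contrast
```

The point of the sketch is that such a check is cheap to run continuously, which is exactly what makes NHTSA's finding so striking: the failure isn't that detecting a blinded camera is hard, it's that the system didn't warn the driver when it happened.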
The timeline of Tesla's response is also revealing. A fatal crash involving FSD and reduced visibility occurred on November 28, 2023. Tesla submitted the required Standing General Order (SGO) report for that crash on June 27, 2024 — seven months later. The very next day, Tesla began developing an update to the degradation detection system, but NHTSA still doesn't know when that update was actually deployed or which vehicles have received it. A seven-month gap between a fatal crash and the required report, followed by a fix whose deployment status remains unknown, does not suggest a company treating the problem with urgency.
The NHTSA's investigation also raises concerns about under-reporting. Tesla told the agency that internal 'data and labeling limitations' prevented it from uniformly identifying and analyzing crashes that occurred while the degradation detection system was engaged. NHTSA believes this limitation 'could have led to under-reporting of subject crashes over portions of the defined time-period'. This echoes the separate investigation into Tesla's crash reporting practices and the ongoing struggle to get Tesla to turn over FSD traffic violation data. The pattern is consistent: NHTSA keeps finding that Tesla either can't or won't provide clear data about FSD-related crashes.
The broader picture is even worse for Tesla. We now have three concurrent NHTSA investigations into FSD, covering visibility failures, traffic violations, and crash reporting gaps. Tesla is resisting a data hand-over in one probe even as this new one escalates. And all of this unfolds while Tesla continues to expand FSD's availability and CEO Elon Musk continues to promise that unsupervised 'Full Self-Driving' is imminent. The gap between Tesla's autonomous driving claims and the regulatory reality has never been wider. An Engineering Analysis covering 3.2 million vehicles, with a fatal crash in the record and evidence of systemic visibility detection failures, is exactly the kind of probe that ends in a recall. Tesla needs to take this seriously, and so do the drivers relying on a system that can't tell when it's blind.
In my opinion, this escalation is the most significant regulatory threat to Tesla's FSD deployment we've seen. It cuts to the heart of a problem we've been flagging for years: a camera-only system is inherently vulnerable to visibility degradation, and Tesla's software doesn't adequately compensate for it. Until Tesla can show when its degradation-detection update shipped, which vehicles received it, and that it actually works, drivers and regulators alike have every reason to treat FSD's performance in poor visibility as an open safety question.