The National Highway Traffic Safety Administration (NHTSA) said Friday it will evaluate whether Tesla’s driver-assistance system, known as FSD (Full Self-Driving), is able to detect fog and other low-visibility conditions and respond accordingly.
The agency said four accidents have been reported in which FSD was in use under such conditions. In one of those crashes, a Tesla vehicle fatally struck a pedestrian, according to the NHTSA; in other cases, pedestrian injuries were reported. Tesla representatives did not respond to Bloomberg’s emailed request for comment.
The investigations are a stain on Tesla’s image
The investigation marks a major setback in CEO Elon Musk’s efforts to position Tesla as a leader in autonomous driving. Musk has said Tesla’s ability to develop autonomous vehicle technology will ultimately determine whether the company is worth a lot of money or “essentially zero.”
Just last week, the company held an event at a Los Angeles film studio showcasing autonomous vehicle concepts. During the presentation of the Cybercab autonomous taxi, Tesla did not provide details on how it will realize its CEO’s fully autonomous driving ambitions.
For years, Tesla has been charging customers thousands of dollars for its FSD system, whose name, “Full Self-Driving,” suggests it is fully autonomous. In reality, using FSD requires constant driver supervision. The feature currently costs $8,000.
Investigation into FSD defects
The defect investigation follows an earlier action NHTSA took in April. The previous investigation looked into whether Tesla did enough to stop drivers from misusing a suite of driver-assist features marketed as Autopilot.
The agency is investigating whether a software update rolled out by Tesla late last year ensures drivers remain engaged while using the system.
In NHTSA’s assessment, there is a “critical safety gap” between what drivers think Autopilot can do and what it actually can do. According to the agency, this gap has led to misuse of the system and to accidents that could have been avoided had the system been used properly.
In April, NHTSA said it had identified 211 incidents in which Teslas crashed while on Autopilot even though drivers had enough time to avoid collisions or mitigate their effects. In 111 cases, drivers drove off the road after inadvertently disengaging the system.