Regulator Demands Answers Over Tesla FSD Social Media Posts
US regulator says Tesla’s messaging on social media could lead drivers to believe ‘Full Self-Driving’ feature does not require oversight
The US road safety regulator raised concerns about Tesla’s promotion of its so-called Full Self-Driving (FSD) driver-assistance feature as early as May, months before the launch of an official probe into the feature, according to a filing made public this month.
The National Highway Traffic Safety Administration (NHTSA) raised concerns with Tesla that it was promoting FSD in a way that suggested the system could be used as a robotaxi and did not need driver attention, according to the May filing.
The agency made the filing public last week and gave Tesla until 18 December to address its concerns.
The NHTSA in October opened an investigation into 2.4 million Tesla vehicles equipped with FSD, focusing on four reported collisions, including a fatal 2023 accident, that occurred in driving conditions such as sun glare, fog and airborne dust.
Problematic posts
In a 14 May email, NHTSA told Tesla its social media messaging could encourage viewers to consider the feature as a robotaxi “rather than a partial automation/driver-assist system that requires persistent attention and intermittent intervention by the driver”.
This conflicts with Tesla’s “stated messaging that the driver is to maintain continued control over the dynamic driving task”, the email said.
The email cited Tesla’s repostings on X, formerly Twitter, including a post from a driver who said FSD carried him 13 miles to a hospital during a heart attack, and another from a driver who used the feature for a 50-minute drive home after a sporting event.
Tesla met with the agency in May and told regulators that its vehicles’ owner’s manuals and other messaging tell drivers the vehicle is not autonomous and that they must remain vigilant.
Among its questions, the agency asked about the driver-assistance feature’s “potential failure to perform, including detecting and responding appropriately in specific situations where there is reduced roadway visibility that may limit FSD’s ability to safely operate”.
Safety queries
NHTSA said its “investigation will consider the adequacy of feedback or information the system provides to drivers to enable them to make a decision in real time when the capability of the system has been exceeded”.
In December 2023 Tesla agreed to recall nearly its entire base of US vehicles to add new safeguards to its Autopilot driver-assistance feature, making it more difficult for users to bypass controls intended to ensure they are paying attention.
The agency is still reviewing whether Tesla’s updates were sufficient.
While Autopilot comes with every Tesla vehicle, FSD is a paid service that adds features such as self-parking, automatic lane changes and traffic navigation.