Regulator Demands Answers Over Tesla FSD Social Media Posts

US auto safety regulators raised concerns about Tesla’s promotion of its so-called Full Self-Driving (FSD) driver-assistance feature as early as May, months before the launch of an official probe into the feature, according to a filing made public this month.

The National Highway Traffic Safety Administration (NHTSA) raised concerns with Tesla that it was promoting FSD in a way that suggested the system could be used as a robotaxi and did not need driver attention, according to the May filing.

The agency made the filing public last week and gave Tesla until 18 December to address its concerns.

NHTSA in October opened an investigation into 2.4 million Tesla vehicles equipped with FSD, focusing on four reported collisions, including a fatal 2023 accident, that occurred in driving conditions such as sun glare, fog and airborne dust.

Image credit: Vlad Tchompalov/Unsplash

Problematic posts

In a 14 May email, NHTSA told Tesla its social media messaging could encourage viewers to see the feature as a robotaxi “rather than a partial automation/driver-assist system that requires persistent attention and intermittent intervention by the driver”.

This conflicts with Tesla’s “stated messaging that the driver is to maintain continued control over the dynamic driving task”, the email said.

The email cited Tesla’s reposts on X, formerly Twitter, including a post from a driver who used FSD to travel 13 miles to a hospital during a heart attack and another from a driver who used the feature for a 50-minute drive home after a sporting event.

Tesla met with the agency in May and told regulators that the cars’ owner’s manuals and other messaging tell drivers the vehicle is not autonomous and that they must remain vigilant.

The agency asked for answers to questions including the driver-assistance feature’s “potential failure to perform, including detecting and responding appropriately in specific situations where there is reduced roadway visibility that may limit FSD’s ability to safely operate”.

Safety queries

NHTSA said its “investigation will consider the adequacy of feedback or information the system provides to drivers to enable them to make a decision in real time when the capability of the system has been exceeded”.

In December 2023 Tesla agreed to recall nearly all of its vehicles in the US to add safeguards to the Autopilot driver-assistance system, making it more difficult for users to bypass controls intended to ensure they are paying attention.

The agency is still reviewing whether Tesla’s updates were sufficient.

While Autopilot comes with every Tesla vehicle, FSD is a paid service that adds features such as self-parking, automatic lane changes and traffic navigation.

Matthew Broersma

Matt Broersma is a long-standing tech freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
