Tesla is reportedly requiring drivers to consent to the collection of identifiable video footage from a car's internal and external cameras in the event of an accident.
The consent requirement does not apply to all drivers, only to those who opt into Tesla's FSD beta program.
FSD is separate from Tesla's Autopilot driver-assistance system, which is currently being investigated by a US safety watchdog over a number of high-profile accidents involving emergency service vehicles while Autopilot was in use.
Tesla sells the more advanced Full Self-Driving (FSD) package in the United States for $10,000 upfront or $199 per month, but access to the beta is restricted to a select group of drivers.
Indeed, in order to qualify for the FSD beta program, drivers must have a Driver Safety Score of 98 or higher. Previously, the beta was limited to drivers with a perfect score of 100.
FSD is not fully autonomous driving (it is only Level 2), and it is not approved by US officials. It still requires a driver behind the wheel who is paying attention, keeping their hands on the wheel, and ready to take over.
But now, according to a report by Electrek, Tesla is asking FSD beta drivers to consent to the carmaker collecting video from the car's exterior and interior cameras in the event of an accident or "serious safety risk."
The report stated that while Tesla had previously gathered video footage as part of FSD, it was only used to train and improve its AI self-driving systems. The electric car giant always ensured that the footage was anonymous and never associated with a person's vehicle.
But now this has changed, and testers of the FSD beta program have to accept that Tesla can use footage from both inside and outside the car in case of a safety risk or accident.
Tesla revealed the change in an updated terms and conditions warning that appears when testers download a new version of the FSD beta.
“By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision,” Electrek reported the new warning as stating.
Electrek notes that the important part of the warning is the "VIN-associated" wording, which means the footage collected will be tied to the owner's vehicle.
It pointed out that Tesla's language could indicate the firm wants to ensure it has evidence in case its FSD system is blamed for an accident.
That said, the footage could also be used to detect and fix serious issues more quickly.