Tesla’s decision to test its “Full Self-Driving” advanced driver assistance software with untrained vehicle owners on public roads had already attracted scrutiny and criticism before this latest release.
Version 10.3 began rolling out on Saturday night into Sunday morning with a long list of release notes. The changes start with driver profiles that can swap between different settings for following distance, rolling stops, and exiting passing lanes. The update is also supposed to better detect brake lights, turn signals, and hazard lights from other vehicles, along with reducing false slowdowns and improving offsetting for pedestrians.
However, on Sunday afternoon Elon Musk tweeted that Tesla is “Seeing some issues with 10.3, so rolling back to 10.2 temporarily.”
Seeing some issues with 10.3, so rolling back to 10.2 temporarily.
Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.
— Elon Musk (@elonmusk) October 24, 2021
(As always, to be clear: this software does not make Tesla’s cars fully autonomous. Tesla CEO Elon Musk has himself said he believes the “feature complete” version of the software his company calls “Full Self-Driving” will, at best, only be “likely” to drive someone from their home to work without human intervention and will still require supervision. That does not describe a fully autonomous car.)
Several drivers have already shared videos and impressions of their experience with the release (whether or not that aligns with what Tesla wants participants to post on social media), and testers say the rollback update removes the FSD beta capabilities from their cars entirely.
Several posters said the 10.3 update introduced phantom forward collision warnings (FCW), while others noted issues including a disappearing Autosteer option, traffic-aware cruise control (TACC) problems, and occasional Autopilot panic. It’s unclear how common these problems are, or which ones, if any, caused the rollback, but Musk responded to a tweet about the Autosteer and TACC issues saying the company is working on it.
If it’s a common problem within the test group, phantom FCW events alone would be serious enough to justify a rollback. In 2019, Mazda recalled the Mazda3 to address problems with its Smart Braking System falsely detecting objects in the car’s path. A vehicle that suddenly slams on the brakes for no reason, as several social media posts claim has happened, could easily cause a crash if another car is following closely. Several testers also claimed the false FCW incidents dropped their Tesla-graded “safety score” enough that they might not be able to remain in the beta.
For anyone concerned about becoming an unwilling member of the test group simply by existing near a Tesla running work-in-progress software, this could be a sign that the company addresses problems quickly, or an example of just how dangerous the testing is. On the other hand, for Tesla owners hoping the test expands to include people with lower safety scores, hacker @greentheonly tweets, “To those of you with low scores that wait for the FSD: don’t. Imagine if you drove the way the app requires, that would be horrible, right? But the car drives even worse! It’s seriously totally unusable in any sort of traffic and on narrow roads, the videos don’t do it justice.”
Similar to how I don’t use stopping light control, the current iteration is something I’d want off outside of limited experiments, but unlike TL stopping mode you cannot turn off FSD annoyances!
(hopefully it improves, but knowing Tesla, that’s long ways off)
— green (@greentheonly) October 12, 2021