Tesla has started rolling out an over-the-air software update for version 9 of its long-awaited Full Self-Driving beta. As has been noted many times before, the system is not fully autonomous (yet); it is an advanced driver-assistance system.
Elon Musk announced that the software update (2021.4.18.12) would begin uploading after midnight on Friday, allowing thousands of Tesla customers who paid for the FSD option to use several of Autopilot's advanced driver-assist features on local, non-highway streets.
For quite some time, Musk has promised v9 of the software. In 2018, he stated that the "long-awaited" version of FSD would be released in August. In 2019, he predicted that "a year from now" there would be "over a million automobiles with complete self-driving, software, and everything." "FSD 9 beta is shipping soon," he said earlier this month. To say that Tesla enthusiasts have been looking forward to this upgrade for a long time is an understatement.
Beta 9 addresses most known issues, but unforeseen problems will crop up, so drivers are expected to remain cautious. "At Tesla, safety is always a top priority," Musk said. The update's release notes caution testers against complacency, warning that the software "may do the wrong thing at the worst time." They also describe upgraded, larger visualisations on the in-car display, as well as improvements to the cabin camera's driver monitoring to check for attentiveness.
Tesla is without a doubt more willing than its competitors to test beta versions of its Autopilot driver-assistance technology on its customers in order to gather data and iron out flaws. Most Tesla customers seem happy with that arrangement, regularly flooding Musk's replies with requests to join the company's Early Access Program for beta testers. This has contributed to Tesla's public perception as a leader in autonomous driving, even though its vehicles consistently fall short of what most experts consider a self-driving car.
Tesla warns drivers to keep their eyes on the road and hands on the wheel at all times, even though the automaker is notorious for declining to add a more robust driver-monitoring system, such as infrared eye tracking, to ensure that its customers actually follow those safety protocols (though that may be changing). According to the Society of Automotive Engineers (and Tesla's lawyers), Autopilot is a Level 2 "partially automated" system, which requires drivers to keep their hands on the wheel and their eyes on the road.
Consumer advocates, however, have demonstrated that Tesla's system can be easily fooled into believing someone is in the driver's seat, an issue that resurfaced in the aftermath of a fatal Tesla crash in Texas in which investigators said no one was behind the wheel.
That hasn't stopped some Tesla owners from abusing Autopilot, with some even filming and publicising the results. Drivers have been found asleep in the passenger seat or back seat while their Teslas travelled down congested highways. Last year, a Canadian man was charged with dangerous driving after being pulled over while asleep as his Tesla travelled at 93 mph.
Since Tesla released Autopilot in 2015, at least 11 people have died in nine collisions involving the driver-assistance technology in the United States. There have been at least nine more deaths in seven separate crashes around the world.
Meanwhile, the US government now requires automakers to promptly report crashes involving self-driving cars or advanced driver-assistance systems, a significant shift that signals regulators are taking a harder line on these partially automated systems.