Tesla boss Elon Musk announced this week that a beta version of the company’s Full Self-Driving (FSD) feature will be released on Tuesday next week. At Tesla’s Battery Day in September, Musk said the EV maker had completely redesigned its Autopilot software and that a private beta would be available “soon”, bringing dramatic improvements.
“I think we’ll hopefully release a private beta of Autopilot, the full self-driving version of Autopilot, in, I think, a month or so, and then people will really understand just the magnitude of the change. It’s profound.”
While FSD stands for Full Self-Driving, it is not yet fully autonomous, the key word being “yet”. The company is working to refine the feature until it one day is. Tesla’s electric vehicles come factory-fitted with Autopilot, the driver-assistance technology designed to steer, accelerate, and brake automatically, and which also lets owners summon the driver-less car to them. The Full Self-Driving package, a $US8,000 option, adds advanced features that allow the cars to change lanes, recognise stop signs and traffic lights, and park automatically.
Although Tesla has steadily added capabilities to the Full Self-Driving package, it has lagged in delivering the beta version. Musk defended the delay on Twitter, explaining that the overhaul has been complicated. It’s hardly cause for panic considering Tesla remains one of the few car manufacturers offering anything close to autonomous driving.
In 2016, the CEO claimed that a Tesla EV would be able to drive itself across the United States by the end of 2017. The prediction was perhaps a bit outlandish, but he remains determined to prove it possible one day. Musk confirmed the beta release of the new FSD on Twitter, saying, “Limited FSD beta releasing on Tuesday next week, as promised. This will, at first, be limited to a small number of people who are expert & careful drivers.”
Musk’s emphasis on “expert & careful drivers” is noteworthy considering past reports of recklessness from Tesla owners. Over the last few years, people have crashed their cars and blamed Autopilot, even though Tesla has always stated that the human driver remains fully responsible and must keep their eyes on the road and hands on the wheel while using the system.
Police caught a Model S driver napping with the front seats reclined while the car drove itself at 90 miles per hour. Another driver in San Jose, well over the legal alcohol limit, used Autopilot to get home and ended up crashing into a fire truck. In yet another incident, a Tesla driver enjoyed a coffee and a leisurely lunch while his car cruised along the highway; it too hit a truck because he was not paying attention.
Then, earlier this year, when the Smart Summon feature was released, Tesla drivers pushed it to its limits, sending their driver-less cars across busy parking lots, which resulted in a few near misses and fender benders. Again, Tesla had to warn owners to be careful when using Smart Summon because it is not a fully autonomous feature.
“You are still responsible for your car and must monitor it and its surroundings at all times and be within your line of sight because it may not detect all obstacles. Be especially careful around quick moving people, bicycles and cars.”
This explains why Tesla is careful to select responsible drivers to beta test features such as FSD.