Tesla Full Self-Driving Beta saved me once and tried to kill me twice
I have been driving with Tesla’s Full Self-Driving (FSD) Beta for about two years now, and during that time, it saved me once and tried to kill me twice.
How should I feel about that?
The Promise
In 2016, I enthusiastically listened to Elon Musk announce that, from then on, every new Tesla vehicle would be equipped with all the hardware necessary to become self-driving through future software updates.
I had an older Model S at the time and couldn’t afford a new one, but I loved the idea that you could buy a car that would later become self-driving.
In 2018, I bought a Model 3 with the promise that it would become self-driving. I purchased the Full Self-Driving Capability package for $5,000 CAD (now $16,000 CAD).
Shortly after Musk announced that all new Teslas had the necessary hardware to become self-driving with their onboard computer, cameras, radar, and ultrasonic sensors, he signaled that a computer upgrade might be needed after all.
That was fine by me. In 2019, Tesla sent a mobile technician to my home, who quickly swapped my HW2 computer for the new HW3 self-driving computer.
From 2019 onward, Musk said virtually every year that Tesla would deliver its self-driving capability by year’s end, but we are now in 2024, and it hasn’t.
The Delivery
I have enjoyed Autopilot features in my Model 3 for years. The system removes some of the mundane tasks of highway driving and lets you focus on keeping your eyes on the road, ready to take control at all times.
However, it is not a self-driving taxi like I was promised.
Instead, Tesla delivered Full Self-Driving (FSD) Beta. The feature enables the vehicle to control itself through intersections, city streets, and highways. The vehicle virtually drives itself. However, Tesla doesn’t take responsibility for it. The driver is always responsible and has to be ready to take control at all times.
In itself, the system is impressive, but it is not the robotaxi Tesla promised. It is able to render its environment to an impressive level of accuracy, and it can navigate difficult intersections, but it also often fails in dangerous ways.
I received FSD Beta in early 2022. Shortly after, I tried it in the Blue Ridge mountains, and I had a terrible experience.
As I was going through a sharp right turn, FSD Beta stopped turning halfway through and brought the steering wheel back straight. If I hadn’t instantly grabbed the wheel and applied the brakes, it would have driven us right off the cliffside (around 12:30 in this video):
It was a very scary situation. Fortunately, I was hyper-vigilant because it was one of the first times I used the system. I could see how it could become truly dangerous if someone grew complacent with it, as I had only a fraction of a second to react.
It wasn’t the only time FSD Beta almost killed me.
Last year, I was testing a more recent FSD Beta update (v11.4.7), which merged Autopilot (highway driving) with Tesla’s FSD Beta.
I was driving on the highway on FSD Beta with the speed set at 118 km/h (73 mph) on Highway 20 toward Montreal, and the system automatically moved into the left lane to pass a car.
As I was passing the car, I felt FSD Beta veering aggressively to the left toward the median strip.
I was able to steer back toward the road, which disengaged FSD Beta. It was frightening: I nearly lost control while correcting FSD Beta, and since I was passing a vehicle, I could have crashed into it if I had overcorrected.
A few moments later, I gave FSD Beta another shot, thinking I might have an idea of what went wrong, and I was actually able to reproduce the problem.
As I moved into the left lane again, I was far more alert, and when FSD Beta again veered left toward the median strip, this time I saw one of those median crossovers reserved for emergency-vehicle U-turns:
FSD Beta tried to enter it at full speed. I was again able to correct it in time, and I sent Tesla a bug report, though the recording cut me off before I could explain what happened. It should be clear enough if they can pull the video.
This is a very dangerous behavior as there would have been no room for me to slow down if I had entered the median at highway speed, or I could have crashed into another vehicle if I had overcorrected to the right. I also only had a fraction of a second to react.
This dangerous behavior – trying to take exits and medians when it shouldn’t – used to plague Autopilot early on, but it was new to FSD Beta in my experience.
Now, on another occasion, FSD Beta actually saved me. I was in traffic in the middle lane on the highway when I got distracted by what appeared to be a near-crash on my right, with a car blasting its horn. As I was turning my head back, a car from the right lane cut me off, and I believe FSD Beta reacted to it before I could have, since I was still looking to the right.
Tesla FSD Beta is now on its 12th version, and the automaker has yet to offer a clear path toward taking responsibility for the system and delivering on its promise of self-driving.
City Dwellers’ Take
Now, you could argue that this is a net positive. I was able to correct FSD Beta the two times it almost killed me, and if it hadn’t reacted in traffic in that last example, I most likely would have crashed.
I would agree with that. My general take is that it is safer to drive with FSD Beta than without, as long as you pay as much attention as, or more than, you would driving on your own.
I think the main problem comes with people being overconfident with the system. Of course, you open yourself to that when you decide to call it “Full Self-Driving”. I know that Tesla tells people to keep paying attention at all times, and that’s good, but it might not be enough amid all the promotion around the capability.
You have the company’s CEO continuously hyping the next FSD Beta update as “mind-blowing” while sharing videos of “no intervention drives” from his fans. For example, Musk often shares videos from Omar Qazi, who goes by Whole Mars Blog on X. He presents these videos as examples of the incredible performance of the FSD Beta system, but they are not really representative of the average experience.
First off, they are virtually all in California, and Tesla admits that the system works better in California, where most of the training happened.
Also, Qazi has evidently been using a third-party product to defeat the standard alerts prompting him to put his hands on the steering wheel, which makes his videos unrepresentative of how people use the system, or of how they should use it.
These things can contribute to people becoming overconfident in FSD Beta. You don’t have to spend much time on social media to find people abusing Tesla’s Autopilot and FSD Beta.
Tesla should spend more time denouncing those things and making it clear that the feature they call Full Self-Driving is not representative of its name for now. But that’s hard to do because every time Tesla does that, it highlights its own failure to deliver on its promise.
For years, Musk has claimed that Tesla’s Autopilot and FSD Beta are safer than normal driving, but he hasn’t backed that claim up with credible data.
Tesla used to release its Autopilot Safety Report, but the data was far too limited to prove Musk’s point, and the automaker abruptly stopped producing these reports and releasing the data over a year ago.
I think Tesla, and especially Musk, should be much more conservative with their approach until they can actually support their claims with clear data.