Tesla FSD Beta Users Show How the System Works — and Doesn't

Tesla CEO Elon Musk has been promising customers a driverless vehicle since at least 2016.

While the company hasn't delivered on that promise, Tesla lets thousands of employees and customers try new and unfinished driver assistance features on public roads in the U.S. through a program called Full Self Driving Beta, or FSD Beta.  

Only Tesla owners who have the company's premium FSD driver assistance system installed in their cars can join the FSD Beta program. (That option costs $12,000 up front or $199 per month in the U.S. today.) Owners must then obtain a high driver-safety score, as determined by Tesla software that monitors their driving habits, and maintain it to keep FSD Beta access. No safety certification or professional training is required.

FSD Beta can best be summarized as a host of new features that are not yet debugged. Chief among them is "autosteer on city streets," which lets the car navigate around complex urban environments automatically.

In January and February, CNBC rode along with three Tesla owners who are participants in FSD Beta to get an uncensored look at how the system works -- and doesn't -- today.

All three drivers understood that the technology does not make their Tesla electric cars fully self-driving, despite the brand name. With FSD Beta engaged, the vehicles maneuvered around some suburban and rural roads successfully, but suffered some dramatic glitches as well, especially in crowded urban environments.

Tesla bull, FSD skeptic

One owner, Taylor Ogan, took CNBC for a drive using FSD Beta version 10.8.1 on his 2020 Model Y vehicle in Brooklyn in January.

Ogan is the founder and CEO of Snow Bull Capital, a hedge fund that invests in green and high-tech sectors, and a self-proclaimed Tesla fan. But since joining the FSD Beta program, he has become increasingly critical of Tesla's approach to autonomous vehicle development.

During the Brooklyn drive, his vehicle ran through a red light without stopping or warning him to take over steering. The main display screen in his Tesla went blank in the middle of the drive. And Ogan apologized sheepishly to those who shared the road with him as his Tesla lurched to a near-stop when a pedestrian standing at a curb triggered a sudden slow down, even though his vehicle and others had a green light and right of way.

Ogan is generally bullish on Tesla, but the test drive in Brooklyn left him saying, "I don't think it's right that customers are able to just test this."

He's also skeptical that Tesla will be able to turn its cars into autonomous vehicles with an over-the-air software update, given the current performance of its driver assistance systems.

Recently, Elon Musk said on Tesla's 2021 year-end earnings call, "My personal guess is that we'll achieve Full Self-Driving this year at a safety level significantly greater than a person." He added, "The cars in the fleet essentially becoming self-driving via software update, I think, might end up being the biggest increase in asset value of any asset class in history."

This month, another Tesla owner posted a video to his YouTube channel, AI Addict, showing an FSD Beta drive in which the car plowed into bollards along the road in San Jose, California.

Cost of making it better

Another Tesla Model Y owner and FSD Beta participant, Kevin Smith, in Murfreesboro, Tennessee, sees glitches and disengagements during drives as inevitable, and part of the process of making FSD Beta into a truly autonomous system someday.

Smith has driven more than 5,000 miles with FSD Beta, he told CNBC.

"Any time the car could just make a mistake," he said. "And I have to be ready for that. My stress levels go up, not down from using Full Self-Driving... But that's that's the cost of of making it better," Smith said.

On the ride-along, CNBC witnessed his vehicle automatically stopping and navigating through an intersection, without Smith having to steer.

He's generally impressed with the technology so far, but notes it hasn't worked in snow or inclement weather, and that every new version of FSD Beta, released via over-the-air software updates to his car, can solve one problem while introducing a new one.

By using FSD Beta on public streets, Smith says, "I don't feel I'm increasing the assumed risk that people are putting themselves in by also being on those public streets. We share those streets with people who are, you know, using a car for the first time with their learner's permit."

Another Tesla owner, Dan Eldridge, took CNBC for a ride in his Model 3 in San Francisco on Feb. 1. He said that he's been able to use the feature safely by remaining attentive.

"I haven't really been in a situation where I felt like I couldn't gain control, like I didn't have enough time to gain control," he explained at the outset.

The car navigated some thoroughfares well, but Eldridge had to stop it from rolling through a stop sign, which it nearly did without warning him to take over steering. The car didn't navigate roundabouts properly. It also nearly cut off another driver by trying to automatically change lanes, requiring Eldridge to disengage the system.

While he was vigilant and avoided any accidents, Eldridge said, "When I'm using the FSD, I'd say I'm less worried about hitting a pedestrian than I am about being the victim of a road rage incident, because I'm just not driving in a courteous way."

Experimenting in the real world

While there are no federal laws barring Tesla from doing this, transportation experts are not convinced Tesla's experiment on public roads is a safe or sensible one.

So far, two Tesla owners have submitted complaints to the federal vehicle safety authority NHTSA saying they thought FSD Beta contributed to or caused crashes they experienced. The FSD Beta program and technology are under investigation by NHTSA and the California Department of Motor Vehicles.

Kelly Funkhouser has been testing Tesla's systems, including FSD Beta, on a closed route for Consumer Reports. She found that much of the time, Tesla's driver monitoring systems, including cabin cameras that are supposed to track drivers' attentiveness, don't work.

"For Full Self-Driving beta, that's when Tesla claims that they have these additional messages that say things like the camera's blocked or to please pay attention. We've never experienced any of those warnings," she said.

She added, "I understand why Tesla might want to get novice testers out there experiencing it. But I definitely think that that's a huge risk that that consumers take when they are doing this on public roads."

Destiny Thomas, the founder and CEO of Thrivance Group, an urban-planning organization with a focus on marginalized communities, said she's concerned that Tesla is thinking about what drivers want, but not about the safety of people who share the roads.

"How is this car going to recognize someone using an assistive device that maybe isn't a wheelchair and doesn't look like one? How is this technology going to be able to recognize someone who has purple undertones in their skin that don't react to the sensors that are in the car?" She asks.

She would like to see Tesla do more community engagement before putting experimental vehicles, with novice testers behind the wheel, on city streets.

CNBC reached out to Tesla, but the company did not reply to a request for comment.
