I Took a Ride in a ‘Self-Driving’ Tesla and Never Once Felt Safe
When the “Full Self-Driving” setting is enabled in a Tesla, according to the automaker’s own description, the car “attempts to drive to your destination by following curves in the road, stopping at and negotiating intersections, making left and right turns, navigating roundabouts, and entering/exiting highways.”
“Attempts” would be the crucial word here, as I learned during an occasionally harrowing demonstration of FSD on surface streets and freeways in Los Angeles. While it’s true that the technology manages to impress at first, it doesn’t take long for severe and dangerous shortcomings to emerge. And, contrary to claims from exaggeration-prone Tesla CEO Elon Musk, it certainly didn’t seem safer than having an average human driver at the wheel.
One morning in early August, I hop into a 2018 Tesla Model 3 owned by Dan O’Dowd, founder of the Dawn Project. Easily the most outspoken critic of Tesla’s so-called autonomous driver-assistance features, O’Dowd — a billionaire who also co-founded Green Hills Software and made his fortune developing secure, hacker-proof systems for the U.S. military and government — established the Dawn Project to campaign “to ban unsafe software from safety critical systems” spanning healthcare, communications, power grids, and transportation. For several years, Tesla has been his primary target; O’Dowd has orchestrated one safety test after another, mounted a single-issue campaign for Senate, and run expensive Super Bowl commercials to spread his warnings against the company’s FSD software.
My driver for the day’s ride-along — that is, the person who will babysit the self-driving Tesla to make sure it doesn’t kill us or anyone else — is Arthur Maltin of Maltin PR, a London-based public relations firm that represents the Dawn Project and helps amplify its consumer safety message. As soon as I see Maltin’s bandaged right hand, I ask nervously if it’s from an earlier collision, but he laughs and assures me it was an injury sustained in a fall off his bike. We set out east on Sunset Boulevard with FSD engaged, Maltin with his hands poised right over the wheel to take manual control if necessary.
On any given day, if you open the algorithmically curated “For You” feed on X (formerly Twitter), you’ll probably come across a video of a Tesla influencer demonstrating the latest incremental update to the FSD software. More often than not, they keep their hands out of frame in an effort to demonstrate that they made zero “interventions” during a drive, having no need to override FSD maneuvers — even though Tesla’s materials explicitly command: “Keep your hands on the steering wheel at all times.” Earlier this year, as FSD moved out of beta testing, Tesla added the word “Supervised” to its language about the feature, which it has offered as an upgrade since 2016 without that qualifier. The change was an apparent acknowledgement that “Full Self-Driving” could be, in itself, a misleading term. (Tesla did not respond to a request for comment on the safety failures I observed in reporting this article.)
In the car, with FSD on, your eye is immediately drawn to the large center console screen, which models the environment around you: streets, other vehicles, pedestrians. “Very impressive when you first look at it,” Maltin says, “but then if you actually start to pay attention to it, and look at what’s actually going on in the world, like here, these two women” — he points at two women on the sidewalk — “there’s one person represented. If you watch, the cars will sort of appear and disappear. And parked cars move around.” The system also tends not to pick up on the presence of children or smaller dogs, he adds.
These blind spots are due to the way Tesla has decided to pursue autonomous driving: a cameras-only setup called Tesla Vision, which relies on a neural network. This is in marked contrast to fleets of self-driving taxis operated by companies including Waymo, a subsidiary of Google parent Alphabet, which are equipped with sophisticated LiDAR (light detection and ranging) and radar sensors in addition to cameras. Such cabs, which require no one at the wheel, are also programmed to operate within thoroughly mapped, strict boundaries in the cities where they are now available (Los Angeles, the Bay Area, Phoenix, and Las Vegas). Tesla FSD, however, can be activated anywhere.
A Tesla’s eight external cameras, which have a relatively low resolution of 1.2 megapixels (newer iPhones have cameras that offer a video resolution of 12 megapixels), may be thwarted by any number of driving or weather conditions: “Low visibility, such as low light or poor weather conditions (rain, snow, direct sun, fog, etc.) can significantly degrade performance,” Tesla warns, while FSD may also be compromised by construction zones or debris in the road. “What’s really bad for it is sunset or sunrise,” Maltin says. “When it’s got the sun in its eyes, it will sometimes just put a big red warning on the screen. ‘Take over, take over, help me!’” A self-driving car equipped with radar and LiDAR doesn’t have those problems: the former can see through fog, and the latter doesn’t need light to work. Waymo and similar companies have also put work into figuring out how to resolve conflicting messages from cameras and other sensors, whereas the Tesla cameras lack any redundancy, or backup, if they fail.
We experience the pitfalls of Tesla Vision several times during an hour-long drive. Once, the car tries to steer us into plastic bollard posts it apparently can’t see. At another moment, driving through a residential neighborhood, it nearly rams a recycling bin set out for collection — I note that no approximate shape even appears on the screen. We also narrowly avoid a collision when trying to turn left at a stop sign: the Tesla hasn’t noticed a car with the right of way zooming toward us from the left at around 40 mph. Maltin explains that this kind of error is a function of where the side cameras are placed, in the vehicle’s B pillars, the part of the frame that separates the front and rear windows. They’re so far back that when a Tesla is stopped at an intersection, it can’t really see oncoming traffic to the left or right, so it tries to creep forward, sometimes gunning it when it (rightly or wrongly) senses an opening. A human driver, of course, can peer around corners. And if a Tesla with FSD engaged does suddenly notice a car approaching from either side, it can’t reverse to get out of harm’s way.
After a while, you start to anticipate these mistakes. Maltin, who has plenty of experience testing FSD, frequently tenses up if a complicated or unusual driving task suddenly presents itself. But I notice a certain hesitancy and what I can only characterize as indecisiveness in the Tesla itself, like you’re being chauffeured by a student driver. It reliably slams on the brakes too hard instead of rolling through yellow lights, for example, and swerves unpredictably between lanes as if it can’t decide which it prefers. On the other hand, it operates well enough for five-minute stretches that you can be lulled into complacency.
That’s another huge issue: drivers trusting FSD more than they should, and indeed more than Tesla really wants them to. In April, the National Highway Traffic Safety Administration released an analysis of nearly a thousand crashes involving FSD (or the less advanced Tesla driver-assistance feature Autopilot), which caused 29 deaths in total, and concluded that drivers engaging these systems “were not sufficiently engaged in the driving task,” while Tesla’s tech was not adequate to keep them focused. Indeed, as Maltin shows me, you can effectively disable an FSD reminder to apply hand pressure to the wheel with a cheap hack. The Dawn Project has also demonstrated how Tesla’s internal camera system, designed to monitor driver attention, falls short in cases where the driver looks away from the road, falls asleep, or is otherwise distracted. The feature will even remain engaged when a stuffed teddy bear is at the wheel.
The driver is ultimately liable for accidents under these conditions; the company’s fine print allows it to place blame on an FSD “supervisor” who lets the car make a mistake. That may protect Tesla’s bottom line, but, as the NHTSA report shows, it hasn’t done much to prevent what the agency called “foreseeable misuse” from drivers assuming the tech can handle anything. At the same time, there are indications that customers who have paid up to $15,000 for FSD capability are finding it unsatisfactory. O’Dowd and the Dawn Project have gone over Tesla data on the mileage that more than 400,000 vehicles with FSD have traveled with the system engaged and determined that these drivers are, on average, only using it 15 percent of the time. If it were as safe and capable as advertised, you might expect drivers to rely on it more often.
That FSD doesn’t necessarily improve car travel becomes more and more apparent to me with each alarming failure I witness from the passenger seat. The Tesla attempts to run a stop sign at an on-ramp for the 110, a notoriously hazardous freeway that requires you to come to a complete halt, twist your head far around to check for oncoming traffic, and accelerate rapidly to merge with motorists traveling around 70 mph. Shortly after that near-disaster, it misses its assigned exit on the left. Elsewhere in the city, the car tries to make a left turn from a dedicated right-turn lane. It also doesn’t pull over for an ambulance with its sirens blaring behind us — Maltin steers us aside and, as the ambulance passes, points out that the screen shows not an emergency vehicle but a regular truck. Then, climbing one of the steepest hills in Silver Lake, the Tesla accelerates too fast while steering into the middle of the road, unaware of the sharp decline on the other side or whatever vehicles may be coming up. Without a braking intervention, it feels like we might have launched over the crest (as seen in a notorious viral video).
Such baffling responses to familiar L.A. conditions underscore how little practical “knowledge” of basic driving is programmed into FSD. (Maltin tells me the Dawn Project took it to the DMV in Santa Barbara and ran it through the practical driving test administered by a professional instructor; it failed four times in 15 minutes and thus did not live up to the standards that a teenager must meet for a license.) Tesla Vision recognizes stop signs, stoplights, and (sometimes) posted speed limits, but not one-way signs or alerts such as “road closed” and “do not enter,” as I see first-hand when we pull onto a side street where the latter warnings are clearly posted.
This is the most frightening demonstration of the day, one staged by the Dawn Project on a street blocked off to regular traffic. Stopped along the right-hand curb, past the various signs that should have prevented us from taking this turn in the first place, is a rented yellow school bus. Its stop sign is extended outward on the left side, red lights flashing. The Tesla ignores this too, though the bus appears on screen as a flashing red truck — ostensibly something the car knows to avoid. Our car blows past the bus, and, when a small mannequin is pulled forward on a track to simulate a child crossing the street in front of the bus, we mow it down, barely slowing afterward.
In its ongoing tests of this scenario, the Dawn Project has produced this same result time and again, with each new version of FSD, and has been sounding the alarm about it since before it became a horrific reality: one of the accidents examined in the NHTSA report released this year involved a Tesla Model Y driver who had Autopilot enabled when they struck a 17-year-old student getting off a school bus in Halifax County, North Carolina, in March 2023. The teen, who suffered life-threatening injuries, was transported to a hospital by helicopter, and survived. NHTSA determined that the vehicle had been traveling at “highway speeds.”
Somehow, Musk and his acolytes continue to sell the idea that cameras-only Teslas are not just safe but the future of autonomous transportation. Musk has been promising since 2019 that the company’s software will one day convert FSD-equipped vehicles into “robotaxis,” freeing drivers once and for all from having to watch the road, and has pledged to unveil a new model intended expressly for this purpose (the hyped event, originally scheduled for Aug. 8, has been postponed until Oct. 10).
Yet when Maltin and James Bellhouse, another member of Maltin’s PR firm, take me for a ride in a driverless Waymo taxi to compare the technology, it’s like night and day — or, to put it more accurately, the past and the future. The car, a modified electric Jaguar I-PACE SUV, gets a lot more attention than the Tesla Model 3, not only due to its suite of bulky sensors, but because there’s no one in the driver’s seat. After seeing all the interventions necessary in the Tesla, I’m a little anxious. However, it turns out that using radio waves and lasers to determine the shape and distance of things around you is quite an improvement on Tesla Vision. The Waymo operates with what, in a human driver, you’d call “calm confidence,” and deftly navigates some difficult or unexpected situations in Downtown L.A., such as when a pedestrian jumps into the street to take a photo of it. In a few short moments, I’m remarkably at ease as this autonomous taxi weaves through the city. It doesn’t hurt that the sound system plays soothing ambient music.
If this is what Musk and Tesla are up against, they are hardly on the cutting edge — they’re years behind, and it’s hard to imagine how they catch up, particularly with the regulatory investigations they’ve faced of late. That said, I’m not entirely sold on Google’s driverless future, either. Waymos have caused ridiculous traffic jams in the Bay Area and made critical errors when confused by signage. This month, San Francisco residents who live near a Waymo taxi lot voiced their frustration over the empty vehicles honking at each other at 4 a.m. as the cars struggled to organize themselves in the available parking spaces. Like many unintended consequences of Silicon Valley “disruption,” the problem reads as absurd satire.
No, humans will never be ideal drivers. Nor perhaps can we ever design a flawless robotaxi. In the absence of perfection, it seems we must take every precaution we possibly can, since every life matters more than the next quarterly earnings call. For Tesla to put that principle into practice would be a welcome change — one already long overdue.