> It's worth noting that Tesla claims FSD is still in "beta," so it's incomplete
When a broken tail light is considered a safety issue, how is a half-baked "full self-driving" system that's in perpetual beta and has never delivered on Tesla's self-imposed deadlines for true self-driving allowed on the road at all?
That’s a strange perspective. AEB (automatic emergency braking) isn’t 100% reliable, but we still allow it on the roads because it’s a net increase in safety. When used correctly, FSD seems to be a net increase in safety as well. You can make the case that they should do a better job with the labeling; maybe stop calling it FSD, to start.
Current labeling is FSD (Supervised), which in my experience is accurate. There are so many prompts and so much enforcement of paying attention that zero people are being fooled into thinking they can take a nap and arrive at their destination. If they tried, it’d yell at them before they counted a half dozen sheep.
My car drives me all over town without intervention (on 12.3.6, and now on 12.5.4). It’s beyond confusing to be constantly told it doesn’t, when I have never taken over for a safety issue. Only for cases where it’s being too hesitant. 2020 Model 3.
I bought a used Tesla and had very low expectations for FSD’s usefulness, and those expectations were shattered. All my complaints are stylistic - it accelerates a bit too quickly, hesitates a bit too long in some cases. The steering wheel is a bit jumpy on turns at low speeds, but not unsafe.
The only explanation I can come up with is that there is either a well-motivated group that wants FSD to be bad, or there is some huge variation between vehicles and I lucked out on my car’s cameras being aligned right or something. Maybe the roads in my area are easier than normal, but I’m in a large non-California metro area, so I don’t think I’m unrepresentative.
> AMCI found that, on average, when operating FSD, human intervention is required at least once every 13 miles to maintain safe operation.
I drove a 200 mile road trip last week - a combination of city, suburb, interstate, and rural. Didn’t take over except to pull in/out of driveways. So this statement is just mind-boggling - it implies that my drive last week was, like, a 3 or 4 sigma event. But I’ve done it more than once. At some point, it stops being anecdotes and starts being “I simply cannot square your number with my experience”.
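For the curious, here’s a rough back-of-the-envelope sketch in Python (my simplifying assumption: interventions arrive as a Poisson process at AMCI’s claimed average rate, applied uniformly to every car and road):

```python
import math

# If interventions really occurred once per 13 miles on average
# (modeled as a Poisson process), the chance of a 200-mile drive
# with zero interventions is P(k=0) = exp(-miles/13).
rate_per_mile = 1 / 13
miles = 200

p_zero = math.exp(-rate_per_mile * miles)
print(f"P(no interventions in {miles} mi) ~ {p_zero:.1e}")  # ~2.1e-07
```

Under that model my trip would be roughly a one-in-five-million event, which is the sense in which I can’t square their number with my experience.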
Last time I said something positive about FSD here, someone suggested I’m simply forgetting the many times it must try to kill me. Debates around this get wild.
I will not reply to anything negative below, because I’ve learned it’s much more fun to have my car drive me around town than argue with people who insist it can’t.
I recently got 12.5.4, but haven't put it through its paces yet. My personal experience has been that I don't see very many (any?) critical interventions now. It seems to be handling the more complicated situations like unprotected left turns, roundabouts, etc. smoothly.
However, it still doesn't serve a valuable purpose for me yet. I can't take my eyes off the road or my hands off the wheel. Instead, it's the opposite - I must still be hypervigilant. In fact, it's even worse than that. Because it doesn't drive how I would, I find myself constantly intervening. E.g., I take over when I know there's a good spot to merge for an upcoming turn, but it's delaying the merge. You can force FSD to merge by signaling, but in heavy traffic the navigable gaps between vehicles can disappear faster than FSD will perform the merge. Then the maneuver shifts from reasonable to rude.
If I were just a passenger in the vehicle, I would probably care much less. But right now I'm the operator.
I don't know that anyone will ever train vehicles to behave how you would prefer to drive. I think the gold standard will rather be to behave as you find acceptable as a passenger.
For now, I'm appreciating the much-improved Summon. Having your vehicle pick you up in the rain is super nice.
Breaking my no-replying rule because you seem reasonable and have real experience with the topic. Just wanted to reply to this:
> However, it still doesn't serve a valuable purpose for me yet. I can't take my eyes off the road or my hands off the wheel.
Agree - it doesn’t allow me to open a laptop and get something else useful done. In that sense, it’s not returning any time to me.
But to me, it feels much the same as cruise control: it removes a set of monotonous tasks that take up some part of my brain. When I don’t need to keep the car between lane lines and react immediately to the car ahead slowing down, I get free cycles to look other places. I find I spend a lot more time looking in the mirrors, looking much further ahead down the road, etc. I’m quite sure that me + FSD is safer than just me driving, simply because I’m looking much further ahead and behind. It turns driving into less of a tactical activity and more of a strategic one.
It feels almost like being a passenger for a new driver. You can be pretty sure they aren’t going to just dive out of the lane and kill you, so you can look far ahead for anything weird that might trip them up.
> “I’m quite sure that me + FSD is safer than just me driving”
A drunk driver is quite sure that alcohol isn’t impairing their driving, too. NASA has shown that it is literally not possible for a human mind to monitor a system for a prolonged period, ready to intervene at any moment; after a while you tune out. That’s specifically why Waymo never pursued Level 2.
Perhaps a lifetime of watching my rear view mirror, scanning traffic up far ahead, keeping an eye on drivers next to me, etc, has made those tasks mechanically low-effort. I find them more engaging than exhausting.
However, on long road trips, autopilot definitely reduces fatigue for me far more than standard cruise. I'm hoping now that FSD is on interstates, it'll be even better. (Assuming they've fixed the issue where the new minimum speed signs in my area are being read as speed limits.)
Maybe; obviously it’s unfalsifiable on the internet. The only objective signal I can provide is that I’ve driven hundreds of thousands of miles without a single incident, accident, insurance claim, ticket, warning, speed camera infraction, parking ticket*, or any interaction with law enforcement.
Is that enough to objectively say anything? No idea. But my assessment of myself is as a safe driver with strong self-preservation instincts, and I’m far more comfortable with FSD than I could have imagined myself being prior to getting this car.
(* well, I got a warning from a college library parking lot once, but that was a calculated cost vs. buying a permit. I won that gamble.)
"I've driven drunk many times, and I've never been in an accident. It’s beyond confusing to be constantly told it's an unsafe way to drive. The only explanation I can come up with is that there is either a well-motivated group that _wants_ drunk driving to be unsafe."
I look far ahead, look in the mirrors, look for anything that might trip it up. Basically all the things the state’s driver’s handbook says you’re supposed to be doing, but 80% of the time instead of 20% because the basic “follow lane lines, don’t hit the car ahead” task is delegated to the car.
It feels much like cruise control on a freeway; the brain cycles you’d spend monitoring and controlling speed are allocated elsewhere.
I’ve developed the same hook in my brain for “why am I lanekeeping manually” as I used to have for “why am I controlling speed manually”. It’s a useful task offload.
> It’s beyond confusing to be constantly told it doesn’t
I don't think I've seen any assertions here (or elsewhere) that FSD doesn't work in rogerrogerr's car. I've seen LOTS of assertions (and plenty of video evidence) that it doesn't work in other cars, and that it behaves unpredictably.
This is why data over many samples is important. Maybe you've got an environment where it happens to work really well. Maybe it's probabilistic and you are the person who keeps rolling well on your random dice rolls.
There's plenty of evidence the system is not generally safe, even if your personal anecdotes are that it is.
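To make the dice-roll point concrete, here’s a toy Python sketch (all rates invented purely for illustration, not measured): a fleet can average one intervention every ~13 miles while most drivers still see none at all on a 200-mile trip, if the risk is concentrated in a minority of environments.

```python
import math

# Invented split: 80% of cars in "easy" conditions (1 intervention
# per 500 miles), 20% in "hard" conditions (1 per 2.7 miles).
easy_rate, hard_rate = 1 / 500, 1 / 2.7   # interventions per mile
p_easy, p_hard = 0.8, 0.2

fleet_rate = p_easy * easy_rate + p_hard * hard_rate
print(f"fleet average: 1 intervention per {1 / fleet_rate:.1f} miles")  # ~13.2

# Probability of a zero-intervention 200-mile trip (Poisson P(k=0)),
# averaged over the two groups.
miles = 200
p_clean = p_easy * math.exp(-easy_rate * miles) + p_hard * math.exp(-hard_rate * miles)
print(f"drivers with a clean {miles}-mile trip: {p_clean:.0%}")         # ~54%
```

So an individual’s clean record and a grim fleet-wide average aren’t necessarily in conflict, which is exactly why per-driver anecdotes can’t settle the question either way.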
I have fairly deep expertise in the area: master's-level work in machine vision, many years of robotics at Unimation, IBM, and CMU, and currently hacking on LLMs. I have programmed for 53 years.
FSD is "indistinguishable from magic".
This thing drove from my house to the destination and back without a single event. It has never seen either path. It handled stop signs, traffic lights, country roads, a flock of birds in the road, highway entrance/exit, directional traffic lights, traffic merging, speed adjustments, lane changes, rain, etc. I am beyond impressed.
As for the "safety issue", my Y has airbags on the steering wheel, knee airbags, and "curtain airbags" on the windows and seats. The Y has the highest safety rating available. It won't roll over easily. The passenger compartment is well isolated from front/rear collisions. Plus it apparently has emergency braking if a collision is likely.
In all kinds of circumstances, even humans get confused. Two weeks ago I had a turkey fly across the hood of my car. I didn't know turkeys could fly; I thought it was a boulder at first. So FSD can't be perfect. But it is damn good.
The article mentions that Tesla now claims they will show a prototype of a self-driving taxi on October 10. After four years of postponements.[1] After fakes going back to 2016.[2] At a movie studio, not on the road.
The red-light-at-night "failure" [1] is a bit suspect: the car was already in the intersection but was blocked by the stopped car in front of it. When the car in front started moving, it moved too, which is the legal and correct thing to do since it was already in the intersection.
The car should not have entered the intersection if it could not exit it, and the video pretty clearly shows it waiting to enter. Note the dashed red line showing it was correctly waiting to enter, and only choosing to enter after the light was red.
While far from perfect, nothing stood out as egregious after watching all three videos. Yep... definitely not perfect, and still room for improvement. But this headline feels pretty sensational when backed up by the three (poorly) linked videos.
It's amazing how one person changing political opinions destroyed the widespread faith in "self-driving" that existed just a few years ago.
Five years ago, saying "I doubt self-driving trucks will destroy the careers of human truck drivers" was enough to get one burned as a Luddite. Now, not only can it be admitted that the systems that exist require human help; doubts about them are required in civilized conversation on the subject.
It's almost as if the facts of the matter are unrelated to the conversation, and the important point is enforcing tribal signals.
Tesla's "Full Self Driving" is most definitely NOT full self driving --- according to Tesla.
As further proof of this point, reports are that Tesla has been training a car to drive around a selected movie studio lot for their upcoming robotaxi demo.
And all indications are that this staged production will work as planned to keep investors engaged in the fantasy that has been ongoing for over a decade.
For a realistic robotaxi demo, let's put Elon in the backseat and ask the car to drive him to a randomly chosen destination 10 miles away and see how it goes.
But this won't happen. Because FSD is NOT fsd --- and it is blatantly obvious that Elon knows this as well as anyone.