There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.
If this were a planned move, the software would already be ready to replace it. It is not. Cars without radar still aren't at parity with the cars that have it, although they "fixed" this a few weeks back by simply turning off the radar on the older cars.
The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.
The cheapest bargain-bin cars have this feature. Your $100k+ Model S or X will not.
Even more concerning: some features simply cannot be replaced by cameras. Tesla infamously does not have a front bumper camera, making it impossible to detect obstructions occluded from view by the hood. For the 3/S, which sit low, this means it will be impossible to detect low obstructions like too-tall concrete wheel stops. Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
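To put rough numbers on that occlusion, here is a back-of-the-envelope sketch; every height and distance in it is an illustrative guess, not a Tesla measurement:

```python
# Back-of-the-envelope geometry for the front blind spot: how close an
# object can get before the hood hides it from a windshield camera.
# All numbers are illustrative guesses, not Tesla specs.
CAM_HEIGHT_M = 1.3   # camera height above the ground
HOOD_HEIGHT_M = 0.9  # height of the hood lip the sight line must clear
HOOD_DIST_M = 1.8    # horizontal distance from camera to hood lip

# The sight line over the hood lip drops (CAM_HEIGHT_M - HOOD_HEIGHT_M)
# metres over HOOD_DIST_M metres of horizontal travel.
slope = (CAM_HEIGHT_M - HOOD_HEIGHT_M) / HOOD_DIST_M

ground_visible_from = CAM_HEIGHT_M / slope
print(f"ground is visible only beyond ~{ground_visible_from:.1f} m")  # ~5.9 m

wheel_stop = 0.15  # a 15 cm concrete wheel stop
vanishes_at = (CAM_HEIGHT_M - wheel_stop) / slope
print(f"a {wheel_stop*100:.0f} cm wheel stop disappears inside ~{vanishes_at:.1f} m")  # ~5.2 m
```

With guesses like these, the last several metres of approach to a wheel stop happen entirely inside the camera's blind spot.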
Laughable that this is being done on cars from a high-end brand, and that the remedy for customers who get a worse car today than they would have gotten yesterday is to wait for these features to "be restored" at some indeterminate time in the future.
With the removal of radar, they showed some pretty convincing data that the radar was too noisy to be useful, especially with discriminating things like highway overpasses vs. stopped cars. They showed that vision could already outperform radar.
With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.
> They showed that vision could already outperform radar.
important to note that much like there is a wide range from bad cameras (“filmed with a potato”) to high resolution cameras, there is a wide variety of radars with different capabilities.
the radar unit in Teslas was pretty limited (basically designed for adaptive cruise control), and they showed that vision could outperform that radar (and have no interest in exploring non-vision approaches because “humans can drive with just eyes”)
In fact, there are many, many more signals that your eyes give your brain than just a 2D grid of pixels, which is why cheap off-the-shelf cameras slapped all over a car CANNOT do what two eyes can.
Your eyes also give coarse distance info by having to adjust focus; slight parallax (which depends on a HUGELY POWERFUL and pretty good edge detection and object classification system); an ability to perceive distance from the slight difference in orientation your eyes take to look at something in 3D space; some information about YOUR OWN head's orientation and movement; and a brain with a rough approximation of a simulation of the world around you to constantly check/verify any info against.
The brain seems to make most of the resolution up; the eyes really can see only a small area in detail. The rest is filled in by object permanence.
https://www.shadertoy.com/view/4dsXzM
Modern sensors combined with computational photography does seem to be changing this though.
For example, I now use my phone camera to see where I'm going at night if it's dark, and to read distant text. This is despite my (corrected) eyesight being better than 20/20 and my night vision being better than usual.
The eyes are fantastic organs, but the thing that makes them exceptional is the brain. Computers seem to be catching up with that.
Flashlights have come a long way. I much prefer that to using my phone, especially on Android, where opening the camera can be finicky and turning on the lamp requires a ridiculous number of steps compared to the iPhone. Especially handy if out in the cold with gloves. The Armytek Prime C2 Pro is pretty great.
I do love a good torch, and I keep one on me most of the time.
However, I meant the non-illuminated phone camera. The new low-light sensors combined with AI are incredible. They can see things when it looks pitch black to me.
Also not sure how opening a camera is finicky, just double tap the power button and it opens for me, and turning on the flash takes two taps - though I rarely use the flash.
Why do you have to open the camera in order to use the flashlight on an android phone? The button to turn the flashlight on is in the quick settings panel which can also be opened from the lock screen.
That's not actually the case, from what I heard. Most of our peripheral vision is smudged 720p at best, and we can only see a tiny focused area at slightly beyond 4K density. We also can't look at two things at once.
I'd love to see the convincing data re: radar - they're using the same radars as every other car whose emergency braking can see the car in front of the car that's in front of you. I've not heard of many "phantom braking" incidents from these other vehicles - I may have missed something.
Allow me to take you for a drive in my Volvo V90. I turned the auto-braking feature off entirely because it spits false positives like crazy. Parked cars, signs, birds, cars travelling through roundabouts across my nose, nothing at all - they all trigger it, and the car stomps the brakes extremely hard. The 2017 VW Golf it replaced was FAR better - maybe two false positives in 5 years. The Volvo? Daily.
I have a Model X Raven (2019) and Model 3 AWD (late 2021); both have radar and HW3, and both have the update that disabled the radar.
Pre-patch there was maybe a phantom brake (hard brake) every six months, but lots of small slowdowns where you could almost sense the car got conflicting input (once per 100 km maybe).
The latter is gone and I have not had any phantom brakes yet, but I've had a slowdown when a bus went into my lane. The car's behaviour in stop-and-go traffic is also much better now.
The only regression I've found is that it's a bit more confused in a local spot where the road splits in two in a turn to the right. In the past it went into the right lane without much fuss, but now it is confused for a second or two before it decides on the right lane.
Tbh, as a daily user of these cars I think the no radar update was an improvement.
Also, my experience with the Volvo one (on XC90 tho) was that it would happily plough into cars standing at a red light in front of me if I would let it.
Where do you drive? I feel at least some of it depends on driving style and culture... The rental MB I had in Spain phantom braked a few times a day until I eased off. Other cars in LT and NZ: maybe once a month or less.
And by phantom braking I mean emergency braking a bit too early - i.e. I am perfectly aware I am too close to a car in front, I have foot on brakes and I see how traffic flows, yet the car brakes.
In Sydney, Australia. I’d say I’m quite a conservative, safe driver (I have kids). I don’t tailgate, and actually I’ve never had it false positive on a car driving in front of me. It’s typically on stationary objects, or small moving objects. Or on nothing at all shrug
Hypothetically, Tesla would spin this story differently in no time if they were making great strides on a cheap solid-state radar.
For now, though, it looks like Tesla is building a case after they decided to do away with radar. I recall EM saying that it looks ugly and is expensive too. I really got confused when Tesla came up with this patent though https://twitter.com/iamkellex/status/1534240730633236480?s=2...
No one sensor will work for all use cases; each has pros and cons. Radar really shines at depth sensing; it can cut through rain, fog, and snow like a hot knife through butter, much better than pure vision-based systems. At the same time, it seems to fail during harsh braking and maybe a few other scenarios.
The ability to cut through rain, fog, and snow is precisely why so many companies use radar based systems - that's arguably when they are the most important and useful.
Leaning further into their vision systems when their vision systems cause their cars to slam into emergency vehicles that aren't even in travel lanes, and to rear-end tractor trailers, isn't encouraging. Musk is placing the rest of us at risk and I'm sick of it.
I get phantom forward collision warnings on my car quite often, a 2015 VW Jetta TDI SEL. There are a few places where it triggers ~1/3 of the time while driving on a clean road with no other cars around in either direction.
The system doesn't trigger braking, just a warning beep and a dash warning message, but all these systems are unreliable at this point; even the newer ones have the same false positives and false negatives.
It does make sense to err a bit on the side of caution for warnings. With warnings, a false positive is better than a false negative. It’s better to occasionally annoy the driver (within reason) than fail to warn for an accident.
This isn’t the case with braking. You want to have a high degree of confidence in that scenario because you don’t want to cause an accident. It’s better to react later to an impending collision than it is to cause an accident that would have never happened.
I got a ton of phantom braking just two weeks ago when I rented a 2020 Toyota Corolla with this feature. I've also gotten it in a 2019 Tesla Model 3, of course.
Anecdotally I've been driving a 2020 Toyota Corolla Hybrid for around 2 years now and haven't ever experienced phantom braking. I have driven it all around Australia (Melbourne, Sydney, Coffs, Brisbane) and the driver assist features have been nothing except rock solid reliable.
I have noticed though that it is very timid when people are leaving their lane, and it takes a few seconds to decide that a car has fully left its lane. It will brake as though the car is still in your lane well after it is gone. I'm not sure I'm going to hold it against Toyota, because I'd prefer it to be timid in case the other driver does something unpredictable like change their mind. If you press the accelerator pedal lightly when you notice someone leaving the road, the software will cancel out the braking.
I drive a 2018 Hyundai IONIQ EV. The forward radar is hit-and-miss in heavy rain: the car will often barrel quite happily into stopped traffic when it’s raining hard, requiring a stomp on the brake pedal.
Just a couple weeks ago I was driving on the motorway around 4am, no other traffic for miles, and the forward collision alert sounded and the car slammed on the brakes for about a quarter of a second, then released as if nothing had happened and we merrily resumed our journey. Having the brakes slam on at a smidge over 100 km/h gave me a sore shoulder and quite the rush of adrenaline at 4am!
Not braking per se, but on all cars I've driven recently (i.e. a Seat Ateca, our BMW F10 5-series, and maybe 2 more a bit longer ago) there is sometimes a collision warning in situations where no collision is about to happen.
The Seat was pretty bad at this; I had it rented for 2 weeks in Sicily a few weeks ago, and the first few times all the screens started flashing like crazy in a normal situation it was very distracting (since I wasn't sure what the heck was happening, but it looked sinister). The BMW is pretty subtle with this, and it has only happened a few times in the past year.
I imagine if this warning system were also connected to the brakes, bad things would be happening.
They were using the radar for lane guidance. That's why they had to have a whitelist of locations where the radar was falsely triggered by the roadside environment. It never actually worked properly and they just papered it over with a hack to temporarily blind their input.
Radar has a longer wavelength than visible light. Thus making it better at some stuff and worse at others. It's a crime to not be using as much of the EM spectrum as possible for an application like this.
Some pretty convincing data that they handpicked to make themselves look good and to pretend they don't need radar or lidar, and that vision alone is good enough.
Perpetuating a pattern of lies from Tesla, ever since they started on self driving.
You don’t need to take Tesla’s word for anything. Their latest vision stack is in the hands of ten thousand customers, some of whom regularly post their drives on YouTube.
But that’s the thing. Tesla wasn’t able to tell when the radar was “uncertain”. So when do you trust one data source more than the other? If it was so easy to label misinformation as such, the misinformation would probably not have been communicated in the first place. Tesla explained this at AI day 1.
If the vision stack is just below its certainty threshold that a car is coming across at 63mph at some angle and can't quite decide to take action on its own, and at the same time radar indicates a car is coming across at the predicted angle and speed, that should push it over the threshold, even if it might get false positives from overhead bridge traffic. Lining up with the vision estimate makes that less likely, and the vision stack itself can also be used to exclude data that might be from an overhead bridge by detecting that there is no bridge nearby.
No sensor is certain and can be blindly trusted - the data is going into a neural net anyway so the computer can be trained to understand what the most likely ground truth is given all sensor data.
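A toy sketch of that arbitration idea; the threshold, the confidence bump, and the function name are all invented for illustration, not anything Tesla has described:

```python
# Let a noisy radar track nudge a vision detection over its action
# threshold only when the two agree, instead of discarding radar
# outright. Numbers are invented for illustration.
ACT_THRESHOLD = 0.90

def should_brake(vision_conf, radar_agrees, vision_sees_overpass):
    # Vision alone is decisive when confident.
    if vision_conf >= ACT_THRESHOLD:
        return True
    # A corroborating radar return adds confidence, but only if vision
    # can rule out the classic false positive (an overhead bridge).
    if radar_agrees and not vision_sees_overpass:
        return vision_conf + 0.10 >= ACT_THRESHOLD
    return False

print(should_brake(0.85, radar_agrees=True, vision_sees_overpass=False))  # True
print(should_brake(0.85, radar_agrees=True, vision_sees_overpass=True))   # False
```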
> With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.
I'm especially worried by dark garages / parking structures. Tesla's cams already have enough problems when it's too dark outside.
> They showed that vision could already outperform radar.
Except on emergency vehicles parked in a lane at an oblique angle, which Teslas did not recognize, and plowed into at speed. I wonder what unknown secondary effects this change will bring.
> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
I have observed that this is how vision is currently implemented, but does it have to be this way? I can pull up to a concrete wheel stop in my Toyota without vision and without the sonar enabled, even though my eyesight is occluded by the hood, because I know where it was and how far I have moved. Concrete wheel stops do not flicker out of existence when you stop looking at them; the car should be able to monitor the wheel speed sensors and shift the 3D map of the world into the camera blind spots, perhaps showing a "hidden" wireframe on the cameras (a rough sketch of this idea follows below).
It would be inconvenient if you were unable to pull forward because a tumbleweed or (more likely) plastic bag rolled through, but you could back up and try again or the human could decide to ignore the beeping.
Granted, I'm not a domain expert, but we do this in my field of industrial robotics when building models with 3D vision. The computer can composite multiple profiles into a single higher-resolution image, can return data about that model when the camera on the EOAT has moved such that the field of view is limited, and can provide faults when the model does not match a previous image because something has been added or removed while the camera wasn't watching.
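A minimal sketch of that dead-reckoning idea, assuming hypothetical wheel-odometry inputs; every name here is made up for illustration, not a real automotive API:

```python
import math

class ObstacleMemory:
    """Persist obstacles seen by the camera and dead-reckon their
    positions from wheel odometry once the hood occludes them.
    Car frame: x forward, y left, positions in metres."""

    def __init__(self):
        self.obstacles = []  # [(x, y), ...]

    def observe(self, detections):
        """Replace remembered obstacles with fresh camera detections."""
        self.obstacles = list(detections)

    def on_odometry(self, distance_m, yaw_rad):
        """Per odometry tick: the car drove distance_m forward and
        rotated by yaw_rad, so every static obstacle moves by the
        inverse transform in the car's frame."""
        c, s = math.cos(-yaw_rad), math.sin(-yaw_rad)
        self.obstacles = [
            (c * (x - distance_m) - s * y, s * (x - distance_m) + c * y)
            for x, y in self.obstacles
        ]

    def nearest_ahead(self):
        """Closest remembered obstacle in front of the bumper, if any."""
        ahead = [o for o in self.obstacles if o[0] > 0]
        return min(ahead, key=lambda o: o[0], default=None)

mem = ObstacleMemory()
mem.observe([(3.0, 0.2)])   # wheel stop seen 3 m ahead
mem.on_odometry(2.5, 0.0)   # creep forward 2.5 m; camera now blind
print(mem.nearest_ahead())  # -> (0.5, 0.2): still 0.5 m to go
```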
The car parks and turns off, with no obstacles in front.
While parked a child/dog/bucket of concrete materializes directly in the front blindspot.
Next time the car drives, it has no knowledge of this obstruction.
This isn't a huge problem for a car driven by a human, since you can and should check around the car before driving, or will have other context clues that there is a child or dog or whatever running around your car, but this seems incompatible with the stated goal of making all current cars into Robotaxis.
Lol, that's horrendous. I had a Mercedes with automatic parking assist back in 2016 that would do it perfectly 10/10 times, and I used it all the time too (narrow streets in the UK). What's shown in the video is just hilariously bad.
Old Teslas used Mobileye systems, which power half of the automotive world. Tesla later started to abuse it, Mobileye got angry, and they stopped working together. Since then Tesla's tech has been in-house, and in many aspects it is still behind a 10-year-old product.
I don't use the automatic parking, but I find the ultrasonics quite useful in my own parking, maybe I've been more lucky or just more hyper careful, but I've never scratched the front or rims on a curb. The feedback about the shape of the obstruction and the distance it is away from you is great, much better than my 2014 era vehicle. I would like to see some lower sensors, maybe a single pixel lidar to look for curbs or something, but I think that is going the wrong way with component count for them.
To just say that these features, which are pretty standard on any other car at this price point, are "coming soon" is laughable given Tesla's delivery cadence. I'll wait for vision-based parking; I'm sure it's coming right on the heels of full self driving in 2019 (not here), smart summon in 2019 (delivered in what, 2021, and the only thing I've seen it do is nervously back out of a spot then try to merge into a surface street instead of picking me up), the Tesla Semi, the Tesla Cybertruck, and now a robot, which is hilarious because they couldn't even get the million-dollar industrial robots to work on their line. Can't say I've got a lot of faith in some $20k robot from Tesla.
No, they overpromise and basically just don't deliver, why anyone would take Tesla's word for this I don't know.
Having slightly more than average knowledge of industrial robotics (researched the industry as an outsider): it turns out million-dollar industrial robots are kind of like enterprise software. You "buy" an ERP/CRM/etc. but it doesn't just work out of the box; there's weeks and months of work-hours needed to get it actually integrated into a company, and it still might not work, since software has bugs, humans are fallible, and huge software with endless features sometimes forgets which ones still work properly in combination with others.
Industrial robots are kinda like this too. You can buy an arm based on physical specs like how large it is, how precise it is with certain weights of objects (because it may be less precise with larger objects), how quickly it can move around, how quickly it can get from precision point A to precision point B (because precision and speed are a tradeoff once weight is involved due to momentum), how much power it uses, is it hydraulically powered or electrically powered, available off the shelf end effectors provided by the manufacturer... etc.
Then you have to install it, as in mechanically bolt it to the ground (which might have issues with load cantilevering); mark safety areas humans can't be in when it's operating (which could require redesigning the entire floor plan of a building, depending on how much spare room there is around everything, because people still have to get around to other important things while the robot is doing its job); then you have to program the robot (which can in some ways be simple xyz coordinate motion, but you also need to tie it into some form of process control software so it knows when to do its job and other things around it know when to do theirs, and process control systems are software, and may not be compatible, or have bugs, etc.); then you have to maintain it (spare parts, breakdown rates, warranty turnaround times; industrial equipment is sometimes used in ways that wear it out at a rate that was planned for but not lived up to, requiring replanning or additional costs).
It's a web of complexity that can turn what seems simple ("just buy a robot to lift this thing from here to there and then buy another robot to weld it to the car") into a project that takes piles of money and months of work-hours from multiple disciplines, from construction to programming, where anything going wrong dominoes down the line, making it take even longer and cost more...
I wouldn't hold failing to get industrial robots working against them. Feel free to continue holding all the other opinions though :) we're all entitled to our opinions after all... I just had something relevant to say about the robots.
As an insider of the automated manufacturing space: your ERP analogy is good but missing one detail - every other large company eventually gets an ERP working. Tesla is an outlier in regard to their manufacturing processes; even much lower-margin industries tend to design new factories around the automation platform, yet Tesla just... doesn't?
I've never worked with Tesla so have no inside info on their situation but their factory designs have always seemed strange to me - like they refuse to follow industry norms.
Not entirely invalidating your point, but it's worth keeping in mind how many organisations will publicly declare an ERP project "finished successfully" in various PR channels, while staff are stuck shouldering the burden of maintaining duct-tape-and-baling-wire hacks for a few years, because the ERP that was "finished successfully" has in reality only met 40-80% of its original goals, the project having been either quietly rescoped or just "finished early to the satisfaction of all parties" or any number of other PR bullshit phrases.
I've personally witnessed a couple of "successfully deployed ERP projects" which were in reality complete shit shows, to the point where lawsuits were considered, and it was only the good old-fashioned grease of the professional consultants (who started getting worried this might haunt them in the next job after they quit) that kept the gears turning and stopped things getting that far. One of these was barely 50% of the originally promised scope, downscoped twice and "successfully completed" 5 years before the internal teams who took over, when the consultants were kicked out, gave up and told management they couldn't deliver and a new solution should be put to tender.
Oracle and IBM and SAP and the like may be able to always end up delivering, but plenty of clients will walk away partially down the road due to going bankrupt or merging or any one of a number of outcomes because the massive vendors that will “always deliver” might take so long to get there that you can basically just call it what it is, a practical failure… if an ERP is installed in a company but no one ever uses it, was it even deployed?
More silly than that. It’s like saying they can’t get ERP from Oracle or whomever to work, so they are going to build their own ERP from scratch, it’ll do more and be 1/1000 the price.
This. In a previous life we were asked to install a motion platform for a vehicle simulator. Our customers had a small hangar-like building with a free corner, and a research budget that could easily afford the hardware required. But the installation required floor safety markings, physical barriers and interlocks so that bystanders couldn't get whacked and integration with the building fire alarm system so that in an emergency the sim would park in a safe position so that the trainees could exit. The latter required software development as well.
Thanks for this info about industrial robots, I suppose I can understand this difficulty. I have heard a rumor that when they were having trouble getting the line robots to work properly, Elon personally came and sat next to the people who knew how to program and run the robot, grew frustrated with them, and fired them one by one until no one on staff could program the robot… so the difficulty might have been magnified at Tesla. Who knows if that’s true, but I can definitely see that fitting his personality.
Does anyone expect anything else from any corporation? Of course they are not gonna say "I'm sorry, we had to make your experience worse", of course they will try to spin it into something positive. Almost every for-profit company does this. Not to take away that it's shitty, but it's reality.
> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.
My 4 year old Ford Focus doesn't show an outline of obstacles around the car, but still provides Park Assist and Autopark. Especially Park Assist is something that everyone expects from every reasonably equipped car nowadays. So yeah, maybe understandable if it's due to supply difficulties, but still a bad move...
It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.
> It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.
Don't worry, it won't be a problem as long as they can keep the marketing budget up.
Their marketing is mostly Elon Musk. He uses Twitter for this, and it has cost him/them quite a bit, from fines by the SEC to potentially 45 billion USD for the purchase of the platform.
> There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.
I like the idea of coalescing multiple sensors into one, but I can't shake off the idea that relying on vision alone when you can sense depth through ultrasound or LIDAR is a terrible idea. You can fake depth data from multiple cameras, but it takes more processing, and additional input should be a good thing. Did anyone ever try to fool a Tesla using the Looney Tunes painted-tunnel trick?
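For reference, faking depth from two cameras is only a few lines with off-the-shelf tools; this sketch uses OpenCV's block-matching stereo and assumes a rectified grayscale pair on disk plus made-up calibration values (not anything resembling Tesla's pipeline):

```python
import cv2
import numpy as np

# Two rectified views of the same scene, a known baseline apart.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block-matching stereo; compute() returns fixed-point disparity (x16).
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # pixels

FOCAL_PX = 700.0   # hypothetical focal length in pixels
BASELINE_M = 0.12  # hypothetical camera separation in metres

valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]  # z = f*B/d
print("median depth of valid pixels:", np.median(depth_m[valid]))
```

The "more processing" part is real: matching costs grow with resolution and disparity range, and textureless surfaces (like a painted tunnel) give no reliable matches at all.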
> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
Any sensible navigation software should "know" objects don't cease to exist. I hope Tesla's does it.
People who have never driven a car worth more than $50,000 love the luxury feel of their Teslas. It's been really interesting to see how many people had a Tesla as their first expensive car.
We've had the complete opposite experience with our BMW->Tesla move. Sure, the BMW had a very nice interior; that complaint gets a ton of play online, and it's not inaccurate. But you know what just blows that away? Getting into a car on a 100 degree day and, instead of scorching my skin on the leather like I would in our old BMW, the seat is cold because I tapped a button on my phone 5 minutes before, when we were about to head out. That is real luxury, and I appreciate it so much every single time. I stopped noticing the interior of the cars maybe two weeks after I bought each. Maybe that's just me.
And the thing is just automatically at a completely perfect temperature when we get into it in the morning. The automatic seat warmers that work in concert with climate control are a newish software feature, but wonderful.
Recently, one of our tires developed a bubble in the sidewall due to a curb impact, Tesla sent out a truck and they changed the tire inside our garage a couple hours after we noticed it and told them about it in the app. Not needing to interrupt my day to deal with the issue was a real luxury.
Music that picks up right where you left off when you come back.
A really efficient heat pump that makes it almost guilt free to just sit in the car and leave the AC running. All the things it enables, like camping mode! AC temperature keep mode!
Voice commands that actually work well.
A unified interface that controls everything about the car, and allows the voice commands to do everything from temperature changes to music searches to activating child window locks because our toddler needs to stop rolling down the windows and I can't take my eyes off the road to hunt for a setting (the voice commands obviated my biggest worry, the loss of physical controls).
There are just so many nice little touches that just work, and add up to make the experience much better than a BMW with carplay. I miss the exhaust note, the hardtop, the nice leather, fit and finish, but I would still choose the Tesla 10/10 times. And that would be the case against any legacy carmaker vehicle, except to go the complete opposite direction with full physical controls, basic radio, and no infotainment system at all. The legacy carmakers are still quite good at bolting together the mechanical components.
Interestingly, all those features you mention are also available on BMWs from recent years. Your comparison is not fair if you compare against an old BMW.
On the other hand, on all the Teslas I've tried I missed basic things like proper traffic sign recognition, proper adaptive headlights, proper automatic windshield wiper activation, a safe lane keeping assistant, etc. Those were unusable and unreliable on all the Model 3s and Model Ys I've tried. Not to mention how helpful the 3D view is for parking, also not available in any Tesla.
Are they all standard features on a BMW, or available as a dozen optional, rapaciously expensive add-ons? Or a subscription like the seat heaters? ;-)
I’ve heard a lot of complaints about the automatic wipers, but mine have always worked well, better than they did on my BMW. Maybe one of the software fixes in the past couple of years did it?
Unsure about the headlights, they seem to switch on at the appropriate times, but I’ve turned off auto-highbeams.
Autopilot’s been quite good for us, but we have a radar equipped model, can’t speak to vision-only, though my cousin has complained of phantom braking with their vision-only. The speeds listed have always been accurate, but perhaps you’re referring to a different use of sign recognition?
And it buzzes appropriately when it thinks I’m drifting out of the lane without a blinker.
So I guess I’ve just had a very different experience than you for whatever reason. It’s not perfect by any means, but it’s the best I’ve seen, and I’d buy another if I needed a new car.
You should give the new BMW i4 a test; it's probably a much better car than the BMW you had. Build quality of BMW combined with the advantages of an electric car, without the "features" of Elon Musk as discussed in here.
There was a fun stat that Model S cars, when they first came out, had by far the largest proportion of first-time-in-this-price-range buyers. I can't fathom that someone coming from their 10th S-Class would characterize Teslas as luxury.
This is the thing people don't realize: as someone who has owned a luxury car before, the Tesla Model 3 is not what I would call a "luxury car", despite it playing in the same price space as the BMW 3 Series.
It's still one of the best cars I've ever owned, but the fit, finish, and feel are not luxury. My 2013 Ford Focus felt more solid than the Model 3 I own.
When you have a several-month waitlist for your product and legions of devoted potential customers, and if you're head and shoulders above any of your competitors, you're able to get away with things like this.
> if you're head and shoulders above any of your competitors, you're able to get away with things like this.
The hubris of Tesla is still believing this to be the case. I personally cancelled my Cybertruck order from lack of interest. Better competition exists in 2022 than was the case years ago.
Teslas outsold every other manufacturer (ICE or otherwise) in many countries last quarter and they are still severely supply constrained; your personal anecdotal data point doesn't serve much purpose in the broader context.
But the best selling EV manufacturer in Norway is Volkswagen. Volkswagen has four EVs in Norway's top 10 sellers: VW ID.4, Skoda Enyaq, Audi Q4 e-tron, and the Audi e-tron.
The VW ID.4, Skoda Enyaq, and the Audi Q4 e-tron are different versions of the same car. Rather than selling just one model Volkswagen's approach gives you the choice of more options.
Add in ICE vehicles and it's not even close. Volkswagen sells 10 million per year globally across all of its brands (VW, SEAT, Cupra, Skoda, Audi, Porsche, Lamborghini, Bentley, Ducati, Scania, MAN, etc.).
One of the weird things about Tesla is that, in general, supply side constraints are widely regarded as a flashing warning light that a firm can’t execute. Yet Tesla is just given an “I’m sure they’ll work it out.”
I've yet to be in a country where sales are not skewed toward local cars. Sweden: lots more Volvos than anything else (and definitely more than Teslas); France: lots more Citroëns and Peugeots than anywhere else; Australia: Holden and Ford; US: lots of cars that drive nowhere else (GM, Chevy...).
Regarding outselling most other car manufacturers: this seems to be the Tesla reality distortion. It is likely true for electric cars, but across all cars VW delivered ~5M in 2021 vs. Tesla's 900k.
Australia with its long drives if you leave a city and lack of infrastructure doesn't suit electric cars at the moment. Will change eventually, but likely lagging the rest of the world.
> Australia has no local car manufacturers and hasn't for some time. Toyota is by far the biggest car seller, followed by Mazda and Hyundai.
I know Toyota closed its Melbourne plant in 2017 (Holden and Ford had stopped manufacturing even earlier, IIRC). I would argue, though, that Ford and Holden (which I just read ceased operating in 2020) are still perceived as Australian car manufacturers by most. They definitely had some cars specifically for the Oz market.
"Car leases trough employers are only for German cars" - that is not true, as a German company you can lease whatever car you want for your employees. They are predominantly German cars for the obvious reasons. :)
While there is a preference for the VW group in the corporate leasing market, they are far from the only ones. Your first argument doesn't make any sense whatsoever, though.
“In Q3, we began transitioning to a more even regional mix of vehicle builds each week, which led to an increase in cars in transit at the end of the quarter. These cars have been ordered and will be delivered to customers upon arrival at their destination” - [0]
In other words, they said all of the cars had been ordered, although many were in transit at the end of the quarter.
I mean, isn't that the same thing for traditional dealerships? They order the cars and they're waiting to be sold. A good number of Teslas end up rejected for whatever reason and show up on the existing inventory page. You can often find new cars in SoCal with delivery that week. I got my Model S that way.
I'm pretty sure you can't stuff your supply chain like that... Sunbeam got caught doing that once, so I'm pretty sure it's a major no-no. If it's ordered, then it's ordered by someone, not by a dealer, etc. Plus Tesla has no dealers except themselves, so at most it'd be a bunch of rejects?
> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this,
Perhaps the occupancy network is a good replacement? You're characterizing this as if Tesla Vision won't ever be able to replace it: "impossible to detect obstructions occluded from view by the hood".
What if the neural net persists where everything in the world is, and can extrapolate where objects are as the car moves? Then it's better than plain cameras: blind spots would actually be reduced.
I agree it's a dick move to remove it on cars that were already ordered though. Very Tesla.
What happens when a small object rolls into that blindspot? What happens when the car goes into deep sleep and the AP computer state is lost? Reloading all those nets takes like 40 seconds on startup from deep sleep.
It would be possible to use cameras instead of ultrasound, but that would require mounting a lot more cameras in the dead zones, as you said. One of the coolest things I've seen was on a BYD, which has cameras mounted pointing downwards all around. It then creates a fake bird's-eye view when you are parking or close to an obstacle. It looked really convincing, as if there were a drone above the car.
I mean damn that looks legit. I'd like to see how it performs at night when you're trying to park in some unlit parking lot, but otherwise that's a pretty amazing projection right there. The dream of driving in 3rd person view is finally here.
I had a Lexus ES 350 loaner the last time my (22 year old) Lexus was in the shop- it had the same feature you describe in the BYD. There was obviously some stretching/warping trickery being done to create the birds-eye image, but it was really impressive regardless.
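The "stretching/warping trickery" these comments describe is essentially a homography: warp each downward-facing camera's image onto the ground plane and stitch the results. A minimal single-camera sketch with OpenCV, using hypothetical calibration points:

```python
import cv2
import numpy as np

# Warp one camera view onto a top-down canvas. The point pairs are
# hypothetical calibration data: where the corners of a known ground
# rectangle appear in the camera image (src), and where they should
# land in the bird's-eye canvas (dst).
img = cv2.imread("front_camera.png")

src = np.float32([[420, 560], [860, 560], [1100, 720], [180, 720]])
dst = np.float32([[300, 0], [500, 0], [500, 400], [300, 400]])

H = cv2.getPerspectiveTransform(src, dst)
birdseye = cv2.warpPerspective(img, H, (800, 400))
cv2.imwrite("birdseye.png", birdseye)
# A real surround-view system stitches four such warps (front/rear/
# left/right) and blends the seams, which is why it looks like a
# drone hovering above the car.
```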
So... your first sentence is very likely true. But it doesn't match your argument two paragraphs down:
> The cheapest bargain-bin cars have this feature. Your $100k+ Model S or X will not.
If cheap "bargain-bin" cars have these things, then there is no supply chain problem. In fact it's likely these parts aren't available in quantity anywhere right now and every manufacturer is having to deal with it. It's just not news when Ford has to rejigger options on a bunch of models to adjust to demand. But with Tesla, yikes.
It's just so exhausting the level of emotion this brand drives. Every decision they make is An Affront to All Sensibility to someone, it seems like.
FWIW: I like the parking sensors too, they're helpful. But good grief, maybe tone down the outrage?
Ford doesn't sell cars directly. Dealers get what they get, and if this month's new cars don't have Package A then they won't sell that. This happens all the time; just go to any auto manufacturer's site and configure the car you want, then call your local dealers to find one. I can all but guarantee you'll find some option package that is supposed to be available but isn't anywhere you can buy it.
And that's my point. No one cares about that, it's just not news. It's "the way the auto industry works". Except for Tesla. Then it's an existential disaster. It's just exhausting.
My understanding is that people had already paid for a specific configuration, and while they were waiting for delivery of that particular car/configuration, it got changed out from under them? This understanding of how this affects existing orders could very well be completely wrong... but if it's roughly correct, then it's more analogous to not finding the desired config at the dealer, special ordering it, waiting 3 months for it to arrive at the dealership, and then when you get in the car they say "oh, by the way, we didn't have X Safety System, so your car doesn't have it."
I think people would justifiably throw a hissy fit about that.
> I think people would justifiably throw a hissy fit about that.
FWIW, that would be a better argument if the people throwing the hissy fit were the ones who actually ordered cars. Are you waiting on a Tesla, for example?
Instead it's the peanut gallery of folks who just want to flog a point about the manufacturer out of some kind of emotional response. Like I said, it's exhausting.
Elon attracts the attention he does, both positive and negative, because he's regarded as a pioneer. IMO the general public does have an interest in trying to ensure 1) unsafe cars don't proliferate across public roads and 2) irresponsible business practices aren't celebrated.
Do you think the only people who had a right to be pissed off about Lehman Brothers' collapse were actual Lehman Brothers customers? This is all well within the public interest.
This is exactly my point. We started from a discussion about the availability of ultrasound sensors that the overwhelming majority of new cars don't even have and now we're on to "Tesla is unsafe" ... "Making changes to car configuration is an irresponsible business practice" and (sigh) "It is in the public interest that I be pissed off about this".
It's just exhausting. Can't you find something more productive to be outraged about?
I don't know where you get the impression I'm outraged? I figure this link is posted on HN because people like conversing about fast-growing companies and their practices, not because it's actually Tesla Owners Car Club News.
It seems extremely odd to come into a discussion then complain about how exhausting the discussion is and then stay in the discussion complaining about how exhausting it still is.
Also note: Most companies don't claim their cars are fully self driving (even with a Beta tag attached), so the analog to other cars missing these sensors is pretty obviously disingenuous.
Not to be rude, but this is so far off base as to be laughable. You don’t have to understand much about how the system works (really no more than what is contained in the last ai day presentation) to understand why none of this is a problem (specifically the occlusion thing).
The vision-only Tesla was tested by some state agency and has a higher score than the radar one. But only in terms of safety; I guess the software does other aspects as well.
There should be Tesla blog post about it or something.
As a human that has only eyes, I've smashed the underside of my front bumper on many a curb over the years in various cars, because sight alone, blocked by the hood and front end of a car, is a crappy way of judging things like that.
I say this as someone who absolutely loves his Model 3... but a step back is a step back, no matter how one spins it.
Radar-delete was a user experience negative for me as well. My autopilot experience today is far more 'jerky jerk' in stop/go traffic than it ever was in the glory days of radar distancing data, so much so that AP is now banned on anything but smooth-sailing highway driving when I have certain motion-sick-vulnerable individuals in the car. Add to that, it's now (rightfully, given it's lost the radar data) far more touchy about weather conditions before it'll even engage. :(
Your wording, especially the "Not true" puts it in your own voice.
Not to mention, just repeating Musk's statement in the context you did (as a response to someone explicitly disagreeing already) implies that you agree with it.
Your eyes are an incredible set of cameras. They have huge dynamic range, extremely accurate and fast depth sensing capabilities, huge focal distance range, and mounts that move incredibly quickly and precisely to point at details.
There are a lot of good reasons why you shouldn't think that today's cameras are equivalent to human eyes. There are a lot of good reasons to believe that cameras will never be equivalent to human eyes.
Critically, your eyes are connected to a human brain, which is by far the most capable visual-spatial computational system in the known universe. Also critically: even your eyes + brain get things wrong all the time while driving.
I love the "humans do it with only two cameras!" comment. It's so funny to see people disregard the human brain, as if any computer today can hold a candle the the visual processing prowess it contains.
I say this as a happy Model 3 owner: Elon is remarkably stupid (or perhaps outright disingenuous) sometimes.
Not really, both USS and Radar give unreliable output. The cars you mention would all have phantom beeping from the USS and unreliable adaptive cruise where the radar suddenly doesn't see the car ahead.
The Tesla vision on the other hand works like a more focused human driver. It can remember that wheel stop is there.
Agree they shouldn't have jumped the gun and removed them from production before the replacement was ready, though.
While the USS sensors are crap, they probably should keep them for things like small children playing behind cars.
People arguing for just cameras in cars have never been outside the near-ideal road conditions that California offers.
Good luck driving with cameras only in northern countries (Scandinavia, Canada) when the sun is hanging low over horizon and is reflected back at you from the road.
Cars delivered today without USS will not have: "Park Assist: alerts you of surrounding objects when the vehicle is traveling <5 mph." (among other features like Summon)
This is the most basic feature that almost every car has had for years - but not your $60k Model Y. Why can't they wait to remove sensors until they've achieved feature parity with the vision only system? This seems crazy.
I don't really understand the push to do everything through the cameras. This has caused other problems too, like their decision to control automatic wipers with just the camera, which meant that in my Model 3 the automatic wipers were erratic and ineffective, while in every other car with a normal $5 rain sensor the feature worked great.
I guess they think their computer vision and AI can catch up to and exceed the visual processing capabilities of vertebrates with their millions of years of evolution. I think they are wrong. Multi-modal sensor data fusion is the only sane path for the foreseeable future, in my lay opinion.
They are wrong. It's a truly unbelievable coincidence coming across this article: I just got rear-ended by a Model 3 this morning.
Low speed, I'd figure about 10 mph tops: a softball to end all softballs for AEB, and somehow I got hit with considerable force. We've had radar-based systems that would have avoided this since the 2000s.
I work in the self-driving space and while it's not an apples to apples comparison because of cost, all I could think is how our sensor stack would have allowed recognizing the car I stopped for, and probably the car it stopped for, let alone my car stopping.
I've already harped on how stupid the "FSD Beta" is, but I quite literally had the absurdity of it shoved in my face just this morning, only to then come and read this. Why is Tesla even being allowed to run this circus at this point?
My entire point is that even outside of FSD this is a scenario a base model Corolla has the technology to handle: AEB
You don't actively engage AEB, at the kinds of low speeds involved AEB would have engaged and at least helped stop the car.
This was traffic that was going maybe 20 mph, and the car still hit me solidly enough for my car to lurch forward and almost hit the one in front of me.
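For scale, a rough stopping-distance estimate at those speeds, using generic physics rather than any particular car's numbers:

```python
# Stopping distance = reaction distance + braking distance v^2/(2a).
# Deceleration and reaction time below are generic assumptions.
MPH_TO_MS = 0.44704

def stopping_distance_m(speed_mph, decel_g=0.8, react_s=0.15):
    v = speed_mph * MPH_TO_MS        # speed in m/s
    a = decel_g * 9.81               # hard braking on dry asphalt
    return v * react_s + v**2 / (2 * a)

for mph in (10, 20):
    print(f"{mph} mph: ~{stopping_distance_m(mph):.1f} m to stop")
# 10 mph: ~1.9 m; 20 mph: ~6.4 m -- short enough that an AEB system
# with a working forward sensor has every chance to avoid the impact.
```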
-
But as an aside, this is why Tesla's "Full Self Driving" shouldn't be on the market: It is always human error.
Even if FSD was enabled (I hope it wasn't) it'd be the human's fault for not reacting.
> Teslas AEB works but if the driver has their foot on the accelerator it won’t.
Seriously?!? Tesla drivers learn to basically always have their foot on the accelerator, because the car slows quite aggressively with the accelerator all the way released.
IMO the one-foot model makes it quite hard to drive without lurching a bit and without unnecessary accidental light braking while cruising. I often imagine that careful haptic feedback on the pedals could enable a much better one-pedal or blended-braking experience.
I feel that may be irrelevant - my 2019 Honda Odyssey will stop me no matter what. All lane departure/following, cruise control, etc. can be off, but automatic emergency braking / anti-collision will work. I would not have it otherwise. I do not have any automated stuff in my old WRX, but I wanted all the safety always in the car with my family.
(It is crazy to me how many brands only offer the good safety stuff as part of some ridiculous $9000 sunroof-and-leather-and-SiriusXM package/trim ;-<)
It is completely irrelevant, but that's not stopping people from downvoting the anecdote, seemingly because they don't realize (or don't want to accept) the fact that Tesla has regressed so far that they're starting to miss the bar that earlier versions of the same vehicles set with AP1!
Contrary to some of the replies, at low speeds AEB does override the accelerator, unlike the highway ACC component of AP, so it should have engaged.
For the record, I just took delivery of a Model 3 and seemingly all of these safety features can be toggled off in settings. Because of how complex these systems are, it wouldn't surprise me if someone turned off or configured automatic emergency braking because they thought it would make autopilot better or something.
I CAN turn off AEB in my car, but I would NEVER do that, because it has only ever shown itself to be very reliable. It's radar based. I never get false warnings from stationary objects, and I've only had a single warning that I would consider phantom.
The IIHS tests vehicles with automatic emergency braking at 12mph and at 25mph.
When these systems were new, many vehicles were only able to decrease collision speeds. These days, many vehicles completely avoid crashes at both 12 and 25mph.
I tested it a little bit during the 2019 Honda Odyssey test drive. It seemed to work.
FWIW, I then unintentionally had a bit of real-world confirmation: a few months ago a car in front of me in slow traffic made a very sudden and complete stop.
I stepped on the brakes; after a second, as I slowed down, the brake feel completely changed and became harder, and I realized that I hadn't been alone - initially, the car had been braking for/with me, as confirmed by the massive orange letters on the dashboard. Then, as the system felt it was out of the most immediate danger, it released, and it was just me braking (hence the brake pedal feel change).
I do occasionally have it engage stage 1 of 3 (1 Visual -> 2 Audio -> 3 Auto Brake). It happens once a month, predictably on very slight curves when it thinks I may be accelerating into oncoming traffic.
> can catch up to and exceed the visual processing capabilities of vertebrates with their millions of years of evolution
I mean, tons of technology outperforms humans by a mile... plenty of other computer vision systems already outperform people, so I'm not sure why Tesla would be exempt here due to 'evolution'.
We don't have any computer vision system that outperforms mammals, let alone humans, at general vision, Tesla or not - especially on real 3D vision. We do have computer systems that outperform humans at extremely specialized tasks, such as facial recognition in photos.
Again though - these sensors performed a highly specialized task - there is no inherent reason Tesla wouldn't be able to outperform them with a specialized visual system.
The problem of using computer vision to compute distances to moving 3D objects in any weather conditions is too general - it is precisely the generalized vision problem I was talking about.
Ultrasonic or radar or lidar sensors are much simpler solutions to the problem of computing distances, as they do not rely on computer vision.
Note that the reason this problem is very hard for computer vision is that there is simply not enough information in a 2D picture to get 3D distance information, even with parallax. Our eyes also can't solve this problem. Instead, our visual system uses numerous heuristics based on our inherent understanding of simple classical physics and the world around us.
For example, we recognize that a particular blob of color represents a car, and that cars are solid objects, where each part of the car moves at the same speed as each other part (this assumption breaks if something flies off the car). We recognize that objects throw shadows, and what effect shadows have on color, so we can very easily tell the contours of an object even if it's speckled in light and shadow. We also know what approximate sizes cars are, and that means we can immediately differentiate a far away car from a close one. We also know that cars sitting on the road are probably moving, while cars on the sidewalk are probably parked, and that again helps us estimate their movement based on relatively little data.
If you don't believe that depth perception is based on far more than parallax, just try to explain how come people/animals with a single eye can still estimate distances to a very good degree (not enough to be marksmen, but more than enough not to run into walls).
> there is simply not enough information in a 2D picture to get 3D distance information, even with parallax. Our eyes also can't solve this problem.
That is certainly false - perhaps with extremely limited amounts of parallax, yes, but that isn't how this setup works. Humans certainly use heuristics, not only because our brains are capable of it but also because a face only has a few centimetres of separation; with enough parallax you can get phenomenal 3D information out of a visual system. Replacing a single sensor here is certainly not too general for good performance. But again, my point was never to say this is an optimal or perfect solution, only that evolutionary arguments are stupid here, because we build machines and algorithms every day that operate better than evolution does.
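For what it's worth, standard two-view triangulation quantifies both sides of this exchange. With focal length f (in pixels), baseline B, and measured disparity d:

```latex
z = \frac{fB}{d}, \qquad
\delta z = \left|\frac{\partial z}{\partial d}\right| \delta d
         = \frac{fB}{d^{2}}\,\delta d
         = \frac{z^{2}}{fB}\,\delta d
```

Depth precision degrades quadratically with distance but improves linearly with baseline, which is why a roughly 6 cm interpupillary baseline runs out of useful stereo range quickly while cameras spread across a car body can do better.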
Because driving isn't some narrow task. It has a lot of variability in conditions. The human brain is very adaptable, yet humans require 16+ years of training before they can take the wheel.
I'm not disagreeing that the task is difficult but this topic isn't about general driving - this is about replacing ultrasonic sensors with visual systems. Evolutionary arguments are meaningless here since we outperform evolution all the time even with hand engineered visual systems, let alone ML derived ones.
Nah I think it's a logistics and cost thing. Adding several extra sensors dramatically increases the complexity of logistics, building the car and the overall costs.
Because if the hardware isn't available they can't deliver the car and recognize the revenue. They can wait, but if they're going to remove it anyway, it lets them make more cars at lower cost in a shorter time. Seems logical, if it is indeed at parity.
I feel like Lexus is greatly underappreciated as a technology brand. They might be behind the curve on some things, but when they're ahead of the curve, the execution is always superb.
My 8 year old Golf can't drive itself, but it does a heck of a good job at parallel parking. It works very well, but I have to brake and use the throttle.
At the same time, all other manufacturers stopped production until parts were available again, which is the reasonable thing to do. Yet they got ridiculed for how bad their supply chains were because Tesla continued shipping; that Tesla shipped what would normally be considered half-finished cars was blissfully ignored.
That would be an extreme logistical and costly nightmare.
Tesla doesn't operate the dealership model, where dealers tend to double as service stations, so it's far more difficult for customers to get to the Tesla service stations. Since there are fewer of them, that also means there'd be insane waits when the product comes back in.
My 2018 VW GTI locked that functionality behind a several thousand dollar interior package or a few hundred dollar dealer software update :( . I literally have the sensors and software, it's just turned off because I don't like leather seats.
This does not make me optimistic for Tesla's future. Unless this is truly a temporary thing due to a supply shortage and they are planning to return to normal when they can.
Tesla's track record is poor. They cheaped out on a few dollars for an IR rain sensor like everyone else uses, and as a result the wipers on a Model 3 are psychotic. Combine that with their hatred for manual controls, and I had a terrible experience every winter when the rains really started.
Now they don't have radar, which everyone else does. AFAIK they still haven't got that reliable IR sensor. No CarPlay yet, which 99% of all other cars have. No 360 degree camera view. No ultrasonic parking sensors. Why again would I buy another Tesla? The competition has more features.
The cherry on top is Tesla's demonstrated willingness to screw over their customers well after the sale, and not even for the benefit of the company. Just because they hate their customers, I guess. That alone is a deal killer for me now.
I agree that Hyundai is killing it, and I will seriously consider the Ioniq 6 when it is available.
However, I don't think you can discount Tesla here. They produce ~5x more Model 3s than Hyundai can produce Ioniq 5s. And the Tesla battery/motor specs are much better.
I don't see any real competition for Tesla at their price point, specs, and production capabilities. Which is frustrating, because I want an EV but I will not buy a Tesla.
Which price point? Lots of non-Tesla EVs selling for a wide range of price points right now.
Totally with you on the Ioniq 6 — very attractive, and will have better service options should something go wrong. Would also consider Polestar 2. Maybe the Ford Mustang Mach-E. Maybe.
The only thing matching the Model 3's drivetrain specs (350+ miles, sub-4s 0-60, AWD) is the Lucid Air, which is more expensive and not in production yet. The Mercedes EQS matches the range, but is much slower (and more expensive). Everything else has both less range and less speed. The Teslas also have the best MPGe of all the EVs available.
It is hard to deny that Tesla is dominating the EV scene right now. They objectively have the best, most efficient EV platform and the best charging network.
Realistically, my plan is to wait a few years before I buy my EV. I think with supply chains cleared up, battery tech improving, and old auto manufacturers ramping up their EV capacity, the options will be much better in ~5 years. I really hope the Dodge Challenger EV is good.
The right answer to the problem of "Tesla not having enough sensors" is NOT to rush into the hands of a brand who can't make an engine that doesn't grenade itself.
Oh, your engine made it past 50K? Good thing your cars are gonna get stolen since they were too cheap to include immobilizers!
Do you own an Ioniq? I am looking at EVs now, and on top of typical things like range, reliability, etc. that you need to worry about, I also don't want a make that is moving into "heated-seats-as-a-service" territory. So far I haven't heard about such crap from Hyundai, but I don't know anyone who owns a newer model.
As an engineer, I see little value in removing any sensors. No matter how noisy or limited they are, they bring additional information which, if processed, could improve overall accuracy. This is commonly referred to as "sensor fusion". So this is probably a pure cost-saving move. While I understand how radar could be expensive, I am not sure the savings on much cheaper ultrasonic sensors are worth the loss of additional information about the environment.
As a Tesla Model 3 owner, I can attest that their automatic parking rarely works anyway. They should be adding more sensors to improve it, not removing them.
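For the unfamiliar, here is a minimal sketch of what sensor fusion buys you, using inverse-variance weighting, the textbook way to combine two independent noisy estimates of the same distance (the variances below are invented for illustration, not real camera or ultrasonic specs):

```python
# Minimal sensor-fusion sketch: inverse-variance weighting of two
# independent, noisy distance estimates. Noise figures are invented.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent Gaussian estimates of the same distance.

    The fused variance is always <= the smaller input variance, which
    is the formal sense in which a second sensor never hurts (if it is
    modelled at all).
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused_est = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused_est, fused_var

# Camera thinks the wall is 0.52 m away but is noisy up close;
# the ultrasonic sensor says 0.45 m with much lower variance.
est, var = fuse(0.52, 0.04, 0.45, 0.0025)
print(f"fused estimate: {est:.3f} m, variance: {var:.4f}")
# The fused estimate lands near the ultrasonic reading, and its
# variance (~0.0024) is lower than either sensor alone.
```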
I've worked on 3d vision systems before and have experimented using different types of sensors in multi-sensor setups, and I'd argue that overwhelmingly it's a software problem.
More sensors help, but past a point it's extremely marginal, plus you're throwing the majority of the raw data away anyway.
I do agree though - assuming processing power isn't a performance bottleneck here, more data is always better. But there's a reason humans only have two eyes and not three or four, for example. Two eyes are really all you need.
Completely off topic, but there's a fascinating list of animals with more than two eyes. For example the four eyed fish which uses one pair of eyes to see below water and one pair to see above water, when it's swimming along the surface.
> More sensors help, but past a point it's extremely marginal, plus you're throwing the majority of the raw data away anyway.
Unless other sensors break, then they're suddenly useful redundancy. In a safety critical product like this, that alone should preclude the removal of sensors.
> But there's a reason humans only have two eyes and not three or four, for example. Two eyes are really all you need.
You still get two, both for depth sensing and for redundancy; and some animals like spiders have 6-8 eyes.
You also have ears, a body that lets you turn your head, eyes that can move on their own, and a brain with a model of the world, including a model of how people think and how they would act.
So will your Tesla have eyes that can turn and look in all directions? It would look cool to have snail-like eyes.
“In the near future, once these features achieve performance parity to today’s vehicles”
Right... like how vision-only vehicles have parity on Autopilot speeds and following distance? Oh wait, they don't.
Vision-only vehicles are still limited to a max of 85mph and 2 car minimum following distance for autosteer while radar is 90mph and 1 car following distance.
Texas has 85mph highways. The two car following distance constantly leads to cars cutting in-between you and the car in front of you because the gap is so large.
edit:
Just to clarify, when it says "1 car length", the car adjusts the amount of space depending on speed. So it's not literally just one car length.
Dude if you are going 85-90 miles an hour and you aren't leaving AT LEAST 2 cars in between you, please stay off the road. You should be keeping 3 seconds of distance between you and the car in front of you. At 90 miles per hour that's a hell of a lot more than 2 car lengths.
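Run the numbers yourself (a ~15 ft average car length is assumed purely for illustration):

```python
# Back-of-the-envelope: how far is a 3-second gap at 90 mph?
# Assumes a ~15 ft average car length purely for illustration.

speed_mph = 90
gap_seconds = 3
car_length_ft = 15

speed_fps = speed_mph * 5280 / 3600   # 90 mph = 132 ft/s
gap_ft = speed_fps * gap_seconds      # 396 ft
print(f"{gap_ft:.0f} ft, i.e. about {gap_ft / car_length_ft:.0f} car lengths")
# -> 396 ft, about 26 car lengths. Nowhere near 1-2.
```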
op is posting about the people who cut him off. If I use adaptive cruise control at 2 or 3 (stopping distance) car lengths, then I'm just cut off every time, even though there is nowhere to go.
The end to this unsafe and traffic causing behavior is something I look forward to in more self driving cars. (Fewer drivers racing dangerously for "pride" points to end up at the same traffic light in 500')
> op is posting about the people who cut him off. If I use adaptive cruise control at 2 or 3 (stopping distance) car lengths, then I'm just cut off every time, even though there is nowhere to go.
Then your entire state, or at least the residents of it who drive on the freeway, need to be in prison. Or, at least, to have their licenses revoked.
I drive with a six-car gap and even that was a close call when someone had a collision about 3 cars ahead.
The good part is that if the car in front of you flashes the brake lights, that seems to trigger the braking. And some of the newer cars blink hazards automatically when braking hard.
My Subaru is also vision-only (EyeSight), and its cameras are (in my opinion) too close together to reliably detect a car ahead braking. It did not slow down quickly when approaching a stationary car facing the wrong way around; my foot went down before the car did anything.
All of these systems need more parallax, at least better than a human's, to do this well. But they do react to visual cues like brake lights and hazards. Though sometimes, one sees a nearby motorcycle as a car in the distance, simply due to the lack of parallax.
"more sensors good" is not always a good thing, when the sensors disagree with each other. Sometimes more sensors just mean more corner cases and braking at bridges throwing shadows on the road.
So just removing something feels wrong, when I don't see them improving the thing they're putting more eggs into.
I think this is where Tesla is not credible right now, when they say they're improving something in the car, but it is a thing I cannot see, check, confirm or test.
> "more sensors good" is not always a good thing, when the sensors disagree with each other.
When they disagree, one of them is wrong. Would you rather have just the wrong sensor or both the wrong one and the good one as inputs to decide what to do?
> Vision-only vehicles are still limited to a max of 85mph and 2 car minimum following distance for autosteer while radar is 90mph and 1 car following distance.
Is that common knowledge? What's the source for that? I'm curious.
Aha so we’re talking about a Tesla design spec constraint not some fundamental safety thing or computational limit or some other physical barrier or whatever. Interesting.
Just from a physics standpoint, you can't bring your car to a safe stop (or any kind of stop, really) in two car lengths at 85mph. Even if you take the most generous scenario possible, that the car in front of you will continue for another two car lengths (but at a slower speed than you), it's still pretty much impossible to bring your car to a complete stop in four car lengths.
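Putting rough numbers on it with the standard constant-deceleration model (0.8 g of braking and a 1-second reaction time are ballpark assumptions, not measured figures):

```python
# Kinematic stopping distance: d = v^2 / (2a), plus reaction distance.
# 0.8 g braking and a 1 s reaction time are ballpark assumptions.

MPS_PER_MPH = 0.44704
G = 9.81

v = 85 * MPS_PER_MPH          # ~38 m/s
a = 0.8 * G                   # ~7.85 m/s^2
reaction_s = 1.0

braking_m = v**2 / (2 * a)            # ~92 m
total_m = v * reaction_s + braking_m  # ~130 m
car_lengths = total_m / 4.5           # ~4.5 m per car length
print(f"braking alone: {braking_m:.0f} m; with reaction: {total_m:.0f} m "
      f"(~{car_lengths:.0f} car lengths)")
# -> around 29 car lengths, not 2 or 4.
```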
Because humans are not good at eye-balling distance, especially while moving. The only other point of reference is going to be the only other thing travelling on the roads at a similar speed... cars. Hence, the rough comparison of "car-lengths".
I thought I remembered being taught in driver's ed that you should leave a 1 car gap for every 10mph of speed. 1-2 cars at 85mph is clearly not enough distance to stop.
That is similar to what car-length adaptive cruise control does. In heavy, fast traffic, you won't get your 7-9 car lengths though, due to frequent cutting in and weaving around you.
I have had a model 3 for four years and the autopilot has steadily gotten worse.
In summer 2019 we went on a road trip that was 10 hours of total driving (each way), where I actually drove for maybe 30 mins total. It was flawless and downright magical.
Since then we often have the car slam on the brakes out of the blue; once it woke up my sleeping family and made both kids cry, and it has happened a few other times.
I have lost trust in the Autopilot system. It's that simple. And I LOVE my Model 3, the Supercharger network, and everything else.
I don’t know what’s going on over there but I am not remotely convinced that they have their act together with vision-only based autonomous driving.
I drove a new M3 in Germany last week and was blown away by how well the assisted driving features worked. Below 60 km/h it also allows you to never touch the steering wheel while on the highway in a traffic jam.
For my purposes it seemed superior to the current Tesla system and after watching some comparison videos, my conclusion seemed accurate.
Is Tesla planning on replacing the autopilot branch with FSD? Because right now autopilot is trash and it’s basically a shareware/free trial of FSD and I am not willing to upgrade based on my current experience with AP.
Yup 2022.24 should have a lot of the autopilot changes from the FSD branch. I haven’t tried it personally but reports are that phantom braking is reduced there too.
You've experienced many false positives (braking going off for no reason), which is not a surprise, because they're lower cost than false negatives (braking not going off when needed), so if you work in Tesla AI you're going to accept lots of them in exchange for a tiny reduction in false negatives.
But maybe 1 in 1000 people will get to experience the true positive (braking going off when needed) that would have otherwise been a false negative, and their life was saved as a result of the updates, unbeknown to them.
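To make that asymmetry concrete, a toy expected-cost calculation (all rates and costs invented, certainly not Tesla's real numbers):

```python
# Toy expected-cost model for threshold tuning. All numbers invented.
# If a missed obstacle (false negative) is costed 1000x higher than a
# phantom brake (false positive), the optimizer happily trades a big
# increase in FP rate for a small reduction in FN rate.

COST_FP = 1.0      # annoyance of one phantom braking event
COST_FN = 1000.0   # cost of failing to brake when needed

def expected_cost(fp_rate, fn_rate):
    return fp_rate * COST_FP + fn_rate * COST_FN

conservative = expected_cost(fp_rate=0.001, fn_rate=0.0010)   # rarely brakes
trigger_happy = expected_cost(fp_rate=0.050, fn_rate=0.0002)  # brakes a lot
print(conservative, trigger_happy)
# -> 1.001 vs 0.25: the trigger-happy tuning "wins" despite 50x more
#    phantom braking, which is roughly the experience owners report.
```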
> Since then we often have the car slam brakes out of the blue (once waking up my sleeping family and causing both kids to cry), and a few other times.
And you still drive this car today? In the same mode?
This would potentially work... if the cameras could see the places where the ultrasonic sensors were. But they can't! There are big blind spots near the bumpers. I guess they are planning to just have the car fill in the blind spots by remembering what it saw from other perspectives? But that only works when the car is moving and can't account for small obstacles that can move unpredictably, like animals or toddlers.
I'm sympathetic to the idea that it's possible to drive a car with just cameras, but the specific cameras that Tesla has aren't good enough. They need more coverage, better quality, and self-cleaning.
> Will vehicles equipped with ultrasonic sensors have their functionality removed?
> At this time, we do not plan to remove the functionality of ultrasonic sensors in our existing fleet.
I really really do not like the use of the phrase "our existing fleet" here. It heavily implies that Tesla has some level of ownership over the vehicles they sell post-purchase.
Technically, this is true for any product which has a pathway for software updates. Arguably this includes nearly all models of car on sale today, which can have their software updated — even if the process is more cumbersome.
The key difference is that a Tesla vehicle can conveniently perform comprehensive whole-of-car updates while it's sitting in your garage, whereas many other cars can only have some systems updated during a service visit.
Regardless, I don't see how software updates being convenient could serve as the defining criterion for a manufacturer having "ownership over the vehicles they sell".
I’m saying they couldn’t do it because it’s a safety related feature of a heavily safety-regulated product category. I don’t think any EULA could override that.
The reason why they decided to remove ultrasonics is that combining contradictory signals from cameras and ultrasonic is not straightforward. What should have the higher precedence? So they ultimately decided that cameras can do what ultrasonics can do, but even better.
They won't remove ultrasonics from the delivered cars, but they will probably disable them in coming months.
>The reason why they decided to remove ultrasonics is that combining contradictory signals from cameras and ultrasonic is not straightforward.
It's fairly well known that nearly any kind of sensor fusion "is not straightforward". If that's a fact that was only recently discovered by Tesla, causing a change in their lineup only now... well, I'll just say it speaks to Tesla's confidence being greater than their technological prowess.
I don't think that's the underlying issue here myself. I think this change is likely caused by an effort to get away from certain vendors and ship cars more rapidly.
>What should have the higher precedence?
again, this isn't a new problem and there are volumes of resources regarding that. It's a tough problem, but not new.
How could vision possibly replace the front bumper ultrasonics? Their primary function for me is showing proximity to objects at low speed, and many of those objects aren’t visible to the front camera because the hood is in the way.
The decision about what sensor to use is more easily made when one of the sensors provides no data, or clearly disturbed data. Like in case of fog, mud on a camera, pouring rain, etc.
My phone often signals me that I have to clean the lenses of my camera, so I'm sure it's possible to detect unreliable sensors. Sure, you have to decide on all sorts of tipping points, and in the end that is probably expensive to develop and the reason the sensors are dropped.
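For what it's worth, one classic heuristic for a dirty or fogged lens is the variance of the Laplacian, which collapses when high-frequency detail is smeared away. A generic OpenCV sketch (this is a textbook approach, not a claim about how Tesla or phones actually do it; the filename and threshold are hypothetical):

```python
# Classic blur/dirty-lens heuristic: variance of the Laplacian.
# A smeared or fogged lens kills high-frequency detail, so the
# variance drops. The threshold is scene-dependent and must be tuned;
# the value below is purely illustrative.

import cv2

def looks_degraded(frame_bgr, threshold=50.0):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness < threshold

frame = cv2.imread("camera_frame.jpg")  # hypothetical input frame
if looks_degraded(frame):
    print("Camera may be dirty or fogged; distrust its output.")
```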
If fog or rain is so intense that objects within a few feet of your car are obscured from vision, I'd suggest that maybe you shouldn't be operating the car at all, regardless of any assistance technologies it might possess.
My brother's Tesla signalled that the side cameras were faulty or needed cleaning when we drove down a country lane in the dark. This did not raise my confidence in the Autopilot system.
This sounds like a problem well suited for ML, which is basically the thing most of us expect Tesla to be best at. Just throw sensor data at cases and let the situations in which certain values should get precedence manifest in the weights and biases.
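As a rough illustration of what that would look like, a minimal learned-fusion sketch in PyTorch (dimensions and architecture are made up; this has nothing to do with Tesla's actual stack):

```python
# Minimal learned sensor fusion: concatenate per-sensor features and
# let the weights learn which source to trust in which situation.
# Purely illustrative; all dimensions are invented.

import torch
import torch.nn as nn

class FusionNet(nn.Module):
    def __init__(self, cam_dim=128, ultra_dim=12):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(cam_dim + ultra_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # e.g. distance to nearest obstacle
        )

    def forward(self, cam_feat, ultra_feat):
        return self.head(torch.cat([cam_feat, ultra_feat], dim=-1))

net = FusionNet()
cam = torch.randn(8, 128)     # stand-in for a camera embedding
ultra = torch.randn(8, 12)    # stand-in for 12 ultrasonic readings
print(net(cam, ultra).shape)  # torch.Size([8, 1])
```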
I expect Waymo to be the best at it. Waymo has a bunch of sensors on their cars to provide accurate information, because they know that cameras aren't always reliable. Tesla may not be right here.
An autonomous vehicle is operating on a road network designed for humans with eyes. If we are aiming for the kind of “common sense” required for parity with a human driver it seems intuitive that the vehicle would have sensory inputs in common.
Cars do not use ultrasonic sensors for autonomy. They are for low-speed situations (such as parking) and are designed to give more information than the driver is otherwise capable of gathering.
While I generally agree that to get parity you would use common sensory input, I would hope the goal is to go beyond human level and navigate in, e.g., fog or heavy rain: things radar would help with. Ultrasonics probably don't buy them much at this point, as long as they have good object permanence. Think about parking in a garage: if you can map objects to spatial memory adequately, then maybe ultrasonics aren't necessary. But if an object is out of sight, out of mind, you probably need the ultrasonics to help not hit it.
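To make the object-permanence idea concrete, a toy sketch of obstacles stored in a fixed world frame so they survive leaving the cameras' view (a hypothetical structure, just to illustrate the concept; real systems use probabilistic occupancy grids with decay):

```python
# Toy "object permanence": obstacles are stored in world coordinates,
# so they survive leaving the camera's field of view.

class ObstacleMemory:
    def __init__(self):
        self.obstacles = {}  # id -> (x, y) in a fixed world frame

    def observe(self, obstacle_id, world_xy):
        self.obstacles[obstacle_id] = world_xy

    def nearby(self, ego_xy, radius=1.0):
        ex, ey = ego_xy
        return [
            (oid, (ox, oy))
            for oid, (ox, oy) in self.obstacles.items()
            if (ox - ex) ** 2 + (oy - ey) ** 2 < radius ** 2
        ]

mem = ObstacleMemory()
mem.observe("wheel_stop", (2.0, 0.1))  # seen before the hood occludes it
# ...car creeps forward; cameras can no longer see the wheel stop...
print(mem.nearby(ego_xy=(1.5, 0.0)))   # still warns about it
```

The obvious limitation, as noted elsewhere in the thread, is that memory only works for static objects; a toddler or animal that wanders into the blind spot was never observed in the first place.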
Does anyone else think they're running out of compute power on their DL computer and trying to eliminate sensor fusion in order to preserve precious cycles for vision processing?
I've long wondered if TSLA screwed up by choosing the hardware before fully solving the problem, leaving millions of vehicles incapable of performing safe FSD without HW upgrades.
It does seem rather underpowered if you compare it with the requirements of state-of-the-art deep learning models in other areas, though that's not an apples-and-oranges comparison.
This will absolutely not work. I have FSD and it messes up the distance of large, well-defined objects; think about how this will work with a concrete wall, or curbs that it can probably no longer see.
When I pull into my garage sometimes it recognizes my shelf as a car ON FSD....
> The output of the “occupancy network” isn’t displayed to the user right now.
I may be misunderstanding, or experiencing something else, but I think it is shown if you're on FSD. It shows up as grey (elevated) regions around the visuals, not the standard "not a road" grey. The gates for my community began showing up in this update with it, but they look like about 2-3 ft tall triangles, not a gate... though still recognized, I guess?
As stated by Tesla just a few days ago, the occupancy network is not currently shown in the customer-facing visualisation. (Some unrecognised objects are represented by generic placeholders. This is derived from, but not the output of, the occupancy network.)
This video links to the relevant citation, and if you continue watching for around ten seconds you can see examples of the occupancy network visualised:
I don't think it will ever be shown as "raw data" like that though. They will most likely only show the current street view, with cars and unknown objects that are in the road.
Absolutely. The point is though that the visualisation isn’t the occupancy network. And the occupancy network is a lot closer to a substitute for ultrasonics than anything shown on car screens. How much closer is hard to guess and likely unknown by anyone commenting in this thread. But I dare say it’s probably closer than some critics assume.
> The point is though that the visualisation isn’t the occupancy network. And the occupancy network is a lot closer to a substitute for ultrasonics than anything shown on car screens.
What is it then? Because my car is detecting the occupancy of something that isn't a car, which it previously did not show visuals for, and the release notes don't mention anything that would enable this visualization aside from the occupancy network[0].
> Added control for arbitrary low-speed moving volumes from Occupancy Network. This also enables finer control for more precise object shapes that cannot be easily represented by a cuboid primitive. This required predicting velocity at every 3D voxel. We may now control for slow-moving UFOs.
> Upgraded Occupancy Network to use video instead of images from single time step. This temporal context allows the network to be robust to temporary occlusions and enables prediction of occupancy flow. Also, improved ground truth with semantics-driven outlier rejection, hard example mining, and increasing the dataset size by 2.4x.
Additionally, the wording they use when presenting the research is vague: they say it's not currently shown, but proceed to show raw visual data. What I take this to mean is that they won't ever show the raw data, but will build visuals from it. They even detail this by mapping occupied space to "the nearest available item"[2].
FWIW, _if_ this is the occupancy network in action it's lacking a lot. It can't detect thin (~4" squared, floor to ceiling) poles in my garage, but my ultrasonic sensors can. (I know this because my car showed a path going straight into one.) I also can't find anyone discussing the visuals that I am seeing on my car either.
Additionally, the cameras on Teslas have a very big blind spot in the front of the car, exactly where the ultrasonics were. [1]
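To unpack the release-note jargon quoted above: "predicting velocity at every 3D voxel" just means a dense occupancy grid with a flow vector per cell. A generic numpy illustration (arbitrary sizes and resolution; not Tesla's actual representation):

```python
# Generic occupancy-with-flow representation: occupancy probability
# plus a velocity vector per 3D voxel. Sizes are arbitrary.

import numpy as np

X, Y, Z = 200, 200, 16                # voxels around the ego vehicle
occupancy = np.zeros((X, Y, Z))       # P(occupied) per voxel
flow = np.zeros((X, Y, Z, 3))         # (vx, vy, vz) per voxel

# A slow-moving blob ahead of the car: occupied, drifting at 0.5 m/s.
occupancy[120:125, 98:102, 0:4] = 0.9
flow[120:125, 98:102, 0:4] = [0.5, 0.0, 0.0]

moving = occupancy > 0.5
print(f"{moving.sum()} occupied voxels, "
      f"mean speed {np.linalg.norm(flow[moving], axis=-1).mean():.2f} m/s")
```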
My rear view camera regularly gets covered with dirt and grime after a highway run during the monsoons. The car has both camera and radar so that I can have reliable assistance while reversing. With this experience in mind I fail to see how Tesla’s camera-only move is reliable for the use cases that they have mentioned.
Tesla cameras have sizeable overlap, so if one is obstructed it tries to extract the information from the others. If it can't, then it tells you to clean the cameras. It's not perfect, but it's better for them (Tesla) and hopefully better long term for all Tesla owners.
They have some overlap, but not enough to be reliable. My B-pillar camera is regularly blinded by sunlight and by their inefficient moisture sealing (called out in their manual instead of being fixed). That causes it to fail to see most things to our side.
They are reliable enough to use for FSD: https://youtu.be/hx7BXih7zx8 so I think they're reliable enough for park assist, Summon, etc. Yeah, you're def right about the moisture sealing... it's been hit or miss on production tolerances.
I disagree. Regular Navigate on Autopilot and the FSD beta are still very flaky outside of pristine conditions for me.
At this point, I'm expecting either that Tesla will go ahead with a camera upgrade via an expensive retrofit in the future, or that FSD will never fully materialize.
This is something that comes to mind when I see "designed in California" for a product that can't handle snow. Guess the same thing applies to monsoons!
My VW hides the rear view camera under the emblem, so it isn't exposed to the elements and only flips out when in use. It's brilliant, at least as long as the motor that flips the emblem holds up.
When parking in tight spaces, getting exact-distance feedback from the ultrasonic sensors is pretty handy. I can't see how vision will replace that; I don't think any of the cameras can see that close to the bumper.
I read this Somewhere(tm), if anyone can confirm it I'd be grateful:
The issue with Tesla's camera only neural network thing is that it needs a TON of cycles to determine the distance of every point in the image.
With LIDAR, for example, you get the direction and distance in the same datapoint without any extra calculation.
And the current generation of Tesla cars is operating pretty much at the limit of its compute power, and will run out of resources unless Tesla manages some crazy optimisations in their detection algorithms/neural network training.
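For context on why LIDAR gets direction-plus-distance "for free": camera depth comes from disparity via Z = f·B/d, and *finding* d for every pixel is the expensive matching step that LIDAR skips entirely. A sketch with invented camera parameters:

```python
# Stereo depth: Z = f * B / d. LIDAR measures Z directly per return;
# a camera rig must first *find* the disparity d for every pixel
# (the expensive matching step), then apply this one-liner.
# Focal length and baseline below are invented for illustration.

focal_px = 1000.0      # focal length, in pixels
baseline_m = 0.12      # distance between the two cameras

def depth_from_disparity(disparity_px):
    return focal_px * baseline_m / disparity_px

for d in (60.0, 12.0, 2.0):
    print(f"disparity {d:>4} px -> depth {depth_from_disparity(d):6.1f} m")
# Note the sensitivity at small disparities: at 2 px, a +/-0.5 px
# matching error swings the estimate between 48 m and 80 m.
```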
As a Tesla customer I’m so used to being constantly gaslighted that it’s a pleasant surprise when they’re only removing features from new cars rather than my existing one.
When leadership sees numbers going in the wrong direction, they squeeze everywhere they can. Often this means making poor long term or poor engineering decisions in hopes they can squeeze some short term gain.
What I see, from decades of being an engineer, is a company making a bad decision probably to cut costs. Why would Tesla need to cut costs? Something isn't working out correctly.
There's no evidence that Tesla has a unit cost problem. If anything the opposite is true. Tesla has significantly higher automotive margins than all mainstream competitors, despite producing 100% EVs with large expensive lithium battery packs. While their competitors mostly produce mature, highly cost optimised ICE vehicles.
The result is that Tesla has insane amounts of cash on hand and record-breaking profits, despite rapidly constructing/expanding three factories on three continents.
It's not the unit cost that's the problem. The unit cost decrease is a patch that can buy time to address e.g. an engineering cost problem, or a "we're going to need to buy bigger chips soon to get this to work" problem.
Given that my Model 3 seems to think (visualized on the screen) that my garage door is a semi-trailer pointing towards me, I'm inclined not to believe that they can do all of this with cameras only.
This also means these sensors will be disabled at some point in older cars too. Interesting move, though I am not qualified to comment on its technical merits.
I am not sure if this is smart technically but from an environmental perspective I prefer cars to have passive sensors. Blasting out ultrasound and Lidar is pollution that can impact animals.
Personally I think self driving cars will only work once roads are built for them with specific markers and self driving cars communicate with each other.
I'm not entirely sure this is a good idea. But their removal of radar seems to have gone okay, so perhaps they've done their calibration tests and know this can be done.
It is crazy to me that these cars are always recording and streaming the result to Tesla. Sure, that data might be good for training self-driving cars, but also a lotttt of other things.
They are absolutely not streaming the data to Tesla in real-time.
They are storing some data on-board to retrieve later which makes it convenient for them to deflect blame in crashes.
Yeah - there's an interior, driver-facing camera (at least in the Model Y), which I suspect has no real purpose other than to send video to Tesla after a crash, so they can say "look, the driver wasn't staring at the road, not our fault Autopilot crashed them".
I would cover it up, but I'm afraid that it will look incriminating if I ever am in an accident as to why I did it.
The camera ensures you are paying attention while using Autopilot or FSD. For example, if you pick up your phone, it will alert you to pay attention and eventually shut off.
I own a 2019 model 3 and the last I heard was that they couldn't use those cameras for monitoring a driver's attention because they were intended for monitoring the cabin when you "rented out" your FSD vehicle, so they aren't aimed at the driver. Have they changed the orientation since then?
Geohot was right. Tweet re: LIDAR, but still: "When this delusion ends and all these companies fold, it won't be LIDAR vs. Or ride sharing vs. None of this matters. All that matters is building an intelligent machine. Self driving is 100% a software problem. Given software written by God, a smartphone can drive a car."
Musk has said similar things in the past; that cars should be able to do everything human drivers can do with only vision, because that's all human drivers have (discounting the far more minor sensory inputs like sound and touch).
I find that take fascinating. It's, in effect, saying "well, computer hardware and the human brain are similar enough; the human brain just got an evolutionary head start on its software, so once we solve that we're golden". Not only is there no evidence to support this conclusion; it's trivial to recognize that the human brain and computers operate differently at a fundamental, logical level. The brain has no transistors. It doesn't care about XOR or solid-state memory. There is literally no part of the human brain that even remotely resembles any internal part of a computer.
Why is it so accepted that this problem can be solved in software? Or, maybe: it can be solved, but it will take so much computing power that any computer which could solve it in real time couldn't fit in or be powered by a car? Virtualizing Linux on my Windows PC incurs a double-digit percentage performance penalty, and that's on the same ISA; what is the human brain's ISA? AI is an emulation of human behavior; what is its performance penalty? Are we just surfing Moore's Law hoping to outrun it?
I just find the hubris of his statement baffling; that we, meaning Our Scientific Community, literally does not completely understand why or how acetaminophen works to alleviate pain in the human brain; yet they and Musk are wholly confident that given the same inputs their software can emulate the human brain's outputs.
I'll commit this to the public record, and feel free to revisit in ten years; if a synthetic brain system ever hits performance parity with the human brain in a task like driving or consciousness, it will be running on hardware that looks nothing like what I'm typing on. I suspect that even calling it "hardware", with emphasis on the word "hard", will seem antiquated. Maybe self driving is 100% a software problem; but the brain doesn't really differentiate.
It would be interesting if the thing wrong with the "eyes are just cameras" logic was the subtle differences. Like how retinas are not flat, but concave or the way our eyes continuously reposition to fill in gaps in our vision. Such differences seem incidental or compensatory, but it's not impossible that our brain is able to make better distance/position judgements because of them.
The other missing piece is just how good the brain is at object recognition and persistence. A huge advantage when you have a 2d image and you need to gauge distance (at least partially) by size or when you need to avoid a moving obstacle by predicting its trajectory.
Your eyes 100% do several other things to give depth cues to your brain. That's why you can cover one eye and still judge depth pretty well. In fact, maybe two eyeballs really are all you need to do safe driving, but 2 CCD image sensors are NOT an adequate replacement for all the other things eyeballs give you.
Your eyes know the distance of something because they know how they had to move to bring the object into focus. Your eyes also have to be at slightly different angles to look at a distant object, which also is a depth cue, as well as all the temporal depth info you get simply as a byproduct of having an insanely powerful and robust object classification and world simulation engine in your skull.
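That vergence cue is plain triangulation: given the distance between your eyes and the angle at which they converge, depth falls out directly. A small sketch (the ~63 mm interpupillary distance is a typical adult value, used here only for illustration):

```python
# Depth from vergence: with interpupillary distance IPD and a total
# convergence angle theta, depth is roughly IPD / (2 * tan(theta / 2)).
# 63 mm is a typical adult IPD, used purely for illustration.

import math

IPD_M = 0.063

def depth_from_vergence(theta_rad):
    return IPD_M / (2 * math.tan(theta_rad / 2))

for theta_deg in (3.6, 1.8, 0.36):
    z = depth_from_vergence(math.radians(theta_deg))
    print(f"vergence {theta_deg:>4} deg -> ~{z:5.1f} m")
# The angle shrinks rapidly with distance, which is why vergence is a
# strong cue up close and nearly useless beyond ~10 m.
```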
I remember seeing some studies suggesting that vision training of artificial neural networks forms "similar" structures to the human brain. I don't think the brain is that magical compared to modern DL.
> it will be running on hardware that looks nothing like what I'm typing on.
Not to mention the fact that humans with just sight and sound for sensory input are pretty crappy drivers - just look at accident statistics. Don't we want our AI drivers to be better than humans?