NTSB: Tesla Model S in Crash Couldn't Have Been Using Autopilot (caranddriver.com)
186 points by gok on May 10, 2021 | 199 comments



No matter the cause of this accident or previous ones, Tesla's response highlights why I'll never buy one of their cars. When an accident occurs, the response by Tesla should be nothing more than "we will provide any and all data we have concerning this incident to the appropriate authorities when we receive a valid request for that information."

Musk tweeting about whether or not the driver purchased certain features is insane. He shouldn't have access to that data, let alone be allowed to publicize the information.


I’m not so sure.

I recognise the risk — these days I assume that entirely innocuous facts about me today can become socially unacceptable in the future, having witnessed this happen to multiple people in my life already.

I also recognise that if Tesla doesn’t get out in front of every single incident, it may set back the replacement of human drivers with safer AI drivers.

I also also recognise that Musk is wildly overpromising on the self-driving tech, and really only trust him as a rocket scientist and salesman (whether the thing he’s selling is cars or visions of the future), not on digital privacy.

I don’t really trust any famous person for privacy, because they necessarily don’t have even close to as much of it as a normal person.


>I also recognise that if Tesla doesn’t get out in front of every single incident, it may set back the replacement of human drivers with safer AI drivers.

Tesla doesn't need to get out in front of every incident. Musk feels the need to because of the outlandish claims he continues to make about how close Tesla is to having autonomous vehicles. This is something Tesla's board should have reined him in on long ago.

>I don’t really trust any famous person for privacy, because they necessarily don’t have even close to as much of it as a normal person.

I agree that famous people have less privacy, and there's a lot of unfairness around that. But even if Musk hadn't chosen to become a public spectacle, that doesn't give him the right to treat others the same way.


In this case he felt the need due to outlandish and incorrect claims about the nature of the accident.


> But even if Musk hadn't chosen to become a public spectacle, that doesn't give him the right to treat others the same way.

Of course.

What I’m saying is, I don’t expect the famous to be able to empathise with the rest of us, not that it’s good that they can’t.


> Tesla doesn't need to get out in front of every incident.

Tesla has been attacked in the past so maybe Elon takes such issues personally.


I'm not convinced that 'autonomous' driving will ever be better than attentive, sober drivers. Suppose this is a task that, short of overhauling all our road infrastructure, ultimately requires AGI? I think we generally have way too much confidence that cars will ever drive better than people, at least on the current road system. I am not saying that they definitely won't, only that I'm not sure we should take the attitude that anything that gets in the way of autonomous driving is sacrificing human lives. We don't know that.

I also worry that mixing autonomous systems with people will result in more accidents, both in the case of drivers nodding off instead of supervising the car, and in the case of other drivers being surprised by the autonomous car's behavior. On top of this, it would not be surprising if autonomous car accidents were more psychologically devastating - they would likely commit different categories of error than human drivers.

Conversely, if the tech industry wanted to save lives on the road, easily the most impactful thing they could do is shut off any phone screen moving faster than five miles an hour. This would inconvenience passengers, but we did after all survive for many years without phones in the car.


> Suppose this is a task that, short of overhauling all our road infrastructure, ultimately requires AGI?

I would be surprised if level 5 self driving requires an AI smarter than a mouse or an eagle — freeform environment, arbitrary weather, hostile others, camouflage(!), long range navigation.

The rest of your points I broadly agree with though.


How many birds have crashed into your windows? At my parents' house, this happens a couple of times a year.


One, over my lifetime.

I don’t think this is a great comparison. Firstly, sheets of glass are not a common road obstruction, and even if they were, cars have the added advantage of more sensor types (which doesn’t require much more brain; it’s equivalent to just another ‘colour’). Secondly, I’ve also been hit by a car driven by a human.

The driver had stopped at a give-way (major-minor junction) [0], then failed to look in my direction and pulled out into me as I was turning into that road.

[0] right where this unrelated white van is in this Google street view photo: https://goo.gl/maps/Xe87aruFbbUazXhp7


> at least on the current road system

I came to the conclusion years ago that we're going to need RFID tags on everything (or something similar) to make FSD work safely enough without AGI. We'll probably also need dedicated lanes in some places. Nothing I have read or seen yet has given me any reason to think differently.

Edit: The common counter-argument I see to this is that tags could be altered maliciously. My response is that we've already been building systems which scan for and filter out malicious changes for a long time, and we keep getting better at it. It seems to me like this kind of system would be far easier to design & update than any FSD software I know of.


RFID is no solution at all: how do you ensure a toddler running into the road is properly RFID chipped?

If you need every possible obstacle to be tagged with a locator beacon, you’ve essentially just reinvented 1970s autonomous subway or PRT technologies that require the track to be completely fenced off from surprise animals, humans, or debris ending up on the track.


The idea is not that it would be RFID only, but to reduce the number of non-tagged objects & people so false positives happen less often. Obviously you still need a way to deal with random, unexpected obstacles, but it's a much smaller problem space when the surrounding environment is mostly properly tagged.


There's a very real risk that half-assed "AI" driving just leads to worse human drivers who stop paying enough attention when driving normally. Sensors are definitely better than humans at reacting to collisions, though.


Hah, I thought you meant worse drivers outside of the Tesla. I guess if I wanted to be an ass, I could cut off most Teslas because they have the auto-brake feature. (Well, most modern cars have it.)


You can, but if you cause an accident, you now have 8 cameras recording your every move. I bet I know how that goes down in court.


You know who’s making real progress on AI driving? Waze is. I don’t want to be too dismissive of EV progress thanks to Tesla but I feel like there’s no contest about who is pushing forward AI driving beyond the status quo.

And Waze doesn’t need a professional troll to go around garnering media support 24/7.


* Waymo not Waze

* Operates in a tiny geographically fenced area

* Despite success, has not expanded to any new areas or rolled out any further.

* As a consumer, I have no access to buy Waymo, but I can buy at least the current implementation of Tesla Autopilot.


Waymo believes that the roads need to be fully 3D mapped, with crosswalks, street lights, and signs, before their car can autopilot safely on them. More mapping will happen if/when the AI autopilot is ready.


Tesla Autopilot is not a comparable product though! It operates nowhere near Waymo's capacity.

My understanding is that Tesla is extremely behind on these fronts compared to Waymo, and that gap keeps widening every day. I know who I would bet on.


But they aren't trying to solve the same problem.

I believe Waymo works better in a pre-scanned area, where there are no construction sites or major changes, but in a general context you can at least attempt to use Autopilot, which you can't with Waymo. Thus I would not say that Waymo is miles better.


About Waymo: I used to think they were a leader in this space, partly based on their published safety statistics, but I am no longer so sure.

I had the great fortune of living in "Old Palo Alto" (think "mega rich tech CEO paradise") for a year while my wife was at Stanford. This is a grid of low-traffic, low-speed, sidewalked residential streets, ~10 minutes on all sides from the nearest highway, with typical mid-Peninsula weather (in other words, essentially perfect driving conditions).

I would frequently see Waymo vehicles going all through the neighborhood at 6-9 AM on Sunday mornings when I was out on long runs. They would often be essentially the only cars I saw on the road.

If there is a better way to pad your urban safety statistics, I do not know of one.

Old Palo Alto on Google Maps, for the curious/unfamiliar: https://goo.gl/maps/cAqifusdRL647YgWA


Do you mean Waymo?


Yes


No way.

I have my mom texting me that I shouldn't drive my Tesla because of this, and that I should sell it and buy something else.

How many of those people will read the follow up paper from the NTSB?

This is literally the definition of a smear. Shout something bad about a successful X, and by the time the actual truth comes out, the damage is already done and people have moved on.


This is a problem created entirely by Musk. He repeatedly makes claims about Tesla's technology that even their engineers say don't match reality [1]. No other car company gets the scrutiny Tesla does because no other CEO persists in making outlandish claims about the capabilities of their vehicles.

Tesla needs Musk to do two things: 1) Stop wildly over-promising Tesla's FSD capabilities, and 2) stop publicly posting rebuttals to any news about accidents involving Teslas. Combine this with recreating a PR team for Tesla who can work to push out real information about what the cars can do plus respond appropriately when asked to comment on any accidents, and the public and media will move on before you know it.

[1] https://www.cnbc.com/2021/05/07/tesla-engineer-to-california...


You completely missed his point.

You seem to complain about it being based more on a PR angle than on facts and truth, but the whole point of what the parent is saying is that this is a game Musk himself is pushing.


No, I did not.

When the truth is the PR angle, it should be pushed.

A lie shouldn't be made up to advance PR, but if someone is rebutting an actual lie? Anyone speaking up with the truth has my 1000% backing, even if I do not agree with their position.

I am not willing to join the post-truth era of American politics.


> Musk tweeting about whether or not the driver purchased certain features is insane. He shouldn't have access to that data, let alone be allowed to publicize the information.

I agree with you up to this phrase. A company knows exactly what you purchased from them, so knowing what a certain customer bought is just a call away, and not only for the CEO.

Regarding publicizing whether you bought a certain feature or not: withholding that information when the media, and even the police, were blaming the non-existent feature for the accident is more than you could ask of most of us.

Choosing between risking a fine for divulging purchase data, or a costly and unfair reputation hit while the police investigate? The decision seems obvious.


The only reason Musk immediately starts defending Tesla is because the media immediately blames Tesla when anything goes wrong.


It is international news any time a Tesla crashes. This is not the case for any other manufacturer.


If that were true, he should just shut up and wait for the NTSB results to speak for themselves.


The media does not care about the accident itself (nor the NTSB findings), so the results wouldn't make headlines. Let's see how many newspapers will retract their fake news about this particular accident...

What they care about is all the rage they can spark from anything seemingly bad that can be associated with Tesla and/or Musk.

I'm sure they would change their coverage if Tesla were paying billions to them through advertising. But they don't, so here we go.


Honestly I'm not sure how that follows from my comment at all.


The story was widely reported as a driverless Tesla crash. In fact, there are no driverless Teslas. Furthermore the Autopilot driver assistance feature was not even used and could not have been used. It's too bad you will never buy one of their cars because of Musk trying to correct this story, because their cars are quite good.


I agree, but that won't stop me from considering one when my next car purchase comes up. Overall they are much safer than your regular fossil-fuel vehicle.


If you actually were a potential Tesla buyer before this, you'd certainly be in the minority if this is what changed your mind. Most people see the transparency as a good thing, especially people with concerns about whether their own car might kill them. By getting in front of the story, he's doing his job. Correcting a false and derogatory narrative that threatens your differentiating feature in an extremely competitive market is absolutely the most important thing he should be doing.


I am totally with you and this is the ideal situation in a perfect world.

However, prior to any tweets by Musk, the media had already claimed that autopilot was (likely) on, as there was no person in the driver's seat.

Is it possible that Tesla's response would have been different if no such claims had been made initially?


It feels a bit odd for them to say autopilot could not have been used in that area. "In a reconstruction using a similar Tesla, the agency found that Autosteer was not usable on that part of the road, meaning the Autopilot system couldn't have worked." Who's to say that Tesla didn't update something between the crash and when the NTSB tested autopilot in that area? I don't think they did do that, but are there any safeguards in place to detect that happening?

It's one thing to say that a car cannot do something because of a physical issue with the car, vs. some software that can be updated or have bugs that haven't been caught yet.


This point feels a little strained. There is a question: "Was autopilot in use?", and an experiment was performed to answer that question, which produced an informative result, which was reported. You seem to be attacking the linguistic structure of the statement by interpreting "couldn't" to mean some kind of existentially inviolate truth and not just... the result of the experiment.

I mean, come on. This is what people were saying within hours after the crash: autopilot as shipped simply doesn't have behavior consistent with this accident. It won't command the accelerations that would have been required on that tiny street and (as measured by the NTSB) won't even engage in that particular environment.


My car has never allowed AP to be engaged on roads without lines painted on them. There was no update that took that functionality away.


Yes this is true.

Basically every Tesla owner that saw the story knew right away that it was being sensationalized.

I was very surprised to see the LA Times jump on it when they certainly have numerous employees who drive these vehicles.


It is sensationalized, but it could have been a bug rather than designed behavior: a software defect, or a rare combination of hardware or environmental conditions.


Great, so the bug would need to be proven. A newspaper can't decide what happened the day of the crash with no evidence.


> My car has never allowed

That is an anecdotal result

> There was no update that took that functionality away

Maybe not explicitly, but we all know how software works


> There is a question: "Was autopilot in use?", and an experiment was performed to answer that question

Why an experiment? Isn't there a Black Box, like on aeroplanes, with logs to definitively answer the question?


From the article:

The report said the car's lithium-ion battery case was damaged, and in the ensuing fire the infotainment system's onboard storage device was destroyed.


The very next sentence of the article:

However, the "restraint control module," which stores data such as whether seatbelts were in use, how fast the vehicle was going, acceleration information, and airbag deployment, although damaged by fire was recovered and turned over to the NTSB's laboratory to evaluate.


The point is that Tesla can do over the air updates, which means (in theory!) that the experiment could be gamed because it would use a different version of the software.


My wife's truck has a lane guidance system. It will gladly keep you in a well-marked lane. When there are no lines, or bad lines, it makes a loud ding and shows a big orange cancellation message on the dash display. This is a current-year F150. I can only imagine Tesla has at least as good a system as Ford. As for updates, the infotainment system has a record of the last check-in and the last update, as well as the current system version, similar to your phone or computer. With the number of Tesla hackers out there, it seems nearly impossible for the conspiracy you're suggesting to be a reality.


>it seems nearly impossible for the conspiracy you're suggesting to be a reality.

What conspiracy? majormunky is saying Tesla might push out updates every so often, and that might affect whether autopilot can be used, and the investigators might not have realized that.


> Who's to say that Tesla didn't update something between the crash and when the NTSB tested autopilot in that area? I don't think they did do that, but are there any safeguards in place to detect that happening?

That one. Pushing an update for one very specific location with no visible lane markers (implied by the article) to disable the lane guidance system seems very much like a stretch. "I don't think they would, but are there safeguards" implies the author of the comment is describing intent. I don't think it's a stretch to say that's a statement of conspiracy.


I guess you're right. But the same basic thing could happen without a conspiracy.


On the other hand wouldn’t Tesla be irresponsible if it didn’t manually review sections of road involved in a crash and perhaps override the settings if automated systems had made bad calls?

Can you imagine a second wreck on that same stretch of road and the accusations against Tesla for not doing the manual review envisioned above?

Have Tesla said definitively that they cannot override capabilities without updating the version number?


I think manual review might actually be the worst way to solve this problem, because you're relying on human labor to fix every single hiccup, which defeats the point of an automated system anyway. I'm sure they run the route through a bunch of simulations to tune the ML model, but I seriously doubt they would go as low-level as using telemetry to disable certain features by hand. The best solution would be updating the model so that it makes the right calls on that type of road, so that when it encounters a similar road, you don't need to wait for another fatal accident before kicking the review process into gear and making the necessary changes again.

tl;dr it doesn't scale and would be a logistical nightmare to maintain individual configurations for every road the system struggles on


Of course it would not solve the issue in a generic way. To me, the reasonable fix would be to immediately disable the feature in the specific area manually and then retrain the model or whatever it is they do.


The article says that it couldn't have been on autopilot because autopilot requires lane markings to be activated, and that area has no lane markings. It has nothing to do with software versions.


Is there any possibility for bugs caused by errors in the software or hardware?


Not a statistically meaningful possibility. If someone suffocates, you don’t assume all the oxygen in the room just happened to randomly end up away from their face.


Do you have a source for those statistics? I have seen bugs in my Tesla but never had the oxygen issue happen. This is just anecdotal of course, but I suspect your comparison is off.


Sure, but the existence of individual bugs is very different from picking a very specific bug.

It’s the same as assuming that because someone regularly wins the lottery, a specific ticket must have good odds. The space of all possible bugs is so vast that if the odds of every possible bug were as high as a lottery ticket’s, the software would be unusable. Some bugs are of course more likely than others, but assuming this one is more likely than average requires justification.


Elon also said that the owner didn’t even purchase full self-driving capability.


> Elon also said

A man who uses his celebrity status to pump cryptocurrencies in order to enrich himself and the companies he owns has no moral standards and no credibility.


Even if he was pumping crypto, that doesn't mean he has no morals or that he would lie about technical details, particularly where he can be easily found out.


To be clear, are you referring to the guy at the company selling cars, who claims that the current models sold today will be fully autonomous with only software patches?


Hah, the more evil version of the billionaire dodgy car salesman would have a database script to modify the purchase order to retroactively remove that item.

"Self-drive? Our records say he never bought it!".


Might be worth a read: https://en.wikipedia.org/wiki/Ad_hominem

Doesn't matter what you think of the character, it has nothing to do with this instance.


It has everything to do with the character: since he provides no evidence for what he says, we have to trust him. Therefore, the first question we must ask is: is this person trustworthy?


Autopilot also includes the less-capable levels, not just FSD. For example, Enhanced Autopilot uses Autosteer and TACC which are both mentioned in the article.


They used the same version of software.


Local software != map data, necessarily.


Tesla versions its map data; if they did their job, they used the same version for both.

Also, I don't think this feature actually uses map data.


It was reported that the versions are the same.


TL;DR let’s ignore the regulatory agency report, that confirms what literally everybody was saying, they are also being fooled. Tesla evil hurr durr.


Also, just because the driver was in the driver's seat at the time of being on camera doesn't preclude some sort of stupid behavior like leaving the driver's seat while on "autopilot".


All facts aside, people are still required to be in control of their cars and need to be legally able to intervene.

"But I used autopilot" is probably the most used excuse to avert fault away from the driver. Especially in the most broken insurance system on the planet where an accident like this leads to such a change in life that the driver is scared of the outcome.


My understanding is that whether autopilot is in use or not has no bearing on your insurance. As far as the insurance company is concerned, it's an accident while you're responsible for the operation of the car.


I’ve heard that the presence of Autopilot is actually viewed negatively by insurers.

This goes against intuition (wouldn’t a machine that behaves very reliably be better than a human driver?), but my dad used to say that the insurance company’s calculation was only ever about how likely they thought they could win if they sued the other party. I don’t know if that’s true, but something rings true about it.


From an actuarial perspective, all forms of driver assistance contain the potential for moral hazard.


My insurance rates are the "same" as a similarly priced Audi or such.

As Teslas are safer over time, one would expect rates to gradually go down. Some repairs are a bit more expensive, so it could be a bit of a balancing act.


> As Teslas are safer over time, one would expect rates to gradually go down.

In Germany it's currently the opposite, due to the overly high number of Teslas that are declared a total loss because the auditors who work for insurers usually don't know that their frames can be partially replaced.

Additionally, at that point there's in practice no way to get a Tesla repaired, but Tesla's software lockout policy is a different story.


At the same time, “he/she used autopilot while it is only driver assist” is a similarly overused excuse made by Tesla all around. If so many people do so, maybe they did buy the car anticipating that feature, which makes it at least misleading advertising.


The ironic result of this could be that insuring an Autopilot car gets more expensive than insuring a normal car, due to the high cost of investigating an accident after it happens.


Also, they shouldn't need to reconstruct the issue; there should be a black box recording all the events.


According to the article, the main storage (aka the black box) was destroyed in the fire. They are trying to pull data from a secondary controller that was less damaged.


Looking forward to the retractions from all those publications that strongly implied or outright claimed AP was involved.


Can you provide an example of a publication that "strongly implied or outright claimed" on their own that autopilot was involved?

At least in the mainstream press, they seem to have reported what the police were saying at the time, and indicated that they were quoting the police. For example, from the NYT [1]:

> Two men were killed in Texas after a Tesla they were in crashed on Saturday and caught fire with neither of the men behind the wheel, the authorities said.

> Mark Herman, the Harris County Precinct 4 constable, said that physical evidence from the scene and interviews with witnesses led officials “to believe no one was driving the vehicle at the time of the crash.”

They also attempted to get Tesla's side of the story, but Tesla didn't respond:

> Tesla, which has disbanded its public relations team, did not respond to a request for comment.

I'd be upset with the police for making such confident statements that were contradicted by later investigations. But the media, who may not have had access to the accident scene, seems to have done a reasonable job reporting, at least from what I've seen.

In an ideal world, maybe they'd do an independent investigation instead of relying on police statements, but it's a question of how to allocate resources across all things they could be reporting. It probably makes it more difficult to do this when Tesla won't talk to them.

[1] https://www.nytimes.com/2021/04/18/business/tesla-fatal-cras...


> They also attempted to get Tesla's side of the story, but Tesla didn't respond:

To be fair, that's a little disingenuous given that Musk was literally tweeting denials at press time. Tesla certainly didn't refuse to respond; they had lots of things to say. But the article didn't get them in time, so it went to press trying to imply they had something to hide.

As far as the media reporting the story: it's fine. It was newsworthy. The "there was no one driving the car" business from the fire chief seems in hindsight to have been the real failing. That quote seems to have been a lot juicier than was warranted. In fact it seems almost certain that someone was manually operating that car via some mechanism or another.


I don't think this was a case of them simply not getting a response in time. The NYT will usually add an update to the online article if they receive a response after it's published. And according to Electrek, Tesla refuses to comment on all press inquiries [1].

> Since dissolving the team at the beginning of the year, I had hoped that Tesla would try to build a new one, but it doesn’t seem to be the case.

> Instead, Tesla is leaving all press inquiries unanswered and doesn’t seem to comment on any story.

But I agree with you that the local authorities really screwed up here when communicating with the press.

[1] https://electrek.co/2020/10/06/tesla-dissolves-pr-department...


"We ignored the public statements the company was making because they didn't send it to us directly" is a pretty bad look.

Saying they didn't respond would be fine for almost any other topic, but it's bad reporting for this one.


> The NYT will usually add an update to the online article if they receive a response after it's published

Then why didn't they update this article? Again, it's not like Tesla didn't say anything relevant. You're making what amounts to a process argument: Tesla didn't respond in the right way and so the Times is allowed to claim that they refused to comment because it's technically true.

But "technically true" isn't what journalism is supposed to be about, and the Times is usually pretty good about that in its hard news coverage. Which is why you can tell that this is really a business/investment story being pawned off as news.


> To be fair, that's a little disingenuous given that Musk was literally tweeting denials at press time.

Tesla says that Musk's tweets about autopilot are disconnected from reality:

https://www.theverge.com/2021/5/7/22424592/tesla-elon-musk-a...

So why would and why should anyone else take his tweets seriously?


“Two killed in Autopilot crash”

“A Fatal Crash Renews Concerns Over Tesla’s ‘Autopilot’ Claim” (Wired)

“2 Killed in Driverless Tesla Car Crash” … “no one was driving the vehicle” (NY Times)

“Tesla on Autopilot crashes in Texas” (driving.ca)

“Tesla crash: investigators ‘100% sure’ no one driving car in fatal Texas incident” (The Guardian)

Reporters have a duty to verify their sources and information and not report on rumors or hearsay.


>Can you provide an example of a publication that "strongly implied or outright claimed" on their own that autopilot was involved?

Sure. Here you go.

https://www.washingtonpost.com/nation/2021/04/19/tesla-texas...

Headline: A driverless Tesla crashed and burned for four hours, police said, killing two passengers in Texas.

Lede: "Just before midnight Saturday, a Tesla drove swiftly around a curve, veered off the road, struck a tree and burst into flames in The Woodlands, Tex., a suburb north of Houston, police said."

"Driverless" is used here to relate the reader to "Driverless cars," aka "self driving car." Note that the police only said that the driver wasn't in the driver's seat at the time of the crash.

The lede puts the agency for the driving on the car: "a Tesla drove..."

Looks like NBC did it too:

The vehicle, a 2019 Tesla Model S, was driving at high speed when it ran off the road, hit a tree and burst into flames, he said.

https://www.nbcnews.com/news/us-news/2-dead-tesla-crash-afte...

Here's Yahoo News with a video, headline "2 killed after self-driving Tesla crashes into tree, bursts into flames":

https://news.yahoo.com/2-killed-fiery-tesla-crash-210032357....

But wait, isn't that "technically correct" (the best kind of correct) to what the original investigation said?

Sure! But then why did other news sources go to great lengths to avoid using the phrase? Here's The Verge:

https://www.theverge.com/2021/4/18/22390612/two-people-kille...

"Two people killed in fiery Tesla crash with no one driving"

"Preliminary reports suggest the car was traveling at a high rate of speed and failed to make a turn, then drove off the road into a tree."

Notice the hedging in the paragraph. Notice the distancing construction "was traveling."

Professional writers know the meaning of words. They know what those words mean to readers. They use them deliberately.


Your whole argument hinges on the fact that the publications are intentionally trying to smear Tesla and their AutoPilot feature. The fact of the matter is that those articles very clearly state that they're quoting what the police say about the crash.

From your The Verge article, which is supposedly the example of trying very hard not to divulge this fact:

> Authorities in Texas say two people were killed when a Tesla with no one in the driver’s seat crashed into a tree and burst into flames, Houston television station KPRC 2 reported.

It's the first line of the article.

> Preliminary reports suggest the car was traveling at a high rate of speed and failed to make a turn, then drove off the road into a tree. One of the men killed was in the front passenger seat of the car, the other was in the back seat, according to KHOU. Harris County Precinct 4 Constable Mark Herman told KPRC that “no one was driving” the fully-electric 2019 Tesla at the time of the crash. It’s not yet clear whether the car had its Autopilot driver assist system activated.

That's the third paragraph. I don't think they're hiding that they're parroting police officials.


>Your whole argument hinges on the fact that the publications are intentionally trying to smear Tesla and their AutoPilot feature. The fact of the matter is that those articles very clearly state that they're quoting what the police say about the crash.

I don't think it has anything to do with smearing Tesla. I think that people have anxiety about self-driving cars and publications know that, so they play to that. This isn’t about Tesla, or AutoPilot. It’s about getting the details right.

If these publications are claiming to quote the police, they're doing a shitty job of it, because the police said only that there wasn't anyone in the driver's seat at the time of the crash. The police didn't comment on Autopilot, and didn't comment on the car being "driverless" (which would imply that it never had a driver to start with, which it did. If someone takes their hands off the wheel to stop their kids in the back seat from fighting, does the car become driverless for that moment? If so, where are all the other news stories calling those crashes driverless? Car accidents happen all the time while the car is on cruise control. Is that driverless? Why not? It's essentially the same technology that Tesla has for adaptive cruise control).


I'm not going to dig up the article but you should be able to find it very easily:

A writer in the LA Times straight up called for federal regulators to step in based on this. I've seen newspapers go full activist before, but this was shocking because they were doing so based on what is obviously a false premise.

It doesn't take long to speak to a few Tesla owners and realize that autopilot, even non-FSD, doesn't operate this way.


These days, the cycle with the fake news is long gone by the time anyone knows any facts. The facts arrive too late. The clicks have been clicked. The damage has been done.

The media moves on to its next money-making event. No real repercussions are to be found.


> No real repercussions are to be found

This is simply not true.

Almost every day, media around the world are subject to libel lawsuits. Recently, Newsmax and Fox News were sued by Dominion Voting Systems and its employees, and Meghan Markle sued the Daily Mail for publishing personal letters.

And most of the respectable news organisations still abide by the core tenets of journalism e.g. multiple, vetted sources.


> And most of the respectable news organisations still abide by the core tenets of journalism e.g. multiple, vetted sources.

And which ones are those?


The ones that don't claim that no reasonable person would take them seriously when someone sues them for defamation (Fox has done this multiple times).


Every opinion host out there uses this defense when the inevitable suit is levied.

https://www.bizpacreview.com/2019/12/28/rachel-maddows-defen...


"Respectable news organizations" is a logical fallacy called "no true scotsman."


It would also be great if we got retractions of all the "burned uncontrollably for hours" reports where the firemen "couldn't put it out, and Tesla didn't respond to calls to help".

Turns out, that was all nonsense too:

https://www.caranddriver.com/news/a36189237/tesla-model-s-fi...


Seems like it's actually worse than what I read. Basically it says they would be screwed if they had to deal with a battery fire on a highway.


It was "strongly implied" because no one was in the driver's seat. Now there is evidence there was a driver, after an investigation. Why should they retract anything?


It would be good form, after publishing reactionary pieces prior to the results of the NTSB investigation, for those publications to take some responsibility for what they publish.

Better form would have been not publishing reactionary articles in the first place.


The actual reason they don't need to retract anything is that the sensational articles already paid off, and they're now off to write another unsubstantiated claim of a headline on something else.


Because lying tends to reduce trust.


So you are saying in the absence of evidence they claimed that there was no driver and they have no reason to correct that misinformation?


You do realize that it was the police who originally stated that there was no driver?


Why would a serious publication take statements of the police at face value much less as evidence of anything?

> It was "strongly implied" because no one was in the driver's seat. Now there is evidence there was a driver, after an investigation. Why should they retract anything?

A publication which does not retract or correct misinformation is a key part of the problem.


Oh, I didn't know AP had been a car feature for so long. I mean, dozing and drunken drivers must have "strongly implied" autopilot too, because no one was driving...


Another anecdote: an acquaintance let a friend borrow their Tesla Model X Plaid (or similar; I think this was before Plaid). Upon leaving their subdivision, the "friend" immediately totaled the X by launching it into a drainage ditch in Florida. I've never driven a Model X or any car with this much power. I am surprised by how someone can get into so much trouble in 550 feet, but clearly people are surprised and don't let up on the accelerator.


I just rented a similar Model X (P100D) last week. They are remarkably fast for the size of the vehicle, but I found the accelerator pedal to be the closest thing to an ideal throttle control that I've experienced. It's extremely gentle and forgiving in slow/close situations (e.g. parallel parking), has instantaneous response and backing off of it brings an aggressive regen braking effect that is very featherable (?) and useful.

I would guess they just panicked in a strange and extremely expensive car and weren't able to lift their foot off the pedal once doom was imminent. It happened to me when I was a kid: I ran over a fence while turning my dad's girlfriend's car around in a tight parking lot. I accidentally goosed the throttle while backing up, panicked, then stomped it to the floor while backing over that poor fence. Very strange experience, almost like my leg was being shocked and I couldn't control it.


It could also be the carpets. The carpets suck. They have good velcro to stick to the floor, but there's a terrible glue that keeps the carpet attached to the velcro and it softens and unsticks in the heat.

If you have those carpets, I would recommend sewing the velcro patch to the carpet so it doesn't come loose.


This is why companies like Mercedes use a very firm snaplock (button thing).


They’re used in cheaper brands too like Seat, and probably others from VW.


My cheap-ass Mustang uses button snaps. It's not just "companies like Mercedes"; the implication that only luxury car brands do it is wrong.


And Toyota.


Yep, I’ve been there too. Mom’s new Camaro; ironically, I hit my neighbor about 10 miles from our neighborhood. No one was hurt, but I have never dreaded a phone call to mom more, heh.


I was watching this video: https://youtu.be/i7yigpPSu_o

And he was arguing that the Plaid 0-60 time of 1.99 seconds had an asterisk: with 1-foot rollout.

Rollout is when you put your car's tires between the two light beams at the drag strip. When the lights change and the car starts moving, the clock doesn't start until the front tire clears the first beam; that point is the 1-foot rollout location.

What I found interesting

@ 1 foot a Tesla will be going 5-6 miles per hour

@ ~100 feet it will hit 60 mph

@ 550 feet... a P100D can probably be going pretty fast.
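For a rough sense of scale, here's a back-of-the-envelope sketch. It assumes constant acceleration, which no real car sustains, so treat it as an upper bound; the 60 mph @ ~100 ft figure above is the only input:

    import math

    FPS_TO_MPH = 3600 / 5280            # ft/s -> mph

    # Assumption from the figures above: ~60 mph at ~100 ft.
    v_100ft = 60 / FPS_TO_MPH           # 88 ft/s
    a = v_100ft**2 / (2 * 100)          # ~38.7 ft/s^2, roughly 1.2 g

    # Constant acceleration: v = sqrt(2 * a * d)
    v_550ft = math.sqrt(2 * a * 550)
    print(round(v_550ft * FPS_TO_MPH))  # ~141 mph, as an upper bound

Real acceleration tapers off well before that, so the true figure would be lower, but even a fraction of it is plenty to leave the road in 550 feet.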


Kind of a strange video. The crux of it is that Tesla is being misleading, but then it is pointed out that this is how the rest of the industry quotes 0-60 times. Does he really want Tesla to be the only "honest" one?


The problem is that the 0-60 time for the regular model is measured from a standing start, which is the standard for regular cars and which makes it appear worse, with no visual indication on the main page that the times are measured in different ways.

Only when going to a sub-page do you get an asterisk with a footnote mentioning the difference.


It's probably a whisky throttle situation. The immediate panic causes you to floor it more, thinking it's the brake.


Just a classic case of someone driving their extremely fast car just that bit too fast. Sad to hear.


Too early to say, I think.


A policeman would just write up "Too fast for conditions".


Possibly, though if the original articles were reporting that 'no one was in the driver's seat', then there must have been a fairly substantial impact to throw the driver into the rear of the car.

Plus, it was a 100D on a back road; I can't imagine they'd be doing a leisurely pace... Conjecture, of course.


Didn't one of the early statements from the investigation say that they knew that no one was in the driver's seat at the time of the accident? I don't know on what basis they would make that statement, and of course it could be erroneous, but all the more reason to remember that we all know so little we probably can't reason about it well enough for conjecture to be much better than random guessing.

edit: Indeed [https://www.cnn.com/2021/04/19/business/tesla-fatal-crash-no...]


Yeah, but despite his certainty the usual disclaimers about initial investigations apply. I think the officers weren't thinking about how many people would pick up on their words.


Back road? It says they made it 550 feet from his home in a cul-de-sac before leaving the road. What's the quarter-mile trap speed of a 100D, and what are the power numbers? How fast could it really have been going, under its own power, before gravity took over?


125 mph-ish is what they trap in a quarter mile; the 1/8-mile (660 ft) trap is in the 95-100 mph area.


Electric instant-on torque can be a dangerous thing. So assuming just under the 1/8-mile number, we're probably around what, 80-85 mph before leaving the road? That'll do it. I assume it's a TX cul-de-sac, similar to mine and not densely packed like my old one in CA. Plenty of space and temptation to do something ill-advised.
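A quick sanity check on that guess (a sketch only: under constant acceleration, speed scales with the square root of distance, and real cars taper off, so these are rough figures):

    import math

    # Assumption: ~95-100 mph trap at the 1/8 mile (660 ft), per the parent comment.
    for trap_mph in (95, 100):
        v_550 = trap_mph * math.sqrt(550 / 660)  # scale back to 550 ft
        print(trap_mph, "->", round(v_550))      # ~87 and ~91 mph

So 80-85 mph is in the right ballpark, if anything slightly conservative.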


200km/h in 10.5 seconds, apparently.


It was going 30, lol. I’m sure they just got in the back seat for no reason too.


And burning alive because they couldn't escape the car. No big deal.


This isn't uncommon in motor vehicle accidents, however.


The fleet of EVs is very different from the fleet of light-duty ICE vehicles. You can't do a 1:1 comparison and have anything useful come out of it. There is no EV equivalent of a rusted-out 1995 S10 with a gas tank held in with a ratchet strap for anyone to crash, because EVs haven't been around that long.

I suspect once you control for vehicle age the difference is marginal. Statistically nobody is burning to death in late model luxury sedans (Tesla included) because they're well enough designed that practically nobody is getting trapped in the vehicle in the first place.


In Hollywood crashes maybe, but in the real world auto fires are not common at all. As of 2018, there were 56 fires per billion miles driven. Furthermore, only 3% of all fatal automobile accidents involved fire.

https://www.nfpa.org/-/media/Files/News-and-Research/Fire-st...


The first sentence of the article you linked states that in one year in the US alone, 560 civilians burned to death in fires.

Wrangling statistics to try and suit your argument is one thing, but that's more than one person a day burning to death in a car.


Nearly as many die annually in bathtubs. As it stands, the average person could drive about 400k years before dying in a fire.

It's not playing games with statistics to point out the effects of large numbers.


[flagged]


> An estimated 212,500 vehicle fires caused 560 civilian deaths

It's what it says in the article you linked. I'm not sure why they specified 'civilian deaths'.

I never said it was common, I said burning to death in a car isn't uncommon.

I'm no EV proponent, by the way. You seem to be pretty fired up about it though, pardon the pun.

Edit: Added that I'm also not sure why civilian deaths is specifically mentioned.


There is a footnote on the phrase "civilian deaths" saying it excludes firefighters.


Still a strange wording. Firefighters are civilians.


Ah yep, that would make sense then


>> And burning alive because they couldn't escape the car. No big deal.

>This isn't uncommon in motor vehicle accidents, however.

Don’t gaslight me. You’re playing up vehicle fires to downplay the fact that lithium ion batteries burn easily and are hard to extinguish, and Teslas in particular are harder to exit when the power is out.

The saddest thing about this whole thing is that you’re carrying water for a charlatan.


> The saddest thing about this whole thing is that you’re carrying water for a charlatan.

Why do you think this?


3% isn't that small.

What's the percentage for Teslas?

I'd say the stupid doors are a much bigger problem than the ability to catch fire.


Do Teslas/electric cars burn more frequently in severe crashes than ICE vehicles? Last time I looked, a few years back, the data showed that ICE vehicles burned quite a bit more frequently.


That is a really good question, tbh. I've seen comparisons that show Teslas burning less often, but those weren't specifically post-crash. A lot of ICE car fires come from fuel leaks, often after major repairs and without a collision.

I'm not aware of a direct comparison of post-crash fires. Anecdotally, it seems post-crash fires are rare with the 3 and Y and much less rare with the S/X.


The average age of a car involved in a fire is over ten years. Electric vehicles should be underrepresented simply based on their relative youth.


Teslas are death traps. You can’t put out their fires with water, and since their doors aren’t mechanical you’re stuck inside once the battery dies. This fire took over 4 hours to put out, with the fire department not knowing how.


since their doors aren’t mechanical you’re stuck inside once the battery dies

This is maliciously false. There's a manual door handle, just like you'd find in every other car, to the point where people new to them almost always use it by accident instead of using the button.

https://www.teslarati.com/tesla-model-y-vs-model-3-differenc...

It's in an identical place on the other models. Please don't trot out hot takes like this on something you don't understand.


> and since their doors aren’t mechanical you’re stuck inside once the battery dies

Tesla's front doors have internal mechanical levers that can be used in the case of power loss.


This sounds like you just have a bone to pick with Tesla. Their doors can be opened when the battery dies.


Doors that don't open on a power failure sound super fucking unsafe. It looks like on the Model S they do open manually in front if you pull the handle back far enough, and in the back via a hidden release cable under the seat, which sounds less than ideal. I'm not sure if this is the case with the other models.

https://www.youtube.com/watch?v=01lXcD_Uz74 https://www.manualslib.com/manual/1234782/Tesla-S.html?page=...


The Model 3 has a manual emergency release in the armrest: it's one of the more awkward parts of the car, actually, because it's right where most people expect the door handle to be and the actual "door handle" is a button a little bit up.


The thing I continue to find interesting is the fear involved with an "autopilot" crash (even if this was not one).

Every day around the world, hundreds (1000s?) of people are killed in standard car accidents, whereas, as a proportion of the market, driverless cars cause a tiny number of crashes. Yet people panic, regulators bare their teeth, and protesters speak up as soon as a single accident is caused by (or assumed to be caused by) a driverless car.

I guess it is an illusion of control in normal cars, despite the fact that electronics should be better in almost every regard for safer driving.


> despite the fact that electronics should be better in almost every regard for safer driving.

Maybe in the future. For now electronics are still handicapped by AI that is not anywhere even remotely close to humans.

"AIs" ability to reason about traffic situations and, by extension, planning are laughable and even if that properly worked it would still be handicapped by image recognition I wouldn't trust to tell a trashcan from a car.

At this point it's pretty obvious that until better AI comes along, we're stuck with terrible systems. Certainly just pouring ever more resources into "current gen" NNs won't get us anywhere.


It's a matter of trust. Do you trust a robot to make decisions of life or death? You can't empathize with the robot, you know it can't empathize with you, and you instinctively don't feel like you can understand or predict the actions of agents that you don't empathize with and don't think empathize with you. Humans make mistakes, but they're understandable or comprehensible mistakes; with a robot you don't know what types of mistakes to expect. You don't have a theory of mind for a robot.


> Every day around the world, hundreds (1000s?) of people are killed in standard car accidents

Hundreds? Lol. In India alone, 150,000 people die in road accidents every year. That is over 400 a day in one country. So worldwide it must be well into the multiple 1000s.


5000 is still 50 hundreds if you want to get pedantic.


There is a group of people who are into cars and who agree that autopilot is generally safer. They consider themselves exceptional drivers (and to be fair, they often are pretty good drivers) and therefore think that in their particular case it's safer not to use it.

Of course, basing that on individual feelings is dubious (and perhaps there is, or soon will be, no person outperforming the AP).


I really don't understand how they can make this statement:

> The report states that when the car started, security video shows the owner in the driver's seat, contradicting reports at the time of the April 17 accident that the seat was empty when the car crashed.

Those two things aren't contradictory at all. The car's journey could have started with the driver in the driver's seat, yet the seat could have been empty when the crash occurred. A lot can happen in the time between the start and the end.


I suspect they tried to activate Autosteer, didn’t realize they had failed, and let TACC (which did activate) drive them straight into some trees. So much for automatic emergency braking.


Automatic emergency braking systems don't generally brake for stationary objects once the vehicle is going beyond parking speeds. They're really designed to reduce collision forces between vehicles in motion (or recently in motion), and some more recent systems try to reduce forces between vehicles and pedestrians.

A Tesla is happy to slam into a parked emergency vehicle at full speed, as has been demonstrated several times; it's not surprising they don't stop for trees either.
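To make that concrete, here is a minimal sketch of the kind of gating logic such systems are commonly described as using. Every name and threshold here is a hypothetical illustration, not any manufacturer's actual code:

    # Hypothetical AEB gating sketch; thresholds are illustrative assumptions.
    def should_emergency_brake(ego_mph: float, target_mph: float, ttc_s: float) -> bool:
        stationary = abs(target_mph) < 1.0
        # Above parking speeds, a stationary radar return is hard to tell apart
        # from roadside clutter (signs, bridges, parked cars), so many systems
        # filter it out rather than risk constant phantom braking.
        if stationary and ego_mph > 25:
            return False
        closing_mph = ego_mph - target_mph
        return closing_mph > 0 and ttc_s < 1.5  # brake only when collision is imminent

    print(should_emergency_brake(ego_mph=70, target_mph=0, ttc_s=1.0))   # False: filtered out
    print(should_emergency_brake(ego_mph=70, target_mph=40, ttc_s=1.0))  # True: moving target

The trade-off is deliberate: reacting to every stationary return would mean braking for overpasses and signs, so stationary objects get ignored at speed.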


A Tesla will, yes. But even a small Toyota Yaris will absolutely smash the brakes if you drive toward something above parking speeds. It's mandatory for a top EuroNCAP rating, so every 5-star rated car from 2021 has this.


An F150 will too. Are we sure a Tesla won't?


Teslas are 5 star rated, so how does that work?


My 2018 Subaru Forester will absolutely emergency autobrake at highway speeds. I've seen it happen when it detects a car exiting out of the lane as still being in the lane (road is curved so car is still in front, but I have no intention of following it).


> I've seen it happen when it detects a car exiting out of the lane as still being in the lane

The car it braked for was also in motion, or recently in motion, right?


Hmm, true, perhaps that was a bad example. It also brakes for cars that are waiting to turn (completely stationary) that it thinks I'm approaching too fast (say at 40 mph).


Not true for Subaru, mine emergency brakes with objects ahead at any speed


So it will emergency brake on the highway if something flies in front of the car? Sounds dangerous and unlikely.


It is possible to tell the difference between a fly, a bird, or a car...


Right. I somehow doubt Subaru is ahead of the game compared to Tesla on that.

I took a look at how Subaru EyeSight works. It uses a dual-camera system to identify a vehicle ahead and apply braking if it slows down, as well as pedestrians and other specific patterns. Some of it is limited to a 19 mph closing difference (surprise).

So the statement above that “mine emergency brakes with objects ahead at any speed” is, unsurprisingly, incorrect.

It doesn’t react to just any object and would probably happily hit a tree or a parked car under the right conditions. Happy to be proven wrong with a demonstration video of the car at full speed.


Uhh, no, it's totally correct that it will slam on the brakes.

Source: I own a Subaru, have driven it at a cardboard box in the middle of the street, and have had it slam on the brakes. I've also had it slam on the brakes when a car swerved into my lane, saving me from an accident. So... worth it.


A Tesla will also slam on the brakes in general. A car swerving into your lane likely fits the ~20 mph window that the Subaru system is designed for. But that’s not what’s in question in this comment thread.

Have you tested yours against a stationary emergency vehicle or a tree? Otherwise there isn’t much to discuss here.


Did you read what I wrote? I drove it down the street at a cardboard box and it slammed on the brakes and stopped.


Tell that to my Lexus, which 100% does beep and will slam the brakes automatically if I try to drive into a stationary car behind me.


Are you doing that at parking speed or highway speed? The rules are different.


This is not true for Teslas, and I've tested it (in a safe environment) in my own. It uses vision, as well as radar, to detect obstacles, and will brake from highway speeds to zero when there is a car stopped ahead.


I don't think it's guaranteed though. I've noticed the same as you, but I've also had it lock the brakes after seeing a shadow (lol) and I've had it speed up to 70 mph on Topanga Canyon Blvd (speed limit 45 mph) and nearly run into a parked semi.


Except the kind of radar used in a Tesla cannot do that for you, hence why the AP ignores static objects and crashes happily into parked cars, fire trucks, ...

Here's an explanation of why written on here a long time ago by @chowells: https://news.ycombinator.com/item?id=16239612

This user wrote:

To add even more detail, as there still appears to be a lot of confusion:

The radar systems in these vehicles send out a radio pulse in a broad approximate-cone forward. They get bounces back from everything that reflects radio in front of them. Distance from the object is calculated by time between pulse and response. Speed towards/away from the object is calculated from Doppler shift of the radio frequency.

There are two main things that these systems can't detect.

1. Speed of the object perpendicular to the direction of radio wave travel.

2. Location of the object within the approximate-cone the radio pulse travels in.

Note that thanks to the second, you can't calculate the first with higher-level object tracking, either.

So the data you get back is a list of (same-direction velocity component, distance) pairs. There's no way to distinguish between stationary objects in the road and stationary objects above the road, to the side of the road, or even the surface of the road itself.

Radar just doesn't provide the directional information necessary to handle obstacle detection safely.
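As a toy illustration of what that return data looks like (made-up numbers and simplified formulas; real radar signal processing is far more involved):

    C = 3.0e8  # speed of light, m/s

    def distance_m(round_trip_s: float) -> float:
        return C * round_trip_s / 2               # pulse travels out and back

    def radial_speed_ms(doppler_hz: float, carrier_hz: float) -> float:
        return C * doppler_hz / (2 * carrier_hz)  # radar Doppler, valid for v << c

    print(distance_m(533e-9))  # ~80 m for a ~533 ns round trip

    # All the sensor hands back is (radial speed, distance) pairs, with no bearing.
    # At 30 m/s, a stopped car in your lane and a sign above the road both look
    # like roughly (-30.0, 80.0); nothing in the data says which one you can pass.
    echoes = [(-30.0, 80.0), (-30.0, 81.5)]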


This varies based on exact hardware revision, software revision including OTA updates, and active settings at the time. I don’t think anyone but Tesla can make any definitive statement about the behaviour of any given car at any given point in time.


That's insane! That alone should make it illegal on the road.


I agree, and I can't fathom why this is an unpopular opinion. It's bad enough when my phone or computer suddenly works differently (or not at all!) due to a mandatory OTA update. It'd be terrifying if my car's safety features could do the same.


It's weird that the NTSB drove another Tesla on the same road and concluded that, because autopilot wasn't available for their car, it couldn't have been on in the crash.

Aren't vehicle logs a thing? What if there was a bug which mistakenly turned it on in the original car? What if there was a software update sometime between the crash and the investigation which changed the behavior? The article even says that the investigators recovered the damaged "black box", but a month later we have no details about it?

Edit: changed year to month /facepalm


> The car’s restraint control module, which can record data associated with vehicle speed, belt status, acceleration, and airbag deployment, was recovered but sustained fire damage. The restraint control module was taken to the National Transportation Safety Board (NTSB) recorder laboratory for evaluation.

Also, the accident was less than two months ago, not two years. It's likely they're still recovering and analyzing data.


Teslas log a tremendous amount of data, but my guess is that it was melted in the fire and unable to upload before it was destroyed. There is a black box with some logs, and I believe 8 photos that are taken when the airbags deploy, but perhaps the fire destroyed that too.


> The car’s restraint control module, which can record data associated with vehicle speed, belt status, acceleration, and airbag deployment, was recovered but sustained fire damage. The restraint control module was taken to the National Transportation Safety Board (NTSB) recorder laboratory for evaluation.


I believe that the NTSB found that the logs were destroyed in the crash and subsequent fire.


The report is linked in the article. They report that the storage for the media control unit (where the video goes) was destroyed. The restraint control module (more limited data capture including speed/acceleration and airbag status) "was recovered but sustained fire damage" and has been taken to a lab for evaluation. They might get something off of it.


The vehicle’s main storage was destroyed in the fire.


Wow, this investigation is garbage.

- The gross generalization of saying that it couldn't have been enabled because they failed to enable it in their specific situation.

- The gross extrapolation of saying that the driver must have been in the seat when it happened because the driver was in the seat when the car started.

I mean, how useless an investigation is that? Where are the log analysis, witness reports, and camera images from points closer to the crash site?

I feel very disappointed in the NTSB.


Nothing to see here. They were just going 30 and hit a tree randomly, then got in the back seat and burned to death over 4 hours in a fire that was impossible to put out, with the doors non-operational.

Oh, and I’m sure all these brake failures in China have nothing to do with it.

https://www.cnbc.com/2021/04/23/tesla-in-china-pressure-moun...


A false story by the media. The actual fire department said it took minutes to put out the main fire.

https://www.caranddriver.com/news/amp36189237/tesla-model-s-...


I see no confirmation of a 30 mph speed and it seems unlikely.



