> The horrific crash occurred at 9:35 p.m. at Market and Fifth streets after the traffic light turned green, giving the Cruise car and the other car — which had been waiting side-by-side for the light — the right to enter the intersection where a woman was walking, according to video of the crash.
In fact, the driver of the car did not have "the right to enter the intersection" if it meant they were going to run a pedestrian over, no matter what color the light was.
How did we get to the point where reflexive empathy for drivers over everyone else has completely trumped basic understanding of both the law and human decency?
> How did we get to the point where reflexive empathy for drivers over everyone else has completely trumped basic understanding of both the law and human decency?
This! I sometimes watch some of these dashcam accidents, and I can say most of them could be avoided if everyone had some basic human decency, but it always goes to an “I have the right of way, I don’t care” attitude. Yes, you have a green light, but you can obviously see the other car crossing the road; try to stop or avoid it instead of ramming it, especially when you are still accelerating and not already at full speed.
No, the robot was left holding the bag after a disgusting person drove into this pedestrian and, by hitting the pedestrian with their car, tossed them directly in front of the driverless car. The driverless car immediately freaked out and tried its best to pick a safe failure mode… in this case it picked wrong, and that’s a problem… but a human is clearly the primary offender here, and the driverless car was given an out-of-context problem because the human broke the law by doing a hit and run on this pedestrian.
It did try to handle it, by coming to a complete stop as fast as possible and avoiding further movement. I definitely think it should be smart enough not to stop “on top of an obstacle” under the car… but as far as everything before it stopped on the pedestrian’s leg, it seems like it would be hard to get a human driver to do better, given the likely lack of a lot of “hit and run happens right in front of me” training data (zebras vs. horses and all that). I expect, one, they will definitely add more “accident close by” data to the training set… and two, they will get the damn thing not to stop on top of an unknown obstacle.
>Pedestrians always have the right of way including when breaking the law.
This is not true, nor does it make sense. If you are in the middle of the highway going 70 mph and someone is standing on the side of the road wanting to cross, you do not have to yield and stop your vehicle. Stopping in the middle of the highway is dangerous since it isn't expected.
Pedestrians always have the right of way ... on roads where they are allowed to be. Pedestrians and cyclists are forbidden on 70 mph highways. If I ever see someone on foot on such a highway, after doing my best to avoid that person, I'll call 911 ASAP to report it.
For anyone mad enough to claim right of way toward a pedestrian, in any situation, just think about the shitload of legal hurdles awaiting you down the road. Not worth it by any standard.
In Washington state (I don’t know the CA rules), pedestrians have to yield to traffic if they are not crossing at an intersection, but they are still permitted to cross as long as they yield to traffic. The exception is if they cross between two intersections that are controlled by lights; that is considered jaywalking.
“Right of way” and “must yield” does not mean a car is allowed to hit a pedestrian who doesn’t follow the rules, of course. The pedestrian is just not following the rules and can be cited for that. But practically speaking, most collisions involving such situations are not faulted against the driver unless some other factor is involved. It’s still a mess for them, however.
They might not have the legal right of way, but you're still in trouble if you run them over. So by all means, just don't try to kill anybody even if you have right of way (I'm appalled I even have to say this).
Right of way has nothing to do with colliding with people or cars. If I enter a four-way stop second but ignore right of way and go first, that doesn't mean I'm going to hit another car.
This is correct—CA Vehicle Code § 21950. Unless the pedestrian literally jumped in front of the car, the car needed to stop, even with a green light. Of course, the actual application of the law can be subtle.
Oh come on, you're really just twisting semantics to the extreme to try to score a political point. It's obvious the article wasn't actually saying the car had the right to hit the woman.
In what way is pedestrian safety political? I have no political interests in this topic other than not wanting to be run over while walking around my city. I literally cannot trust drivers to follow the laws and plan my walks accordingly because I know drivers will ignore stop signs, red lights, etc.
The way I understand it is: There are two cars waiting to go through an intersection. The Cruise and the Human-Driven car. The light turns green, and the Cruise's pathway through the intersection is clear.
A pedestrian is crossing the intersection in the path of the human-driven car, but not in the path of the Cruise.
Both cars enter the intersection, the human-driven car strikes the pedestrian, who then rolls off the human-driven car into the path of the Cruise, and is run over by the Cruise.
Of course, I haven't seen the accident, but if that's broadly correct, I'm not sure that this is the damning evidence of "driverless cars are unsafe" some people seem to be spinning it up to be.
It is a tragedy, but it seems to be a tragedy caused primarily by a human driver who struck a pedestrian, and a human pedestrian who ignored a traffic signal.
Should the bar for driverless cars be "able to predict when other vehicles will hit a pedestrian and roll that person in their path"?
Would a human-driven vehicle have done any better? If the places were reversed, would the Cruise have struck the pedestrian in the first place?
I think we're all right to be skeptical of machine-driven cars, but I'm not certain that this specific incident is a point against them.
Could the Cruise vehicle even "see" the pedestrian in this case before they'd already been struck by the other vehicle and thrown into the path of the Cruise?
> Should the bar for driverless cars be "able to predict when other vehicles will hit a pedestrian and roll that person in their path"?
No, but in the case of pedestrians crossing the street it may be reasonable to expect a driverless car to avoid getting into a situation where that can happen. This will depend on details of traffic laws where it is operating.
What are the clearance rules in California for staying clear of pedestrians crossing the street?
Here in Washington, the rule is that if they are in your half of the street, or within one lane of your half, you have to wait. For purposes of this rule, your half of the street is all lanes moving in your direction.
No Washington driver should be going until either the pedestrian finishes crossing or there is at least one lane between the lanes in the car's direction and the lane the pedestrian is in.
If a human driver here goes early and hits the pedestrian and the pedestrian ends up in an adjacent lane they should still be safe from other cars that did not go early.
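To make that rule concrete, here's a minimal sketch of the clearance check in Python (illustrative only; the lane numbering and the assumption that the driver's direction occupies the lowest-numbered lanes are my simplifications, not the statute's text):

```python
def must_wait(pedestrian_lane: int, my_lanes: set[int]) -> bool:
    """True while the pedestrian is in the driver's half of the street
    or within one lane of it, per the Washington rule paraphrased above.

    Assumes lanes are numbered across the street starting from the
    driver's curb, so the driver's direction holds the lowest numbers.
    """
    return pedestrian_lane <= max(my_lanes) + 1

# Four-lane street, lanes 0-1 moving in my direction:
print(must_wait(2, {0, 1}))  # True  -- pedestrian within one lane of my half
print(must_wait(3, {0, 1}))  # False -- a full lane separates us; safe to go
```

A real implementation would need actual lane geometry and direction of travel; this just encodes the "one lane buffer" idea.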
In California, aside from pedestrians in a crosswalk having right of way, a driver approaching a crosswalk with a pedestrian:
> ... shall exercise all due care and shall reduce the speed of the vehicle or take any other action relating to the operation of the vehicle as necessary to safeguard the safety of the pedestrian.
In addition, all of the rules for pedestrians (no jaywalking, don't dash off the curb in front of a car, even in a crosswalk) include a provision stating that they do not "relieve a driver of a vehicle from the duty of exercising due care for the safety of any pedestrian within the [roadway or crosswalk]".
> ... shall exercise all due care and shall reduce the speed of the vehicle or take any other action relating to the operation of the vehicle as necessary to safeguard the safety of the pedestrian.
This is great in principle but in practice cars will just drive straight at you even when you are crossing on a walk signal.
Coming from a country where the walk signal means only pedestrians can be in the crosswalk this almost got me killed the other night.
I was hoping self driving cars would help things but it seems they drive straight for you too.
This regionally varies. Where I live (also in California), cars stop for pedestrians who appear they are about to cross at uncontrolled intersections. Other places it feels like playing a game of chicken.
The change on twitter to not show replies, mixed with multi-message splitting is infuriating and should 100% disqualify twitter links from being posted. (I feel the same about paywalled articles FWIW)
Depending on the geometry of the scene, it's entirely possible the pedestrian was not visible to the self-driving car until after they were struck by the first car.
The state I learned to drive in didn't have an anti-gridlock law[1], so I had to look this up for California. The text of the anti-gridlock law is:
> Notwithstanding any official traffic control signal indication to proceed, a driver of a vehicle shall not enter an intersection or marked crosswalk unless there is sufficient space on the other side of the intersection or marked crosswalk to accommodate the vehicle driven without obstructing the through passage of vehicles from either side.
Assuming the pedestrian counts as being on the "other side of the intersection" and prevents there from being sufficient space to "accommodate the vehicle being driven" I think you are correct.
1: They would put up a "do not block intersection" sign anywhere that this became a problem, and then you could get ticketed for disobeying a road sign.
Interesting to know. I don't drive, but I've seen enough to know that getting a car in the way of other cars (or, to a slightly lesser degree, pedestrians) without already having a way out is a dice roll. I thought it might not be allowed.
> In fact, the driver of the car did not have "the right to enter the intersection" if it meant they were going to run a pedestrian over, no matter what color the light was.
I've been in this situation before. I'm the third car in a line going through on a green light, and all of a sudden an unhoused neighbor pops into the intersection, crossing. I slam on my brakes because I don't have the "right" to hit him, but damn, that was close.
If someone essentially jumps into traffic, you probably will be found to be not at fault if you hit the person. Not that you wouldn't do everything in your power to avoid it.
In many cases you won't be found at fault, and probably in the vast majority of cases you won't be found criminally at fault (and in some jurisdictions, you are at fault just because you are in a car, and it doesn't matter how poorly the pedestrian behaved). However, it would still be a mess physically and mentally. No one wants that experience in their head for the rest of their lives.
An argument regularly put forward by Ashley Neale (YouTuber and driving instructor, and erstwhile soccer player) in the U.K. Not being found at fault isn't the only factor.
I think this is probably just bad wording on the part of the author. But I do think it's bad wording partially caused by a messed up view of the world vis a vis pedestrians and cars.
Going to hold up the mirror here… how can you interpret this and excuse the driver from their responsibility to look where they are driving, to check that the road ahead of them is safe and clear of danger to their fellow citizens, be they motorists or pedestrians? They should never have an expectation that the road is clear… it’s their responsibility as a driver to continually be making sure of this fact… this is why we prosecute reckless driving, dangerous driving, driving without due care and attention, and a hundred other names from jurisdictions around the entire world… because if you push the gas pedal, you’re fucking responsible for where the damn car goes… outside of extremely specific circumstances where we accept that people may have good reasons for their car not going where it should (faulty brakes and that sort of thing), it’s the damn driver’s fault and they are responsible for operating the vehicle.
Um… the pedestrian was in the crossing area before the light went green. This isn’t the infamous suicide-by-Tesla type of scenario here… this was a pedestrian clearly already in the intersection. The intersection may not have been well enough lit, but then once again it’s the driver’s responsibility to turn on and use their lights for safety. The pedestrian clearly had the right to enter the intersection, and while they may not have cleared it quickly enough to avoid being there when the light changed, I’ve seen shitloads of lights around the world that are set to absurdly short pedestrian crossing times, which genuinely require an adult to move at a brisk jog or faster to get across before the signal moves from “pedestrians only” to “pedestrians shouldn’t enter the crossing area and cars are now given the green light to begin moving”.
And while the full story has not yet come out, the story is quite detailed for a hit and run. We know the pedestrian was struck and rolled up over the bonnet and over the car roof before their trajectory carried them toward the edge of the roof, at which point they rolled off the rear side of the car and hit the road directly in front of the driverless car… this pedestrian got dumped basically from the sky, as far as the AI on the driverless car can understand… There’s going to be a lot more evidence recorded by the driverless car, likely including video with the offending human driver’s license plate on it at some point… and I expect a detailed account of the accident will be possible once investigators correlate the current information and witnesses with whatever data from the driverless car’s sensors Cruise chooses to, or is forced to (by the NTSB, etc.), release.
"The video from the autonomous vehicle showed the front and left side camera angles and started when the AV is stopped at a red light, to the right of the suspect car, at the intersection of 5th and Market Streets. The light turned green and both cars proceeded through the intersection and approached a crosswalk. As the two cars approached the crosswalk, a woman was seen walking across the crosswalk despite the oncoming cars. She unsuccessfully tried to beat the manned green vehicle to the left side of the AV."
"When the pedestrian was hit by the green car, she landed on the hood of the car, flipped over the roof and rolled off the right side of the car. She slammed onto the pavement and landed right in front of the AV. The AV brakes engaged as soon as she hit the pavement and then stopped on top of her. She landed parallel to the lanes. It’s unclear from the video if the front tires ran over her or if she just ended up underneath the car."
Promoters claim self driving cars are safer than human drivers.
This is based on a rosy interpretation that deflects the blame for most of the accidents they are involved in.
Self driving cars are in accidents much more frequently than human drivers. However, most of these accidents are rear end collisions --- naturally blamed on the human driver.
But the reason human drivers rear end self driving cars is the fact that these cars are programmed to hit the brakes any time they can't comprehend the current situation --- which apparently happens quite a bit.
The reality is, human drivers can't comprehend or anticipate a driverless car hitting the brakes for no apparent reason.
Furthermore, we should expect a constant stream of news headlines about self-driving cars being involved in accidents, even when self-driving cars are safer than humans because they are newsworthy and interesting. No one writes "dog bites man" articles, they just cover "man bites dog" stories.
This is a bad faith comparison not using apples to apples. Humans drive in all sorts of conditions that self driving cars flat out don't, and this data doesn't control for that at all.
> Humans drive in all sorts of conditions that self driving cars flat out don't
That's exactly why they are not allowed in all sorts of conditions! Where did this all-or-nothing demand come from? If a self-driving car shows fewer accidents than humans in Phoenix, why would you apply numbers from e.g. Montana? How is that apples-to-apples? And if it's shown safe in Phoenix conditions, why not let it run in Phoenix, just like the current approach?
It's fine to point out the flaws in this data, but do you have any better data to point to?
The humans are five times worse in that data. That is a very large gap. Even after adjusting for driving conditions, with a gap that large, self-driving cars are likely safer.
Also, don't assume a comparison is made in bad faith, just because it has flaws.
A comparison like this from any trained statistician would be treated as borderline fraud, and is akin to the "we proved cigarettes are safe to smoke" studies done in the '50s and '60s.
So to be clear, this is not a "flaw". This is a catastrophic, wrong from the get go, not mentioned and no effort made to control for, failure.
Deliberately misrepresenting data, or in the most charitable case, by sheer mass incompetence, when it involves the safety of people is never ok.
The fact that automated vehicles operate with much broader safety margins with regard to driving conditions definitely contributes to them being safer overall - humans choosing to drive in worse conditions objectively makes them more dangerous drivers. I also suspect that what others are getting at - that under the same driving conditions as robocars, humans have a lower accident rate - isn't true, considering the bulk of motor vehicle accidents occur during normal commutes during daylight hours in nominal weather at low speeds. I suspect that even if you control for the elderly, the drunk, and inclement weather, humans will still have worse statistics. I think humans have a tolerance for risk and ambiguity that makes for statistically worse, albeit more predictable, driving behaviour. But much like you, I don't have any better data to back this up.
Given the right-of-way privileges ice road trucks are given (vs. normal cars... since they can't stop easily!), it seems like this would be a better first application of self-driving than more chaotic environments.
Waymo had 20 accidents by February of this year but categorized 18 of them as "minor collision events".
Unless the authors similarly recategorized the driver data, that comparison is suspect.
For instance, 1.24 is the number of injuries per million miles for human drivers, so it doesn't sound like the 2.98 figure excludes minor ones.
The reality is the jury is still out on self driving car safety. Driving a few million miles and extrapolating isn't terribly useful especially when you only drive at low speeds.
There is a hypothetical potential to reduce fatalities but it is still very hypothetical.
- States "Statistics researched: 18", but never tells us what those were.
- Finally, it has the most inane, clearly ChatGPT-written commentary:
> This statistic is a testament to the safety of Waymo self-driving cars
> This statistic is a powerful testament to the safety benefits of Tesla Autopilot technology
> This statistic is a powerful reminder of the potential life-saving impact that self-driving cars could have on our roads.
> This statistic is a stark reminder of the public’s apprehension
> This statistic is a powerful indicator of the potential safety
> This statistic is a powerful indicator of the public’s confidence
> This statistic is a testament to the progress being made
> This statistic is a testament to the safety of Waymo’s autonomous vehicles
And so on.
Why wouldn't you even read the website you linked?
Oh well, at least you can take a free "Personality Leadership Test" to see if you're a Stable Diffusion version of Mark Zuckerberg, Bill Gates, or Elon Musk.
“A pedestrian crossing a San Francisco street on Monday night was hospitalized in critical condition after she was hit by two cars — first a regular vehicle which hurled her into the path of a driverless taxi that then ran her over, stopping on top of her as she screamed in pain, according to witnesses, investigators and the autonomous taxi company.”
It’s a little difficult from the description to know what a human would have done in the second car. Can’t tell how much time there was to stop.
Also, when driving behind someone you should always be prepared to stop. It doesn’t matter why the car in front stops.
It's fair to argue that a human driver may well have hit her. It's less obvious that they would have parked on top of her until someone came to lever the car off of her.
No, you'd get out, realize what was happening, and get back in the car and put it in neutral or otherwise gently move it 6" to stop crushing the person.
I think in a scenario where this just happened you'd want to remove the pressure immediately. The pressure is going to lead to lower or no blood flow to the limb, which will greatly increase the chance of losing it.
They had to drive the car remotely, right? That wouldn't have 10% of the finesse of a human in the front seat with a hydraulic brake pedal. But I wasn't there, so this is just speculation.
What surprises me is that people didn't lift the car off her. 3-4 adults should be able to lift even an EV a few inches.
Just removing the object crushing a person could result in their death. You need medical professionals on hand to ensure that removing the object doesn't lead to untreated issues.
> Human drivers tailgate. If you have a safe cushion in front of you, an erratic stop won’t cause a collision.
This is why in my state, if you rear-end someone then you are automatically presumed to be at fault. "They slammed on the brakes unexpectedly" is not a defense. You should have left adequate braking distance so that wouldn't cause an accident.
I’m genuinely surprised “human drivers can't comprehend or anticipate a driverless car hitting the brakes for no apparent reason” is an argument people can relate to.
You absolutely should be expecting the driver in front to slam on the brakes at full force without warning. There’s an infinite number of things that could cause any driver - human or autonomous - to do this. If you drive without this expectation then you’re a very dangerous driver and I wouldn’t want to be anywhere near you on the road.
A common insurance fraud trick is to change lanes close in front of someone and then slam on your brakes to cause a rear-end collision, then file an injury claim for back or neck pain against the other driver. They don't have time to react but are presumed at fault. One of many reasons to have a dash camera in your car.
That doesn’t really have anything to do with what we’re talking about, but many of those types of accidents can also be avoided.
There’s a significant number of people who feel personally attacked by a car merging in front of them, despite the fact that a merging vehicle has right of way (in all countries I’ve ever driven in). There’s a video that keeps popping up on my Instagram where a car changes lanes, and the car behind has plenty of time to react and slow down slightly, but they instead crash into the merging vehicle intentionally without touching the brakes at all. Most of the comments side with the driver who caused the incident, not the merging vehicle who did nothing wrong.
Anyway, my point is, there are a lot of people who refuse to give space to merging vehicles, which increases the chance of these accidents happening (fraud or not).
If you put your ego aside and immediately back off when a vehicle in front merges, you’ll be far more likely to avoid this type of accident. Alternatively, if you tailgate a merging vehicle out of “principle” you’re just inviting an accident.
It's the fault of the person who didn't have enough room to stop. Full stop.
It doesn't change the fact, though, that human drivers don't slam the brakes out of the blue because their "AI" got confused about nothing in particular, and that doing so is dangerous.
You are conflating “unexpectedly” with “no reason”. The parent comments said “unexpectedly.”
There is a world of difference in those word choices. Saying “no reason” certainly gives you an excuse to write a sanctimonious comment like this, but it’s misrepresenting what was said.
Yeah, the tailgating makes no sense unless you're preparing to pass. When I catch myself doing it, it's because I'm emotional and feeling rushed, and I take a breath and remind myself to back off until it's safer to pass.
During a freeway slowdown I try to dampen the brake-waves by going slow and not braking so much, but if I leave too much space in front of me (for coasting), another driver moves in and we're back to playing accordion.
If all cars could sync up and move together like train cars on the freeway there'd be less brake dust and more efficient travel, but then I'd rather take a train anyway.
Heroic. I try to do this too but everyone I know thinks I'm being neurotic when I explain it to them. I still see so many people accelerating into stoplights/brake scenarios.
> I still see so many people accelerating into stoplights/brake scenarios.
Every time I have to use the brake - rather than downshifting - for anything other than holding the car in place, it feels like I've committed a failure in planning and/or situational awareness.
I think it should be seen the other way around. Why should I have to stop moving, meaning using my brakes and subsequently the gas, just because the other drivers wanted to rush up to the light and wait? It's much easier for everyone if we do this smoothly.
All those people you caused to miss a left-turn light because you didn't feel the need to clear the turn-lane entry, because you wanted to keep slow-rolling, are cursing you. I have had that exact situation more times than I can count, and every time I get angry at the other driver who thought "Why should I have to stop moving?"
National Law Review (look it up or see the associated link below):
> Human drivers tailgate. If you have a safe cushion in front of you, an erratic stop won’t cause a rear-end collision.
That's the theory --- and the misinterpretation.
It's true that almost all human drivers tailgate at some point --- but they do so while being involved in half the accidents that self-driving cars are: 4.1 accidents per million miles versus 9.1 for driverless.
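For scale, here's the arithmetic behind that claim, taking the quoted figures at face value (a quick sketch, not an endorsement of the underlying data):

```python
# Per-million-mile accident rates as quoted above (Carsurance figures).
human_rate = 4.1       # accidents per million miles, human drivers
driverless_rate = 9.1  # accidents per million miles, self-driving cars

ratio = driverless_rate / human_rate
print(f"Driverless cars crash {ratio:.1f}x as often per mile driven")
# -> Driverless cars crash 2.2x as often per mile driven
```

Of course, the ratio is only as good as the figures that go into it.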
> National Law Review (look it up or see associated link below)
Your Carsurance source cites the National Law Review, which in turn cites the Carsurance source for that statistic [1]. It’s important to consider when they polled the data and how they’re segregating by manufacturer.
EDIT: Found the study [2]. It’s for 2012 to 2015. Given these are decade-old data, I’m actually surprised the gap is only 2x while fatality and injury rates were already lower.
> However, most of these accidents are rear end collisions --- naturally blamed on the human driver.
Rear end accidents are practically always the fault of the person who didn't stop in time. If someone rear ends a driverless car, even if it stopped because of some glitch, that driver is at fault. They shouldn't be driving around city streets with such speeds and limited following distance that they couldn't stop in time.
Exactly. The car in front might be reacting to some road hazard that just appeared that is not visible from behind it or it may have some mechanical failure. It doesn't matter why it comes to a stop, it is always and all times the responsibility of the trailing car to maintain enough distance to be able to come to a complete stop given the speed and road conditions.
It’s easy to say that but you can’t just ignore the reality that this is how people drive, and self driving cars in that environment do lead to more crashes.
I'd argue we should bar people from driving who collide with stopped vehicles without any other extenuating circumstances. Maybe adding self-driving cars to the mix will increase these collisions for a few years, but if we bar people from driving for doing these kinds of accidents we'll quickly weed out these massively unsafe drivers from the driving population and be much safer overall. Maybe we'll finally start decreasing the pedestrian fatality rate instead of increasing it.
If you refuse to drive safely you have no place to be on the road. We shouldn't just accept people driving in massively unsafe fashion.
I wouldn't mind this but it's completely politically unfeasible. There is absolutely no chance of political support for banning people from driving for anything other than a DUI. Even "accidentally" killing people or being so old that you can't see enjoys wide support of still being allowed to drive. As long as that is true self driving cars need to be able to handle bad drivers.
> Promoters claim self driving cars are safer than human drivers.
And opponents always ignore or brush off the fact that humans have proved themselves to be terrible drivers. A human driver ran over this woman first, yet you didn’t even acknowledge this anywhere in your argument against autonomous vehicles.
Everyone is asking "is AI better?", not saying humans are good.
As terrible as humans are, we do have a very important trait: being able to handle unexpected information efficiently.
All the current AI models do have ways of generalizing slightly, but only slightly. You still need to explicitly train "when there are cones you need to switch lanes" and "when there are barriers you can drive on the line" kinds of things.
And while, yes, everyone mostly talks about low-stakes examples, that is mostly because the high-stakes ones haven't been handled by AI yet.
E.g. a Tesla driving under a semi because it thought it was a billboard.
I look forward to any replacement for human drivers (although eliminating cars from metro areas would be much more efficient, but I digress), but autonomous vehicles need to be better before they can be that.
> E.g. a Tesla driving under a semi because it thought it was a billboard.
People don’t like what they can’t understand. Most of the accidents caused by autonomous vehicles are weird and probably would have been avoided by human drivers. However, this isn’t the whole picture, because those same vehicles also avoided a bunch of accidents that humans would have caused.
My interpretation of the data is that autonomous vehicles are already statistically safer than human drivers.
Problem is, humans are emotional beings who aren’t always capable of pure rationale. If you tell someone they’re statistically safer in a Tesla, but it might accelerate into a wall at 60mph, many people won’t take that risk.
>As terrible as humans are we do have a very important trait. Being able to handle unexpected information efficiently.
You’re joking right?
The human driver didn’t handle the “unexpected information” of a human being on the road when the light was green.
The human driver didn’t handle the “unexpected information” of running over a human being and drove off instead.
Humans are awful at unexpected information. This is exactly the moment that emotions take over and emotions are what makes humans really bad at tasks like driving.
A Tesla drove under a semi because "that flat thing isn't moving it can't be a car it must be a billboard".
A jaywalker in Arizona got run over because "people are only in crosswalks" (I don't remember the company offhand).
Autonomous driving isn't automatically safer. Companies keep focusing on the potential for improved safety without technically claiming they are that much safer.
The potential is great but that means nothing. What matters is what we make.
Americans drove 3.2 million million miles (a.k.a. 3.2 trillion) in a year.
Comparing that to things that have managed, in total, barely 1/100,000th of a year's usage means anecdotes don't matter.
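A quick sanity check on that scale claim (a sketch: the cumulative AV mileage below is an assumed order of magnitude, not a sourced figure):

```python
# US human driving: ~3.2 trillion vehicle-miles per year (figure above).
human_miles_per_year = 3.2e12
# Assumed cumulative self-driving miles to date: tens of millions.
av_miles_total = 3.2e7

fraction = av_miles_total / human_miles_per_year
print(f"AV mileage is ~1/{round(1 / fraction):,} of one year of human driving")
# -> AV mileage is ~1/100,000 of one year of human driving
```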
And there is at minimum one unavoidable aspect here: we don't know what accidents an AI could prevent, and, far more importantly, we don't have enough data on what accidents an AI would cause at highway speeds.
Insufficient data is available for how effective AI is at preventing accidents.
The Arizona one I gave, for instance: the company was found not liable since a person might have done the same, but could a person have slammed on the brakes, turning a fatality into an injury? We will never know and can never know.
The best we can do is measure real world performance and pay close attention.
> In the US, it’s always considered the fault of the person who rear-ends someone.
This is simply false. Brake checking is explicitly a traffic offense in some states, and can get you cited under something like reckless driving in some other states. Your insurance company is not obligated to find the other driver at fault if you're cited for brake checking.
First, if you post something false, prepare to be corrected. "In the US, it’s always considered the fault of the person who rear-ends someone" just isn't how things work.
Second, I'd expect Cruise to at least have civil liability if their cars glitch and brake spontaneously, resulting in an accident. Since spontaneous braking is something autonomous vehicles have struggled with (Tesla at least), yes, it's relevant.
I've been brake-checked by a Cruise car in SF. I had a dash cam, and had I hit it, I'm confident I would have been cleared and the robo car that stopped suddenly and randomly in the middle of traffic would have been at fault.
They literally have signs on them that say "warning, stops suddenly". Not a joke; literally.
Both of the cars were stopped beside each other at a stoplight when the pedestrian was crossing the road.
Here is what a human driver could have done in this situation that the Cruise car did not:
1) Look over at the other driver for whom the coast is not clear. Are they paying attention? Do they see the pedestrian? Are they texting on their phone?
2) Be aware that sometimes distracted drivers will start moving their car again when they see that the car beside them has started to drive.
3) If the other driver is distracted, wave to try to get their attention. Roll down your window. Yell "hey!" and point to the pedestrian. As a last resort you can use your car horn, but that may make them lurch forward into the pedestrian.
4) If your attempts to make sure the other driver is paying attention have failed, you should NOT go, even if the light has turned green, until you have ensured that the pedestrian has safely crossed the road. You should be prepared to motion to drivers that come after that they also need to stop, to prevent the pedestrian from being run over by multiple cars.
I get it, all this situational awareness stuff is difficult to code, but it's an important part of driving.
That is an extremely provocative photograph. Where are the Cruise PMs trotting out the video footage from the car now? In their last incident with a fire truck [1] they showed a journalist video from the vehicle. How could such footage possibly show them in a good light now?
It's all very complicated. I'm not going to die on the hill of Big Autonomous Vehicle, at the end of the day, someone, somewhere will make a good AV, and maybe Cruise's in particular just suck.
But if you just shared photos from Fifth through Seventh on Market, where this crash occurred, that would also be provocative. It would provide some context that a Cruise PM cannot possibly discuss. If you shared photos from lethal highway crashes, that would be extremely provocative.
Surely lawmakers who put together the CPUC authority and those rules considered the possibility of lethal and provocative accidents with AVs. So nothing is going to change.
I welcome David Chiu to come to Fifth and Market and deliver an intellectually honest plan to reduce mortality among people there crossing the street that involves reducing AV driving during the day.
I love how the media doesn't care at all about pedestrians until a driverless car is involved. Then again, CNN will have trouble running their car and truck advertisements if they reported on the actual damage that these companies have done to our country.
The framing for these reports is always the same: the journalist refers to other recent incidents involving autonomous vehicles, ignoring the very high base rate of incidents not involving autonomous cars. Accidents that don’t involve autonomous cars are not interesting to them or their readers because they are so common.
If the SF Chronicle actually wanted to inform their readers they would publish a table of all motor vehicle crashes in the city by type of vehicle.
The thing that stands out to me is that the driverless car parked on top of the victim until the fire department arrived and extricated her. There are few vehicle-pedestrian crashes where the offending vehicle parked on top of the victim and wouldn't be moved; screaming bystanders are usually effective at compelling a driver to roll off the victim.
This isn't a simple matter of "humans good drivers, robots bad drivers"; it's an example of "humans can improvise and respond reasonably to novel situations; robots cannot reliably interpret and respond to novel situations."
It’s not at all clear to me that a hypothetical scenario where the driver moved the car after running over the person would have a better outcome than waiting for first responders to jack up the car (see e.g. https://news.ycombinator.com/item?id=37753867).
You claim "there are few vehicle-pedestrian crashes where the offending vehicle parked on top of the victim and wouldn't be moved" - what is your evidence for that? 2.3 million people are injured in car accidents in the United States per year.
Finally, it is plausible that AVs will have a higher frequency of certain very specific failure modes relative to human drivers, while having dramatically lower overall frequency of accidents where people are injured. This type of reporting obscures that.
> According to Cruise, police had directed the company to keep the vehicle stationary, apparently with the pedestrian stuck beneath it.
> While spokespeople for the Police Department did not have an officer’s report Tuesday morning to clarify exactly when police made the order, they said it is standard procedure not to move cars at the scene of wrecks — regardless of whether a human is operating them.
It’s because the nature of some of these accidents and incidents involving autonomous vehicles is that they happen under circumstances that would be very unlikely with even the most careless driver.
They highlight the helplessness and cold hard reality of being at the wrong end of a miscalculation.
So even if the overall number of accidents is less, the number of times a human driver would park on top of a victim of a hit and run and turn on the hazard lights is near zero.
Cruise gets a ton of press just from being on the road, compared to human-driven cars. There are no stories about a person buying a car and driving it around their city, because that's not interesting or novel.
At some point it won't be news when driverless cars are involved in accidents but for now they are. Personally, I think the public should be informed about what a new technology is doing and capable of whether it's good or bad. Especially one that can kill people rather easily.
Right now Waymo is operating at a larger scale than Cruise, and driving in more difficult conditions, yet seems to have an order of magnitude fewer problems. The fact there's no way to objectively differentiate the two seems like a regulatory failure.
I don't know about Cruise but Waymo has a training video[1] that shows a first responder hitting a button to contact dispatch, then explaining that they're a first responder and they need to move a vehicle, and finally being given control. It looks like the process could take 30 seconds in the best case scenario, which is likely too long in many life or death scenarios, but they do at least have a process.
Good, but in a poor reception area, or in an earthquake or more conventional cell network outage situation, a link to the control center may not be accessible.
This seems like a pretty straightforward, but overlooked, systems engineering concern.
Yeah, I think we're in agreement that the current solution is not very good. I can understand why they opted for it initially though, given the thing that happened with protesters putting traffic cones on AVs to disable them. Imagine if instead of doing that, protesters hijacked cars and drove them into situations the AV couldn't easily navigate out of, like up onto sidewalks for instance. The bombastic headlines write themselves. In the end I bet they end up making a kind of "master key" that works with all AVs, distributed among first responders. I'm in the "it'll get sorted out eventually" camp on this one.
At this point autonomous cars have been in multiple incidents that have blocked traffic, impeded emergency vehicles, and now sat on top of accident victims. I understand it can be tricky with multiple human orgs and interactions in play, but the time to sort it out is now, at limited scale. It only gets harder to implement reasonable standards later.
A rolling, camera-laden sensor platform seems like a poor target for vandalism; a simple label reading "recording starts when emergency controls are accessed" would seem to discourage most, and the car doesn't need to move fast or far in emergency situations.
The editor tried REALLY hard to get that driverless thing in front in the title... fucking scum.
"Woman, on red light, gets hit by human driver and gets yeeted onto autonomous vehicle", ah, yes, make the article about autonomous car that just stood there.
The slant on these articles is so wild. It's probably more noticeable because I'm generally a fan of AVs. But wow, if you're a journalist writing this go take a look in the mirror and ask what happened. She was literally hit by another human driver who then fled the scene of the accident. I'm sure we'll find out more about the AV, since you know, it didn't flee the scene, but it's so funny to see this article focusing on the AV.
Half the article is about the AV. I’d encourage the curious to read it. There’s a lot of info about how the Cruise vehicle responded and how people had to work around the vehicle’s response.
I mean, the police directed Cruise to leave the vehicle in place. What exactly would you expect them to do when directed to not move?
From the article:
According to Cruise, police had directed the company to keep the vehicle stationary, apparently with the pedestrian stuck beneath it. Each car contains a speaker through which Cruise staff can communicate with passengers or with law enforcement trying to give instructions.
While spokespeople for the Police Department did not have an officer’s report Tuesday morning to clarify exactly when police made the order, they said it is standard procedure not to move cars at the scene of wrecks — regardless of whether a human is operating them.
“When it comes to someone pinned beneath a vehicle, the most effective way to unpin them is to lift the vehicle,” Sgt. Kathryn Winters, a spokesperson for the department, said in an interview. Were a driver to move a vehicle with a person lying there, “you run the risk of causing more injury.”
Listen, driverless cars have an uphill battle that I frankly doubt they will win. From job displacement to things like this, they will encounter opposition at every step of the road (no pun intended).
The other thing is that if a driver were to do a thing like this, he or she would be held accountable for their actions. With driverless cars, the company could get away with it by paying a fine or giving money to the family, effectively putting a "price tag" on human life. Heck, Facebook and Google don't even seem to care anymore when some government hits them with fines. Now imagine a mainstream driverless car company. We'll be shocked at first, but when convenience starts settling into our daily lives, the shock factor will dwindle until we see these events as part of "things that happen when we hit the road".
I’m generally anti “self driving” as it’s demonstrably not safe in the general case, but this particular case seems like something that anyone could have done, and possibly impossible to have avoided (the self driving car braked immediately).
If you’re following a car around a corner, and that car hits someone and does not stop, your first indication that there’s a person would be a ground level pedestrian momentarily appearing after the car had finished running them over. This is actually an ideal case for self driving tech as their camera positioning gives them a lower vantage point, and it’s possible the victim would not ever be visible to a human driver.
Why is it always Cruise in the headlines... it didn't hit the woman, but it did run her over, pinning her to the ground, where instead of a human driver taking action right away, EMTs had to contact Cruise for them to disengage self-driving and allow the EMTs to get the woman out from under the car. Those seconds to minutes of agony with a car on top of her, for the sake of robot AI self-driving progress: disgusting... I guess Cruise might have learned something and changed their algorithm at the expense of her mangled body. Hopefully she survives.
And somewhere, two teams of software engineers are tasked with Test Case 7462628: a video classifier to detect when the AV has stopped on top of a human and pinned them beneath its wheels.
While the news media are over-reactive to accidents involving self-driving cars, if such an accident is caused by some vulnerability in the driving algorithm that is not easy to fix or improve, its impact is potentially widespread, because the same vulnerability is shared by other cars.
That's an interesting detail! Perhaps she was lying down & the Cruise smarts didn't perceive her as a pedestrian, but more like an obstacle / 'bump' in the road?
Just speculating here, but would be useful to find out.
If human drivers see a pedestrian get hit or someone fall down, we instinctively know that's a human there on the pavement (and try to avoid them at all costs, just as we do for pedestrians walking around), or we avoid human-shaped / human-sized objects just in case.
If a self-driving car would lose track of a human as soon as it stops looking like a typical pedestrian, that would be a problem.
I'd like to hear how Kyle Vogt is going to spin this one that the Cruise car is still safer overall than the humans despite continually being the ones in the spotlight as being a safety hazard that doesn't belong on the road.
Don't forget that if your way of making sense of the world is based entirely on what's in the spotlight, then you are always at the mercy of the spotlight operator.
It would have been worse if it drove off instead of stopping. Getting training data for this scenario seems difficult; it will likely need some sort of reworking for this edge case.
Goes without saying really, since we already have proper prosecution of persons. Driverless cars killing people seems like it will inevitably trade lives for fines.
I’m not convinced that it does go without saying. As a victim, I don’t think I would care whether the entity that hurt me is now in prison or short of a few dollars knowing that the probability of it happening again is the same.
Self driving cars will reduce the rate of accidents which is the important part.
> Self driving cars will reduce the rate of accidents which is the important part.
So says the people trying to get various municipalities to sanction the experiment. How do you think the first Walmart was sold to a town? Do you think they said up front that it was going to be a giant fucking rugpull?
What is the chance self-driving cars get super popular, everyone forgets how to drive over the years, then the VC money dries up, the enshittification happens, and prices get way too high? The people with nowhere else to turn, and with few human-driven cars around, end up demanding safe human-scale transit like bikes, trains, and buses?
Not sure this is the most plausible scenario, but I feel fairly confident that enshittification will happen, and now I'm curious about some other possible ways it might manifest.
Like the faster route being more expensive?
Mandatory delay at arrival with an extra fee to unlock the door immediately?
Making trip info public / sharing with advertisers / employer / spouse unless a higher fee is paid?
Dunno, but transportation is up there with health care, education, child care, housing, etc. as things that people can't really do without (unless you're still healthy and young and can ride a bike).
It feels like it should expand to consume available income with subscription fees.
Under what reading of the First Amendment is that allowed? Some historical argument based on the Alien and Sedition act, because criticizing self driving cars is a threat to national security?
Note that I was being sarcastic. The headline was extremely misleading, to the point of propaganda. I'm not actually saying it should be illegal- lying is legal.
Self-crashing and/or moving-disaster cars are a pointless and stupid innovation beyond the curiosity factor and research. AI should just be an aid, not a driver.
Horrible - if a human had been driving they could have moved the car to stop pinning the victim, but a self-driving robot just sits there with a screaming woman stuck under it.
The last thing you should do is move the car when someone is stuck under it... Jack it up and pull them out. Good lord, imagine getting pinned under the middle of the car, and the driver hops out, sees you, goes "hang on, I got this", and then drives over you again.
> if a human had been driving they could have moved the car to stop pinning the victim
FWIW, that's not necessarily better. In some cases, when suffering a crushing injury, it's best to leave the crushing weight in place until EMS personnel arrive and can transition immediately from "removing the weight" to "corresponding intervention". You can see a bit of the discussion around what goes into treating "crush" type injuries here:
Just got back from the grocery store and it was still full of fresh meat and vegetables so its safe to say there are still a few people that haven't moved to the city yet.
If a human driver had been driving, they also could have fled the scene, like the other human driver who caused the whole problem in the first place, and left the woman lying on the road in the dark, you know, just waiting for the next human driver to drive over her. Let's not forget that the woman was crossing the intersection on red, violating the laws that are meant to keep her safe, which again is something the robots tend not to do. Basically, what we have here is two humans fucking up, and yet we blame the robot for doing the best it could given the circumstances.