> Cars with Level 2 automation can perform key driving tasks like steering, acceleration and braking to keep a set distance from the car ahead, center the car in its lane and stay at a certain speed. Yet they can't handle every situation unmonitored and need the driver to pay attention to the road ahead in case they need to take over the wheel.
Designing a system like that but blaming PEBKAC whenever something goes wrong is going to kill a lot of pedestrians. If you set impossible expectations and people don't meet them, there is a problem with your expectations.
/edit
I'm not against automated cars in general, but if Level 2 means fewer fender benders at the cost of more kids getting killed running into the road because the driver is zoned out, then I think it should be skipped.
The study showed the group that saw the most bears was the experienced drivers with L2 assist. So L2 appears to make things safer, rather than less safe.
That's certainly my experience. I love L2 for night driving, it lets me spend more time scanning the ditches for animals.
I'm not convinced the relationship is causal. For example, couldn't the people who already had experience be enthusiasts who are more interested in driving generally?
The simpler explanation is that drivers familiar with the feature aren't task-saturated managing the feature.
It's like learning to drive for the first time: you're task saturated just keeping the car driving straight and at a consistent speed. After a while you learn to manage that and then your attention turns to other aspects of driving the car.
It truly doesn't make sense that regulators are not only allowing this, but allowing ridiculous overexaggerations in their monikers like Autopilot and Full Self Driving and shit.
I guess that's not true, it makes perfect sense that regulators don't do anything anymore in this crony capitalist world, but I guess it's disappointing.
The legal system needs to blame someone. If the system fails and it kills someone, whose fault is it? Currently it's most likely the driver's fault, because it's their job to ensure everything is OK and brake manually if they see something.
If something is known to be dangerous for any human driver, and is implemented anyways, then that seems like reason to have liability with the design of the system. If your brakes required solving a rubik's cube to activate, that would be poor design. This is no different. Humans cannot maintain attention on tasks that require sporadic but high-intensity attention.
If that happened to me and the legal system tried to blame me, I would bring a pile of papers and several experts on human attention research. They would testify that humans don't work that way and that the car maker was being either stupid or malicious.
I believe that the only reason Uber, Waymo, and friends have gotten away with "it's the drivers job" is because there aren't that many autonomous cars around and no one was brought to court yet.
> I believe that the only reason Uber, Waymo, and friends have gotten away with "it's the drivers job"
Waymo doesn't do L2 or L3, only L4 autonomy. In the case where the AV is at fault for an accident, it would be due to the system as there is no human driver in the vehicle.
There are plenty of cases where the manufacturer of some object was held financially liable for an accident a human could have prevented; this is how self driving will end up.
A cute upper middle class child doing something innocent will get killed by a car being operated mostly correctly with a driver who could have prevented the accident if they were more alert to the controls.
Corporations have been killing people for a long time. The early cars were death traps that would almost certainly kill you if you got in an accident. Cigarette companies kill their customers and used to tell them it was good for their health. I’m very skeptical self driving cars will be stopped by people dying. The company would be liable and there’d be a payout, same as always.
No sane person will be driving on autopilot (metaphorically or literally) through a school zone. And if your car even allows you to turn it on when the speed limit's 25mph/40kph because it has a map system that knows the speed limit on every street you drive, your car has a glaring fault that needs fixing.
But you are going to use L2 driving a highway or even just decently paved road that happens to be going through a forest.
> No sane person will be driving on autopilot (metaphorically or literally) through a school zone.
Of course they will, sane or not. Do you really think millions of drivers will stay alert when they've gone months or years without an accident?
When the car starts beeping, by the time the driver looks up from their phone and understands the situation, half of little Timmy is already 100 meters behind the vehicle.
I think you give people too much credit. Anecdotally, I can't tell you the last time I went even on a short trip (fewer than five miles) without seeing many people checking phones/scrolling. I think folks are going to be much more cavalier when they've got an "autopilot".
Is that a really fair comparison? I'd say it's location dependent. A street in a small neighborhood, yes kids running out in front of you could be a possibility.
But I don't know if humans can handle that well. It's safer to brake instead of swerve if an animal darts into the road. Of course a kid darting into the road is more serious and traumatic, but the typical driver may not be able to avoid a crash.
The idea is that human-only drivers will be paying more attention in general. It’s not like kids are jumping out from bushes in camo, they’re present in an environment that can be observed as you get closer and closer to them.
Distracted driving was already a problem on the rise, and self driving mostly seems like it’s going to enable that type of inattentiveness even more.
I always expect kids running out into the street to be in lots of danger anyways. But the big problem is that weird unusual events while driving are rare, at least in the developed world. That’s why that kid running into the street has a much better chance surviving in crazy Beijing where drivers are very aggressive but on edge all the time vs LA where traffic is much more sane, organized, but drivers get too complacent.
I felt the same driving in Peru vs driving in Sweden or Spain. While the traffic is a lot more organized in Sweden and Spain, people were a lot less attentive, so near-accidents happen a lot more than in Peru, where people communicate via honking and pay attention to the road 100% of the time.
While the traffic was a bit crazier than I'm used to, I actually felt safer on the road in Peru.
I felt this too, I wonder if this has a name. NYC is crazy aggressive, but I felt pretty safe, everyone was paying attention. In DC some people want to drive as aggressively as in NYC, but they don't pay attention, or at least most drivers don't. The result is that I felt much safer in NYC than in DC, even though it's much more congested.
If you’ve ever been to China, you might have had the unsettling experience of hearing a truck SPEED UP as you step out in front of it coming down the street.
There is also the cruel practice of “finishing off” a pedestrian injured by a car collision.
It is much, much, more dangerous to be a pedestrian in China. Be careful!
Most of the reported “finishing off” incidents happened in Taiwan, actually. However, I’m sure it happens in the mainland as well.
I’ve seen a taxi hit a girl riding a bike through a red light in Beijing. You never really forget the thunk. I’ve also seen a car’s tires all pop off on a ring road in the middle of the night while going 120kph or so (I’m guessing a bad mod job).
Still, pedestrians and other cars doing weird crap is just so common that everyone is expecting it. It is definitely dangerous to be a pedestrian even if you are following the rules, but trying to cross four lanes of traffic without a light or crosswalk doesn't help. Still, the fact that most people do it and it's considered normal means all the drivers are on edge and paying deep attention.
True. A driver in the states is unlikely to ever encounter a kid running out in front of their car, while a driver in China might encounter that situation multiple times a day. Now, more kids are going to die in China (they have higher rates of traffic fatalities), but for kids that run out into traffic, more will survive in China than in the USA where no one is really expecting that to happen (with the caveat that there are much fewer kids running out into traffic in the USA vs China).
It really is just a matter of whether some event is rare or not. In countries where the traffic is more chaotic, drivers are necessarily paying attention much more than places where traffic is much more orderly.
(Do people in China respond faster because they're on edge, or do they respond slower to the single risky event, because they're distracted from all the other noise?)
(Do children in China actually run into streets with fast moving cars more often? Perhaps they're more aware of the risks if the road is indeed more chaotic. Is the road filled with a slow constant traffic jam (something I experienced in India)?)
(Etc.)
This could be one of those common misconceptions that would turn out to be false if investigated properly (that being said, I don't know whether it's true or false).
I don't think that's the kind of thing we can have a direct study of. Nobody is going to send children out into the streets to see how they fare and we don't have numbers of how often kids run into the street because nobody is there recording that.
Wikipedia[1] shows deaths per 100k vehicles per year. It puts the US at 14 compared to China's 104. A slightly better statistic would be deaths per billion kilometers traveled but that's not reported for China.
If I had to bet, I'd bet that there is marginal benefit from having lots of experience with children running out in front of you. Maybe there's a slightly higher chance per incident of child-street-running resulting in death in the US, but I don't think there's any evidence for that opinion.
Several years ago I visited my uncle who lived in a small town in the middle of nowhere in Thailand. At some point I drove his old motorcycle around town, and he explained the traffic system this way:
In the West, the core concept in traffic is "Right of Way". Everyone on the road figures out who has the right of way, and that person shouldn't have to change their speed or direction.
In Asia [1], the core concept is "don't hit the thing in front of you". Basically, you should be aware of what everyone else on the road may try to do, and if you ever hit something in front of you for any reason, it was your fault.
([1] Obviously "Asia" is a big place, and traffic laws and customs will be different everywhere you go. This explanation matched the behavior I've observed in both Thailand and central China.)
So for example, in the West, if you're overtaking someone and need to switch lanes, it's your responsibility to check to see if there's already a faster car in that lane; because if there is, they have the right of way; if they have to slam on their brakes or swerve to avoid you, that's your fault.
In Asia, if you're overtaking someone, it's their responsibility to notice that you may need to switch lanes and give you space. If they hit you, it was their fault. So most people just swerve out without checking.
Same thing with pulling out onto the road -- in the West, if you're pulling out from a side street onto a main road, it's your responsibility to check that there's no traffic. In Asia, it's the job of the person on the main road not to hit you; so people will just pull right out onto the road without checking first.
I'm pretty sure "right of way" results in fewer accidents overall; but "don't hit the thing in front of you" definitely results in much more careful drivers in my experience. You just can't go 60 miles an hour down a city street if you have mopeds shooting out of random side streets left and right.
According to a quick search, Thailand is #5 in road deaths per capita. So I don't think they fare very well.
But your anecdote reminded me of what the instructors in my driving school used to tell us (20 years ago..I'm old..)
- whenever you sit down in the driver's seat, your left buttcheek is actually sitting in jail and your right buttcheek is lying in a morgue. Drive accordingly.
- everyone else on the road is a psychopath that's out there to kill you. Drive accordingly.
That was of course mostly meant to calm down a bunch of 18 year olds who were eager to drive, but I think the basic premise of "never forget that driving is really, really dangerous" is correct.
In Japan there is traffic priority (right of way) but all drivers also have a legal obligation to not hit anything, which means that in nearly every accident blame will be split between both parties, usually 80:20.
It depends on where you are driving in the US, but I've had kids run out in front of me at least twice in suburban areas (stopped in plenty of time). One was the TV classic beach ball appearing from behind a parked car, with a toddler not far behind.
Do self driving systems know that "ball" means "kid"?
What people forget with self driving cars is that they don't have to always go at the speed limit.
I love taking trains rather than driving, I can put all my attention on netflix or reading a book. Sometimes the train goes at 100mph sometimes it doesn't, sometimes it stops for 5 minutes, sometimes it just continues slowly at 10mph. At first you find it annoying when a train appears to stop for no reason for a few minutes before continuing, but after it happening a few times, you get used to it, stop noticing and stop caring.
Self driving cars can do the same. It's ok for a car to slow down to 5mph when there are hazards, it's ok for a car to go fast when there aren't. The car doesn't need to know that "ball" means "kid" if, every time it sees a ball or a plastic bag or a puddle, it slows down to 5mph and continues at that speed until it's sure there is no more hazard.
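A toy sketch of that "slow down on any ambiguous hazard" policy. The object classes and speeds here are made-up illustrative values, not any real ADAS stack's API:

```python
# Hypothetical sketch: cap speed at a crawl whenever any low-confidence
# hazard (ball, bag, puddle, unidentified object) is in view, and only
# resume cruise speed once the scene is clear again.
CAUTION_CLASSES = {"ball", "plastic_bag", "puddle", "unknown"}
CRAWL_MPH = 5

def target_speed(cruise_mph: float, detections: list[str]) -> float:
    """Return the speed the car should hold given current detections."""
    if any(obj in CAUTION_CLASSES for obj in detections):
        # An ambiguous object is present: the car doesn't need to know
        # that "ball" means "kid", it just needs to crawl past it.
        return min(cruise_mph, CRAWL_MPH)
    return cruise_mph

print(target_speed(40, ["car", "ball"]))  # 5  -> crawl past the hazard
print(target_speed(40, ["car"]))          # 40 -> clear road, resume speed
```

The point is that the policy is trivially conservative: it trades false positives (slowing for harmless bags) for never needing to classify the hazard correctly.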
Humans are very bad a driving. They treat the speed limit as a target rather than a ceiling. I live on a road with 2 schools, the speed limit is 30mph and yet many people drive much closer to 40mph past children walking to school. I'd be willing to bet big money that it's much more likely that that impatient driver driving at 40mph is going to hit a child than a self driving car that changes its speed according to the current situation.
This is a great point, and a good driver would do the same thing. If there's an unknown hazard, you slow down to make sure it isn't going to cause a problem. Self driving cars can have the same behaviour programmed in.
But they won't, because for acceptance they'll be programmed to drive "naturally", which means at or above the speed limit, following too closely, et cetera.
The standard complaint for an under-developed L2/L4 system is that it's too timid, it slams on the brakes for a paper blowing in the wind, et cetera.
A car that can avoid hitting a paper blowing in the wind is a car that can avoid hitting a child. Maybe we should just accept a high number of false positives.
Almost all of these happen when a kid jumps out from behind an obstacle or runs after something like a ball. The way you are taught to anticipate is to spot the children playing ~before~ they are even close to jumping in front of your car. I would not be surprised if any driver, LA or Beijing, would hit the kid if it jumped in front of the car and they were not aware kids were playing.
The difference is that the kid in Beijing is chasing the ball out into a crowded street with multiple lanes of traffic. The kid in LA is chasing the ball out into a quiet residential street...and even that is rare.
The title, while correct, is misleading. The actual results are:
Drivers using Level 2 ADAS who are experienced with L2 ADAS have more situational awareness. The ones using it for the first time have the least, while regular drivers were in the middle.
Presumably the ones used to ADAS relax more and look around. Is that good or bad? Unclear: does it make them more likely to note and deal with unexpected risks or not? The article guesses the answer though the actual paper may not have.
There's also an interesting secondary observation.
"Drivers who missed every instance of the bear spent more time looking at various parts of the car's dashboard than any other group."
I hypothesize that, while some of that was surely trying to navigate a radio or air-conditioning setup they were unfamiliar with, much was likely spent monitoring the speed of the vehicle.
A simple HUD reflecting the speed onto the windshield might help as a safety measure. Or possibly changing laws and enforcement so that police focused on non-revenue-generating activities, mostly curbing any behavior that harms the safety of a steady and orderly traffic flow.
Right, the general conclusion of this study is positive for driver assistance, albeit with a small sample size, but the headline crowbars a negative interpretation into it.
And this is on the front page of HN because that headline reinforces the common worldview here: that partial self-driving is more dangerous than none at all, and full autonomy is further away than manufacturers like to suggest.
If the result that experience with L2 seems to improve situational awareness is robust, then that raises the question of why that is so. Has experience taught them that they need to pay attention, in a way that is more effective than merely being told they need to? Are drivers new to L2 overconfident in its abilities, or perhaps merely unsettled by the unfamiliarity of the situation?
Note that the article also mentions that other studies have found increasing complacency with experience.
> Does it make them more likely to note and deal with unexpected risks or not? The article guesses the answer though the actual paper may not have.
Given that noticing a problem is a prerequisite of dealing with it, this is not an unreasonable supposition, though one should also wonder if their reaction times are slower. Note that even if that is so, noticing more things, or noticing them sooner, could more than compensate for this.
Similar semantics with aircraft and autopilot systems. It reduces operator workloads but you train or disable it and hand fly the aircraft for fun and to refresh your experience periodically.
Someone who uses this technology because they dislike driving will always pose a danger. Those who enjoy driving will be more diligent.
I think people who seek enjoyment in driving will actually pose a greater danger. If you need to get your enjoyment out of the driving itself, you will start to exhibit more dangerous behaviour: accelerating faster, driving at higher speeds, overtaking more. Nothing is more boring for me as a driver than staying between two white dotted lines on a 4-lane highway, going 20 km/h under the speed limit, trailing a truck or other traffic, so I either start overtaking frequently or I get bored and distracted because of too little stimulus. But with ADAS I don't care; I just relax and let it do most of the work while I take in the scenery or an audiobook.
One common thread among Tesla owners (TM3 owner here) is that while the car will do the driving, a good number of owners are actively looking for edge cases. Seems everyone wants to be a tester. Plus I figure the majority of Tesla owners are a bit more jaded than many expect. Many of us have had our cars for nearly three years and all we have really seen is party tricks.
In my experience the software developer in me looks at every odd action or lack thereof in the frame of, why this and not that and how do I think it should have performed.
Now there is a big claim out there we will all be able to opt into the new FSD beta and I truly want to give it a whirl.
It's interesting reading articles about Level 2 and how it can adversely affect driver attention. I was concerned when I started driving a Tesla for the same reasons. Would I grow complacent? Stop paying attention?
The opposite happened. I'm more focused on the road since I can relax and not multi-task. On long road trips, I'm not feeling that ache in the knee or the tension in the shoulders/back from constantly keeping the car aligned in the lane. In fact, I wish Tesla would just introduce a touch interface into the wheel so it knew your hands were on it, but not require constantly resisting the wheel to trigger the sensor.
The conclusion I've come to: Level 2 can make good drivers safer and make bad drivers more dangerous. I don't look at my phone when I'm driving. I don't text at red lights. I don't constantly fiddle with the screen. If you do these things, you're a dangerous driver. If Level 2 increases this because you feel a false sense of safety, it'll make you more dangerous.
People just need to take driving a 2-ton heap of metal at 70mph seriously and we'd all be much safer.
> In fact, I wish Tesla would just introduce a touch interface into the wheel so it knew your hands were on it, but not require constantly resisting the wheel to trigger the sensor.
We recently rolled out capacitive steering wheels in several Mercedes-Benz models for that reason. It's pleasant.
> I wish Tesla would just introduce a touch interface into the wheel so it knew your hands were on it, but not require constantly resisting the wheel to trigger the sensor.
Morons will use devices that clamp to the steering wheel to defeat this sensor. Actively turning the wheel is harder to spoof.
edit: looks like we went through this process already.
Why is this a problem? My car is at a complete stop and is not legally allowed to proceed. What's the threat? Doing things while moving (even creeping along in stop/go traffic), sure, I'll agree, but stopped at a light?
If your head is down in the car, you lose track of what's going on outside and have to reacquire situational awareness from scratch when you 'wake up'. That takes longer at best, and you may miss something important at worst.
Small things like bikes and pedestrians are frequently occluded and you may miss the glimpses of them you needed to predict their future trajectory.
We could argue about the absolute importance of this effect, but it's definitely real.
Especially when you don’t notice the light has changed, someone honks at you, and now you have half a second to reacquaint yourself with what’s going on in front of you before stepping on the gas. I see many drivers looking down (presumably at their phone) at a red light who get honked at and almost immediately take off without any safety checks. If, say, a child had run out in front of the car, that child would get run over.
The problem is not texting itself, it's not paying attention to your surroundings, while moving or standing still.
If you're sitting at a red light fiddling with the stereo, the light turns green but you missed it, so the person behind you honks, chances are you'll start moving forward without actually being fully aware of the scene in front of you. And if there is someone in front of you, you could hit them. So why take the chance? Always pay attention to your surroundings.
Man, this is why self-driving cars can’t come soon enough. Everyone else’s replies to this already sort of answer your question, but for me this hints at the bigger issue: people don’t take driving seriously. My counter is: why do you need to be looking at your phone in a car? What text is so important that you can’t pull over and read it? Why give up the situational awareness that is absolutely required to safely operate a machine that can kill people?
Life is about trade offs and constant cost/benefit analysis. How is reading/responding to a text message slightly sooner worth the decrease in situational awareness and possible accident involving a 2 ton vehicle?
I think we have normalized the dangers of driving a vehicle to the point where we sort of just shrug at the monumental number of vehicular deaths. I’m confident that when my kid is an adult, he’ll look at us with astonishment when we say we used to drive vehicles around, poorly trained, staring at phones, half-drunk, kind of paying attention sometimes, a bit sleepy and it was not only legal but _normal_. It’ll be like us hearing about how our great-grandparents had a 2 year old work dangerous farm equipment alone. You think “How the hell did anyone survive?”
> It’ll be like us hearing about how our great-grandparents had a 2 year old work dangerous farm equipment alone. You think “How the hell did anyone survive?”
This reminds me of a old news article from Sweden I came across some weeks ago. The headline was something like "Driver crashed into house, judge released him without any fines because he was drunk so couldn't properly control the car"
In the context of the modern world, it's hilarious. But one could wonder how that news headline was reacted to back in the day.
> In 1928, a materially serious car accident occurred on the national road between Örkelljunga and Åsljunga.
> It was an Örkelljunga resident who drove over a merchant from Skånes Fagerhult.
> At the subsequent trial in Klippan, it was said that the Örkelljunga resident, who was the cause of the accident, received a mitigating sentence because he had drunk brandy and therefore had difficulty controlling the car.
That was the conclusion in the article too. The editor chose to highlight the less likely (in the real world) situation where the user is new to their L2 equipment.
Yep, I was just highlighting the fact that most articles about Level 2 or, in general, < Level 4, are written with a concern for situational awareness. In my experience, it was the opposite: it improved it.
So the actual numbers are: 5/10 drivers on L2 cars with no L2 experience missed seeing the bear all three times, vs 2/10 drivers with no L2 experience driving a non-automated car. That’s hardly statistically significant.
Furthermore, 6/10 drivers with L2 experience saw the bear all three times, as compared to 2/10 drivers using a non-automated car. So a stronger finding (again, ignoring the laughably small sample size) is that experienced L2 drivers driving automated cars are better than inexperienced L2 drivers driving normal cars. In other words, we should put everyone behind the wheel of an L2 car but require additional training.
This is what happens when you design experiments with 3 buckets, 3 tests, and tiny sample sizes. You can draw any conclusion you want.
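For anyone who wants to check the "hardly statistically significant" claim, a two-sided Fisher's exact test on the 5/10 vs 2/10 table from the comment above can be computed with nothing but the stdlib:

```python
from math import comb

# 2x2 table: missed the bear all three times vs not,
#   [[5, 5],   # L2 car, no prior L2 experience: 5 of 10 missed it
#    [2, 8]]   # non-automated car:              2 of 10 missed it
row1, row2 = 10, 10   # drivers per group
col1 = 5 + 2          # total drivers who missed the bear
N = row1 + row2

def hypergeom_p(k):
    # probability of exactly k "missed" drivers in group 1, margins fixed
    return comb(row1, k) * comb(row2, col1 - k) / comb(N, col1)

p_observed = hypergeom_p(5)
# two-sided p-value: sum over all tables at least as extreme as observed
p_value = sum(hypergeom_p(k)
              for k in range(max(0, col1 - row2), min(col1, row1) + 1)
              if hypergeom_p(k) <= p_observed + 1e-12)

print(f"p = {p_value:.3f}")  # p = 0.350, far above the usual 0.05 threshold
```

So the 5-vs-2 difference is indeed nowhere near significant at these sample sizes; you'd expect a gap that large by chance about a third of the time.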
Not to mention there is no non-L2 driving control group to speak of... I have to wonder if there actually WAS a control group and it didn't reinforce the goal narrative so it was omitted from the results.
I have a 2019 Subaru Forester with Eyesight (lane keeping and adaptive cruise) and Driver Monitoring System (beeps whenever you look away from the road and automatically adjusts seat and mirror settings based on facial recognition).
Compared to other systems it's far from perfect (it bounces between lane lines, it often loses track of the lanes even in good lighting, and it doesn't always see the car in front of you, especially if it's stopped or out of lane), but every time I drive it I find myself feeling safer to look around at my surroundings a lot more, especially my rear. I've saved my rear end from two near misses and countless reckless and speeding drivers because I look at my mirrors so much more frequently.
At the same time I find myself looking out of the side windows at the scenery, until the driver monitoring system beeps at me to keep my eyes on the road. It has performed automatic emergency braking a few times because I wasn't keeping an eye on the car stopping in front of me.
The one thing that really works for me is the speed/cruise control. I find that setting it takes my mind off watching the speed limit and gives me more time to look around for people who are crossing the street etc. When I recently bought a new car I eliminated some cars from consideration because they refuse to engage their speed controls at city speeds. I don't really understand why Toyota, as an example, refuses to engage the speed control below 45 km/h. My old VW would allow setting the speed control at any speed, even at a walking speed in 1st gear on a manual transmission.
Huh. I have a 2021 Outback with Eyesight. I don't think it does any facial monitoring. The lane-keeping is more aggressive than the 2017 model I had previously, but beeps at me for not being hands-on enough. As far as I can tell, this definition of "hands-on" means slightly fighting with the steering wheel at regular intervals.
But I'm not sure what that means: is it good or bad to see the bear? You are driving, not looking at pretty things. You want to filter out irrelevant detail. It's a 1-tonne vehicle; its color shouldn't matter.
Agreed; I don't like to jump in to play devil's advocate, but I usually don't pay much attention to things that don't pose much of a risk to me while I'm driving.
The only reason I would think about paying attention to the bear is that I'm sometimes aware of the risks from others' carelessness: "what if one of the ropes holding it snaps, and it tumbles onto my lane?"
But, in this case, if it's only people that are new to an autonomous system (like, new owners, or rental car users), then I can imagine that there may be some learning curve involved that distracts you from being really focused on the driving task, because you're learning to use your 'tool'.
I don't know of any studies, but I always thought overall situational awareness was a big part of defensive/safe driving. Always being aware of other vehicles around you and other potential hazards. Not getting tunnel vision.
Sure, but this study does not convince me that the drivers lacked situational awareness.
They may have been paying attention to the pertinent details like whether the car was going to ingress their lane, overtake speed, etc. rather than the car's decorations.
Drivers may have even seen the bear but simply not committed it to memory since they were focused on driving.
In contrast, it was the drivers who were familiar with the car that most often remembered the bear. Perhaps this was because they felt more comfortable, were paying less attention to defensive driving, and had attention to spare, allowing them to remember passing car decor.
Ugh, it doesn't take automated assists and cruise control to distract drivers. They're distracting themselves enough as it is in conventional cars. Looks like from the article this just makes them even more distracted. I wonder if the same increase in distraction was seen when automatic transmissions and power steering were introduced. It seems with every new assistive technology, drivers become farther and farther removed from the fact that they're driving.
When the pandemic is cleared, get someone to drive you down any US interstate highway as a passenger during rush hour and count the number of drivers playing with their phones and texting. I'm amazed the highways aren't even deadlier than they are, given how little attention people pay.
The article suggests that it is only new users who had reduced situational awareness. Experienced users most often noticed the bear. So it doesn’t seem as simple as “more technology makes them farther removed”.
From the article:
“ ‘Our data suggest that Level 2 driving automation has the potential to improve a driver’s [situational awareness] once he or she is familiar with the technology, although it does not guarantee it,’ the report reads.
Per the IIHS study's results, nearly all of the experienced users of Level 2 systems were able to notice the bear during their drive, and more drivers in this group were able to correctly note the number of times the bear drove past them than anyone else. Meanwhile, over twice as many of the drivers who weren't experienced with Level 2 automation but used it for the test didn't remember the bear at all compared to either of the other groups in the test.”
One might conclude that we need to reduce this “learning period” with formal driver training for these systems, perhaps the way we certify people for driver's licenses or something.
>Looks like from the article this just makes them even more distracted.
No, that's not at all what the article says! It says the opposite:
>"Our data suggest that Level 2 driving automation has the potential to improve a driver’s [situational awareness] once he or she is familiar with the technology, although it does not guarantee it," the report reads.
Beyond that, even if nobody noticed the teddy bear, there were no controls who were driving with no assistance features, so you wouldn't be able to make that conclusion anyways.
This is anecdotal on my part, but I'd like to add that I would support this.
I've had an electric car with Level 2 assistance since last year, and although I haven't driven as much as expected due to Covid-19 and lockdowns, I can feel how my driving has changed.
At first I was very "afraid" of the systems doing something stupid and possibly killing me in the process. I was very focused on monitoring the slight steering corrections it makes when getting near white lines (Germany here) and on how it would brake when following another car.
As that fear passed, I became much more conscious of my surroundings, paying more attention to what other drivers might do (especially on the Autobahn) and how I could improve my "driving style" to create a better flow of traffic.
Can't wait to experience higher levels of assistance systems. I'd LOVE to get into the car, position myself behind a truck doing 90 km/h in the right lane, and let the car do its thing while I'm not watching what it does.
My wife's car simply alerts her to nearby cars. That alone is enough to make her driving more dangerous. Every beep, every flashing light and she swerves (unnecessarily) to avoid imaginary conflict.
E.g., a car coming up in the next lane to the left will trigger an alert, resulting in an involuntary swerve to the right. This is not improving her safety at all.
You can say 'just learn to deal with it', but it's been a year and no change.
This seems somewhat "harsh", why does the system work like that?
I got a Hyundai and it has a lane assistant which keeps you in your lane, but it doesn't freak out when somebody passes you.
Braking, when you run up to a slower vehicle in front of you, is sometimes a bit stronger (depending on the speed difference), but it's not alarming either.
I know of blind spot assistants which light up in your mirrors if there is a car in your blind spot, but those don't beep either.
What kind of assistance system does your wife's car have? Just to avoid that... :-D
I didn't make myself clear. My wife freaks out, not the car. She responds badly to every alert. Mostly because, I think, she's quit paying any attention and the alert startles her into checking out her mirrors etc.
Aligns perfectly with my experience. As soon as my wife's car started alerting her to cars around her, she quit looking. Instead, when an alert sounds, she jerks the wheel at the sound and swerves involuntarily to avoid the imaginary conflict.
It's harrowing to be a passenger. I drive myself everywhere now.
My wife's car just has "Parking Assistant" which beeps if you're within 2 feet of an obstacle in any quadrant of the vehicle. Basically it scares the shit out of us when we're trying to park. So I can understand the harrowingness of that system.
I think a better design than an audible alarm is to show the driver using quadrants or a diagram when an object is within the "close," "very close," or "near-collision" zone around a vehicle and beep once but only when the object passes into another zone to alert the driver to a change in state. The tones should change for each transition so you can learn the tone for "object moved from close to very close in right driver's-side zone" and distinguish that from "object moved from very close to close in right driver's-side zone."
Having to constantly pay attention to an alarm tone is extremely nerve-wracking. In medical devices this will often result in "alarm fatigue" where you spend so much time dealing with false alarms that you miss a real alarm.
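The transition-based scheme described above can be sketched as a small state machine: track the last known zone per quadrant and emit a distinct tone only when an object crosses a zone boundary, staying silent otherwise. This is a minimal illustrative sketch; the zone names, thresholds, and tone labels are all hypothetical, not from any real vehicle system.

```python
# Hypothetical sketch of a transition-based proximity alert:
# beep once per zone change, with a distinct tone per
# (quadrant, transition) pair, instead of a continuous alarm.

ZONES = ["clear", "close", "very_close", "near_collision"]

def zone_for(distance_m):
    """Map a measured distance (meters) to a proximity zone.
    Thresholds are purely illustrative."""
    if distance_m < 0.5:
        return "near_collision"
    if distance_m < 1.0:
        return "very_close"
    if distance_m < 2.0:
        return "close"
    return "clear"

class TransitionAlerter:
    def __init__(self):
        self.last_zone = {}  # quadrant -> last known zone

    def update(self, quadrant, distance_m):
        """Return a tone label on a zone change, else None (silence)."""
        zone = zone_for(distance_m)
        prev = self.last_zone.get(quadrant, "clear")
        self.last_zone[quadrant] = zone
        if zone == prev:
            return None  # no state change: stay silent
        direction = "in" if ZONES.index(zone) > ZONES.index(prev) else "out"
        return f"{quadrant}:{prev}->{zone}:{direction}"

alerter = TransitionAlerter()
print(alerter.update("front_right", 1.5))  # entered "close": one tone
print(alerter.update("front_right", 1.4))  # still "close": silence
print(alerter.update("front_right", 0.8))  # entered "very_close": new tone
```

Because each tone fires exactly once per boundary crossing, the driver gets a change-of-state cue rather than a sustained alarm, which is the property that mitigates alarm fatigue.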
> So, they attached something to a passing vehicle that would be unmissable to anyone who doesn't live inside a permanent art car parade: one gigantic stuffed animal in a bright silly outfit.
I think I'd find this study more compelling if we didn't have evidence that, in general, human beings filter out stimuli that are harmless but extremely discordant with the context.
The headline for the same study on a different site:
"Level 2 systems like Tesla Autopilot can improve drivers’ attentiveness: IIHS study"
Both headlines are accurate, both are misleading. Great example of showing you can't trust headlines. I don't even think it's possible to get an accurate summary into the headline.
I fear, as I get older, that younger people aren't going to know how to drive anymore. People will have shorter attention spans and won't pay careful attention, thereby increasing the risk to everyone, only to blame the car when they cause an accident.
No, you're not. I'm personally unconvinced this test is measuring anything more relevant than who in the sample group notices interesting-but-irrelevant details.
And then there are those of us who already have low awareness because we're ADD and have been in multiple accidents and Level 2 automation is making us far safer on the road.
The "first-time-L2-on" results are terrifying but unsurprising, it's exactly why I am very much against half-baked driver assists. Cars are not planes.
Also call me an arrogant asshole, but I firmly believe that any driver who drove normally without the assist and missed the bear (even once) has no place operating a vehicle at highway speeds.
Yeah, but planes aren't generally any better at this either. In almost everything except jet aircraft and especially jet airliners, autopilot can be as crude as "maintain this heading and altitude"; i.e. you still need to manage your speed (the engine throttle lever) yourself, including in response to changes of wind direction or intensity.
Then we get to airliners, which have flight management computers. You can give your flight plan to that, and they will even calculate optimal speeds along the way, which the autothrottle can maintain for you. This allows you to, for example, know when the optimum descent point from cruise is; too early and you need more fuel to maintain altitude when you get down to where you need to be, too late and you need to turn more to bleed off more altitude.
The more advanced flight management computers are also loaded with a complete navigation database that includes e.g. speed and altitude restrictions at each waypoint, so they know, for example, that you shouldn't descend below 12,000 feet or be doing more than 260 KIAS at <waypoint x>. This is basically what's on your charts, but it's still up to you to verify that you're complying with the chart, even if you let the flight computer handle it.
Then we get to the autopilot itself. The autopilot on a modern airliner doesn't mean the plane can "fly itself". They still need to be told what to do, and told correctly. If you take off from Paris and head south-east and tell it to maintain 8,000 feet, it /will/ fly you straight into the side of the Alps. The Ground Proximity Warning System will tell you that you're in imminent danger of smashing into a mountain, but the autopilot will not react to that. If you tell the autopilot to descend into the ground it will. If you tell the autothrottle to fly too fast you will damage your aircraft. If you're doing 280 KIAS in a 777 and you throw out flaps, the autothrottle will not slow the aircraft down, and you will damage them. If the autopilot is about to fly straight into another aircraft, the Traffic Collision Avoidance System will tell you that, but the autopilot will not react to that either.
Note that some of this doesn't apply to Airbus aircraft, as Airbus thinks they know better. Whether that's a good thing is up for debate; it did result in, for example, Sullenberger not being able to make an optimal ditching approach into the Hudson, because the Airbus thought his nose-up input would stall the aircraft (which it would have eventually, but he was only a few feet above the river anyway, so that's inconsequential). Then there was Air France 447, where the Airbus flight management computer would normally prevent the pilots from making inputs that would cause a stall, but because their speed sensors were not working, the computer didn't do that. The first officer held that plane in a constant stall for their entire 38,000-foot descent into the ocean, probably because they were relying on the system to not let them do that. The ground proximity stuff also doesn't apply to some fighter jets, which won't let a pilot aim the aircraft at the ground (because it could be that the pilot has passed out during a high-G maneuver). Still, the vast majority of it applies.
The point is, these systems require the same kind of checks, instruction, and constant supervision that Level 2 autonomous driving systems do.
> Yeah, but planes aren't generally any better at this either.
> The point is, these systems require the same kind of checks, instruction, and constant supervision that Level 2 autonomous driving systems do.
Well, that was basically my point. The solutions and more importantly the mindset (the machine will just fly/drive itself) can't be easily transferred from aircraft to road vehicles and just because we're doing one thing (half-assedly) in planes doesn't mean we can (or even should) attempt to do a similar thing in cars.