
>Handing over driving responsibility completely requires extremely particular circumstances. Right now, Drive Pilot can only engage at speeds under 40 mph (60 km/h in Germany) on limited-access divided highways with no stoplights, roundabouts, or other traffic control systems, and no construction zones...The system will only operate during daytime, in reasonably clear weather, without overhead obstructions. Inclement weather, construction zones, tunnels, and emergency vehicles will all trigger a handover warning.

Those are some big caveats that mean that you won't be able to use this in most situations. It is basically only good for stop and go highway traffic, which is a situation that other driver assist features already handle pretty well.
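For reference, the engagement conditions in that quote boil down to a checklist along these lines (a toy Python sketch; the parameter names are mine, and the real Drive Pilot checks are certainly more involved than this):

    # Toy predicate for the engagement conditions listed in the quote above.
    # Condition names are invented for illustration; the actual checks
    # are not public in this form.

    def can_engage(speed_kmh, road_type, daytime, clear_weather,
                   construction_zone, overhead_obstruction):
        return (speed_kmh < 60
                and road_type == "limited-access divided highway"
                and daytime
                and clear_weather
                and not construction_zone
                and not overhead_obstruction)

    print(can_engage(45, "limited-access divided highway", True, True, False, False))  # True
    print(can_engage(75, "limited-access divided highway", True, True, False, False))  # False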




Yes but it's (according to the article) the only system that takes on any legal responsibility and guarantees a fairly long takeover window within which Mercedes will still be at fault for an accident.

No other company goes that far; they make absolutely no promises. Sometimes their marketing wink-wink-nudge-nudges you into thinking they do the same, but in reality they don't.

If this is successful, you will soon see customers and regulators requiring the same from all competitors.


> Yes but it's (according to the article) the only system that takes on any legal responsibility and guarantees a fairly long takeover window within which Mercedes will still be at fault for an accident.

I don't think it's that big of a deal but it's clearly well done from a PR/marketing standpoint.

Insurance/warranty is just an expense from a company's POV. You control it via a combination of:

a) building a product that works well

b) limiting use of that product

c) putting a high enough premium on it

__

a) is what all customers want. b) is what Mercedes is heavily leaning into right now, because looking at the restrictions it's pretty clear that they don't think they have accomplished a) to any satisfying degree.

I am sure they will set c) to wherever their insurance math says it has to be for this to remain financially viable.


> I don't think it's that big of a deal but it's clearly well done from a PR/marketing standpoint.

This is a really big deal. If you are required to take over instantly, you need to pay constant attention to the current road situation, at which point the autopilot is really just a fancy cruise control. People stop paying attention anyway, of course, but that's actually a massive risk.

A longer takeover window actually allows you to do something useful, such as read or look at your phone, without taking this risk, since you will have time to adjust to the situation if necessary.


There is never going to be a takeover window long enough to let the driver read a book at their leisure. That would require tens of seconds for context switching, during which the traffic will be changing behaviour.

How will the car detect a construction zone it can’t see yet with enough time to hand over to an inattentive driver?

I look forward to seeing this system in operation. I have significant doubts about the feasibility of its operational claims.


To keep operation within the limits of that legal responsibility, there are just two options:

a) The system would have to be allowed to disengage automatically when conditions change unfavourably, in which case you would still have to be alert all of the time for when that happens.

b) It would not be allowed to do that automatically, and you would be liable from the moment the autopilot drives into an area that is exempt from its legal responsibility, as laid out by the insurance coverage limitations.

For example, take the exemption for "construction sites": either the car disengages and says "from here on out it's your job, not ours anymore", or it does not, and then in the case of an accident you are not covered by their limited legal responsibility. What the autopilot definitely cannot do is make the construction site disappear or guarantee that the car will never encounter one after having been engaged.


You missed:

c) The system needs to detect worsening conditions early enough to either prompt you to take over with time to spare or fail gracefully.

That's the big thing that Mercedes guarantees here: You'll have enough time to take over even if you're doing something else; if the system does fail to give you a warning in time and you crash, Mercedes takes the responsibility. In all other systems, once the autopilot prompts you to take over, you are responsible. With this system, once the autopilot prompts you to take over, Mercedes is still responsible for the next ten seconds, which should be more than enough time to take over in an emergency.
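To make the distinction concrete, here is a tiny Python sketch of the liability rule as the article describes it. The structure and names are mine, purely for illustration; only the 10-second figure comes from the article:

    # Toy model of who is liable while a driver-assist system is engaged.
    # The 10 s window is the figure from the article; everything else is
    # an invented illustration, not Mercedes' actual logic.

    def who_is_liable(level, seconds_since_takeover_prompt=None):
        if level == 2:
            # Typical Level 2 assistants: the driver is responsible the
            # whole time the system is engaged.
            return "driver"
        if level == 3:
            if seconds_since_takeover_prompt is None:
                # No takeover prompt issued yet: the manufacturer is liable.
                return "manufacturer"
            # After a prompt, the manufacturer stays liable for the
            # guaranteed takeover window, then liability passes on.
            return "manufacturer" if seconds_since_takeover_prompt <= 10 else "driver"

    print(who_is_liable(3))        # manufacturer
    print(who_is_liable(3, 7))     # manufacturer (still inside the window)
    print(who_is_liable(3, 12))    # driver
    print(who_is_liable(2, 0))     # driver, always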


Oh well, we seem to have different opinions about whether a 10-seconds-to-react window in a quickly moving car qualifies as having to be alert all the time. Fair enough.


Absolutely.

Regulators also love b), and now that there is precedent we should see the next step from some of the a) companies.

If their product is better, they should easily be able to match or exceed Mercedes (and I think they will, either voluntarily or by law).

To me this is a really exciting step along the way to autonomous vehicles.


Fair enough. I definitely agree that autonomous driving has to be insured by the company providing the service, and at least insofar as that goes it's a big step in the right direction.

However, claiming they have beaten Tesla seems like a bit of a stretch given the circumstances.


I might have written that poorly.

Tesla probably has the lead on average but can't (or won't) guarantee that their system is safe in a specific set of best case conditions.

It's a different approach if you ask me.


If Mercedes applied the same safeguards as Tesla does, then Tesla wouldn't look so good anymore.

I am still waiting for the promised fully autonomous cross-country trip from east to west. I think it was promised for 2019.

Meanwhile, Mercedes has done a fully autonomous trip including small-town traffic and several roundabouts.

Mercedes just applies higher standards to what they deploy to the average Joe.


Musk may have promised that in 2019, but he famously promised it in 2017.


> I think it was promised for 2019.

https://youtu.be/o7oZ-AQszEI / https://old.reddit.com/r/teslamotors/comments/s7vea9/fsd_bei...

Although his earlier claims from 2014/2015 were met, as Autopilot works reasonably well on roads and highways. The claim about an LA->NY trip was from 2016.[1]

1: https://www.businessinsider.com/elon-musk-autonomous-tesla-d...


How does Tesla have the lead? Their autonomous technology seems pretty far behind Waymo and Cruise.


You just don't know about Mercedes' autonomous technology, because they don't talk about it. They don't produce a blog post about every small step they take, like Waymo and the others do.

The automotive industry is more like Apple: they don't talk or show off until it's done. Never over-promise and under-deliver.


You must be joking. This is the industry that has annual travelling car shows where every manufacturer displays "concepts" that never actually go into production.


In particular, the automotive industry has been sued often enough to be careful about what it says. I fully expect Tesla to be found at fault in some situation where they officially say the driver is in charge, but the courts decide their marketing messages misled the driver.


>Tesla probably has the lead on average but can't (or won't) guarantee that their system is safe in a specific set of best case conditions.

Do people not see how much of a nightmare it would be if liability constantly switched back and forth depending on the circumstances the self-driving system is being used under? The only practical approach is to either have the driver always be responsible when the system is on or the manufacturer always be responsible when the system is on. The choice to use the latter approach means the system has to be limited to very narrow circumstances. That doesn't mean the system is necessarily safer than the competitors in those circumstances; it simply means that the companies are approaching the same problem from different angles.


They very much have beaten Tesla, or does Tesla have a similar approval and infrastructure in place? Will Tesla take liability from you when the car crashes in Autopilot? They won’t, while Mercedes actually will and has that legally backed by the German government.


As the top comment said, it's only in extremely tight circumstances. Tesla Autopilot / FSD beta (the one that you can use if you have a >98 safety score) will work anywhere your car can see lane lines, and it'll try its best to work in rain/snow or at 1 am with limited visibility.


It is as tiny a difference as between software that is 95% ready and software that is 100% ready, or a software development contract with or without the pesky word 'guarantee' in it. At some point a few degrees make the difference between water and vapor.


Tesla will never take legal responsibility for their autopilot with current cars.

There's a reason for that and it's because of a lack of confidence in the technology.


They will try to avoid taking responsibility anyway. The courts will decide if they do or not.


Insurance doesn't cover brand image, which is very important for Mercedes.


> I don't think it's that big of a deal but it's clearly well done from a PR/marketing standpoint.

If this is true, why don't all the other companies do it?


If I can read a book, then how will I know when we come to a construction zone? They can't take such conditional responsibility; it will lead to bad things.


The car will probably detect construction zones on its own. Teslas already do that (and have since at least 2019).


The fundamental problem is that in order for the company to accept legal responsibility, the self-driving system must follow all traffic laws, and there is no way consumers will accept a car that won't go faster than the speed limit when there is no traffic.


These two are interesting for comparing Daimler's approach, the systems used, and the engineering philosophy. From the first presentation, slide 9 is interesting as a review of the different levels of automation.

From slide 23: if you don't take over when requested, then after an automated stop the car will unlock the doors and place a call to your emergency response center.

"An Automated Driving System for the Highway - Daimler" [PDF]

https://group.mercedes-benz.com/documents/innovation/other/2...

"A Joint Approach to Automated Driving Systems - Daimler" [PDF]

https://group.mercedes-benz.com/documents/innovation/other/v...

Edit:

"The technical requirements"

"When the DRIVE PILOT is activated, the system continuously evaluates the route, traffic signs, and occurring traffic incidents. As a layperson it’s hard to imagine how sophisticated the hardware and software of the S-Class is in order to be ready for Level 3. Even the “normal” latest-generation Driving Assistance Package has the following:

• A stereo multi-purpose camera behind the windshield.

• A long-range radar in the radiator grille.

• Four multi-mode radars (one each on the right-hand and left-hand sides at the front and rear bumpers).

The optional parking package additionally includes:

• A 360°-Camera consisting of four cameras in the right-hand and left-hand exterior mirrors as well as in the radiator grille and at the trunk.

• Twelve ultrasonic sensors (six each at the front and rear bumpers).

"For the DRIVE PILOT, many additional components are needed besides the sensors of the Driving Assistance Package. The long-range radar in the radiator grille is combined with a LiDAR (light detection and ranging) system. Whereas radar uses radio waves, LiDAR employs pulses of infrared light in order to optically determine an object’s speed and distance and to create a highly precise map of the vehicle’s surroundings. This combines the strengths of both technologies: LiDAR sensors operate with higher precision, while radar is advantageous in bad weather, for example."

"The rear window is equipped with a rear multi-purpose camera that scans the area behind the vehicle. In combination with additional microphones, this device can, among other things, detect the flashing lights and special signals of emergency vehicles. The cameras in the driver’s display and MBUX Interior Assist are always directed at the driver so that they can determine if he or she falls asleep, turns around for too long, leaves the driver’s seat, or is unable to retake control of driving for other reasons."

https://group.mercedes-benz.com/magazine/technology-innovati...
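As a toy illustration of that radar/LiDAR point (the weighting and numbers are entirely invented; this is not how Mercedes actually fuses its sensors):

    # Toy sensor fusion: weight each distance estimate by how much we
    # trust that sensor in the current conditions. LiDAR is treated as
    # more precise in clear weather, radar as more robust in bad weather.
    # All figures are invented for illustration.

    def fuse_distance(radar_m, lidar_m, weather):
        lidar_weight = 0.8 if weather == "clear" else 0.3
        radar_weight = 1.0 - lidar_weight
        return lidar_weight * lidar_m + radar_weight * radar_m

    print(fuse_distance(radar_m=52.0, lidar_m=50.0, weather="clear"))  # ~50.4
    print(fuse_distance(radar_m=52.0, lidar_m=50.0, weather="rain"))   # ~51.4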


> These two are interesting for comparing Daimler's approach

BTW, the company was recently renamed and is now officially called Mercedes-Benz.

The Daimler brand is now with Daimler Trucks, following a split.


It's a marketing parlor trick. They take legal responsibility for something that cannot happen and immediately drop to human control when things turn south, like all so-called "Level 3" systems.

I'm also sure they will not take any responsibility if someone rear-ends you when the car stops in confusion on the highway.


Did you read the article? They guarantee a 10-second manual takeover window and remain responsible during those 10 seconds. That's not the same as instantly dropping the ball, which is what Tesla, Volvo, GM etc. all do.


Their system (conforming to German law, and as mentioned in the article) gives you a 10-second takeover window within which they are still liable.

Yes, they do. That's the point of their announcement, and the reason why they only allow it under very specific and favourable conditions.


This is remarkable, twice over.

First, a software vendor accepting responsibility for the software's actions? Wow.

Second, they're confident of being able to predict accidents ten seconds in advance? That's up to 160m away and I think that's great, even if they limit the circumstances sharply and allow many false positives.


I don't think they're predicting exactly; I think they've decided that's what is reasonable to ask of a consumer, and they're building their Ts & Cs to fit. They'll then make the technology fit as best it can, but if they can't, it's their fault.


This is not "a software vendor accepting responsibility for the software's actions" this is a car, placed between human, unpredictable, drivers operating 2 tonne machines. Far from Photoshop working on your PC or an embedded system for your fridge.


So it should be much easier to get a warrant of fitness for your fridge, if not the photo-editor, right?


No, they are confident that they can detect, at least 10 seconds in advance, that one of the many conditions required to operate the autopilot is about to be violated: an upcoming tunnel, construction, etc.


Right. And any accident falls into at least one of these three classes: something they won't need to pay for (even via insurance premiums), something the software can avoid, and lastly, most significantly, something for which the software can provide ten seconds' warning.

I don't find it remarkable that the software has many reasons to disengage. I find it remarkable that potential accidents >10s into the future are on the list of reasons, even in limited circumstances.

The first software I bought came with a warranty that covered nothing: It explicitly said that the software wasn't guaranteed to perform "any particular function". As I read that text, that vendor had the right to sell me three empty floppy disks. You've seen similar texts, right?

And here we have Mercedes guaranteeing considerable foresight in limited circumstances. No matter how limited the circumstances are, that's a giant leap.


I'd expand that to four classes of accidents. Four: something they /will/ need to pay for (both in money, as they're committing to, and in PR). Inevitably some accidents won't fall into your three "preferred" categories -- making a system like this successful is about managing the size of bucket four, not eliminating it entirely.


If that is the only use case Mercedes feels its safety assurance can justify, what does that say about other self-driving cars, for which their manufacturers will not accept liability?


It says that they’re a responsible company that isn’t comfortable playing fast and loose with your safety. Unlike some US companies that don’t come from a background of safety but a background of “move fast and break things” and a business model of regulatory arbitrage.


But this is quite a bad product in that respect. If this is all it takes, Tesla would implement the same restrictions and get the same fanfare, but alas they're not constantly aiming for "PR stunt" levels of driving.


You're saying that Tesla of all companies isn't aiming for PR stunts with their automated driving? Tesla, the company that calls their level 2 automation "Full Self Driving" isn't out for PR stunts? I would be hard-pressed to name many companies that seem more prone to PR stunts than Tesla.


Assuming that Mercedes-Benz (and all other manufacturers at that) know their shit, which they do, it says more about the self-driving competition and self-driving tech in general than it does about Mercedes' offering.


I personally would rather have a system that I can actually use when I want even if that means I need to accept liability while using it.


> I personally would rather have a system that I can actually use when I want even if that means I need to accept liability while using it.

It's all a trade-off. My accident rate thus far is one serious accident in over 700,000 km of driving. My fender-bender rate is three FBs in over 700,000 km of driving.

My understanding of the Tesla system[1] is that it requires roughly one intervention every ~5,000 km of driving in order to avoid an accident. For me this is an unacceptably high risk, because not intervening for 4,999 km will definitely (100% certainty) mean that I will be in a poor position to react when the intervention is necessary.

Now, you might claim that the driver has to be alert while not in control for 4,999 km to avoid the accident at the 5,000 km mark, but if drivers were that good at staying alert while not engaged in the act of driving, then the self-driving system would be redundant anyway.

[1] I read the stats a long time ago, so maybe they've changed.
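To put those two figures side by side (both are assumptions from this thread: the ~5,000 km intervention rate is my recollection of the stats, the 700,000 km is my own history, so treat this as arithmetic on assumptions, not verified data), a quick Python sketch:

    # Rough arithmetic on the figures above; both inputs are assumptions
    # from this discussion, not verified statistics.

    interventions_per_km = 1 / 5_000          # recalled Tesla intervention rate
    personal_accidents_per_km = 1 / 700_000   # my own serious-accident rate

    # If every missed intervention turned into an accident, this is the
    # fraction of needed interventions a disengaged driver could afford to
    # miss while still matching that personal accident rate:
    allowed_miss_fraction = personal_accidents_per_km / interventions_per_km
    print(round(allowed_miss_fraction, 4))    # ~0.0071, i.e. you'd have to catch over 99% of them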


The statistical POV is a very good point, thanks for bringing this up!

It reminds me of this other excellent discussion about taking risks the other day: https://news.ycombinator.com/item?id=30264760


From my point of view, the actual act of driving is not that difficult (after the first 50k miles or so). The issue is the mental effort required to continually pay attention to what's going on to drive safely.

Tesla's system still requires me to pay attention to what's happening to exactly the same degree as normal because I might need to intervene (and in fact makes it harder to be ready to do so). All it does is take away the (to me) trivial aspects of pressing a pedal and turning a steering wheel.

Whereas this system introduces a set of circumstances in which I don't need to drive at all. No need to pay attention at all. And it's the most tedious form of driving there is - stop start traffic on a motorway.


> And it's the most tedious form of driving there is - stop start traffic on a motorway.

As someone who has used Autopilot in the circumstances you describe, I'd be shocked if you needed to pay more attention with Autopilot than with the Mercedes system. Autopilot generally requires intervention in dangerous situations, which are incredibly rare in stop/start traffic.


This will be treated the same as drunk driving. Just because you want to take responsibility doesn't (and shouldn't) mean anything to the legal framework.


I'm not sure that I understand your point. Which party takes responsibility for an accident says nothing about the frequency of accidents or the kind of threat this car would pose to other drivers.


Of course it does (indirectly). Tesla, for example, can't take responsibility, since they know that their "full self-driving" or "autopilot" systems aren't reliable in any driving circumstance.

What you're saying is like "the engineers that built this bridge never drive over it, but that doesn't mean it's shoddy" - technically correct, but almost certainly wrong in practice.


You are assuming the motivations behind these decisions are purely based on safety rather than a philosophical difference in approaches.

From a practical standpoint, liability needs to be all or nothing. You can't have a driver worrying about whether they are going 40 mph or 41 mph. You can't immediately give the driver liability if it starts to drizzle or the sun sets.

Mercedes is taking the approach that they are always responsible. Other manufacturers are taking the approach that the driver is always responsible. The end result is that the Mercedes system is much more conservative in how it can be used. This says nothing about the quality of their technology in comparison to their competitors. It simply says they are focusing on the easiest problems first, while their competitors are taking a more holistic approach, trying to design a system with broader usability.


As everyone knows by now, literal full self-driving (as in: get in your car, tell it to take you to the other end of the country, and wake up when it gets there) is entirely out of reach for current technology, and will stay out of reach until we design new sensors and possibly general AI.*

So the current goal must be to achieve something similar under certain well-defined, limited conditions, with reliable automatic checking that you are still within those conditions - hopefully conditions that one is actually likely to encounter. Until we have that, letting self-driving cars onto public roads is a menace.

Current self-driving cars are at best at the level of a driver going through their first driving lessons, and one with very bad eyesight at that. Having a human act as the driving instructor, theoretically prepared to step in whenever the AI makes a silly mistake, is not enough to make these cars as safe as the average (non-drunk, non-sleep-deprived) human driver.

What Mercedes seems to be doing is responsibly pushing the state of the art further. Having a car that is safer than a human driver without depending on your constant vigilance is a huge step forward. Obviously, this only works in certain conditions, but the car itself detects when those conditions are no longer met, and gives you quite ample warning to take back control.

* Elon's shameless lies about having your Tesla act as a self-driving taxi and generate a profit for you while you work have by now been well and truly put to rest.


The point is that if the designers are not even confident enough to say "this works without a hitch under X and Y circumstances", then allowing its use on public roads at all (and choosing to use it yourself) are both bad ideas.


Yes, but you are missing the essential point of this move.

They are saying that they have enough confidence in the system that they will pay for anything that goes wrong. This means that they have run the calcs to work out how much it's going to cost.

Tesla have basically gone: "fuck me, it's bad, let's just legal-boilerplate ourselves out of the consequences. Oh, and charge people to QA our shit."


Here's the thing: every customer wants it this way, until something happens; then it's always the engineer's fault.


It doesn't say anything really. This only speaks to Mercedes and the performance of their driverless tech.


This is insurance. It is all probability.


More actuarial science, but your point still stands.


I like this because these restrictions paint a very realistic picture of the current state of the art of autonomous vehicles. Any claims beyond these are likely just marketing fluff (at this point in time).


Highway traffic is the one thing I would trust a self-driving car with right now, and that's where most of the time is spent on long trips. If I can legally watch a movie or browse the web while my car simply follows the lane (including in traffic jams), then I'd be perfectly happy. Just alert me 5 minutes before the exit or intersection and let me handle that.


The handover warning is the key to limiting their liability.

Mercedes will accept legal responsibility only while DP is engaged, so the driver still has to pay attention and react (the so-called manual take-over).

DP's max speed is 60 km/h, or about 17 m/s. If the system only detects an issue when it is less than 100 m away, the driver has roughly 6 seconds, or less, to take over and sort it out.
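For a back-of-the-envelope check in Python (the 100 m detection range is my assumption above, not a published figure):

    # Takeover maths at Drive Pilot's 60 km/h ceiling. The 100 m detection
    # range is an assumption from the comment above, not a figure Mercedes
    # has published.

    speed_ms = 60 / 3.6                      # ~16.7 m/s

    detection_range_m = 100
    print(detection_range_m / speed_ms)      # ~6 s of warning at 100 m

    # Guaranteeing the full 10 s window at this speed would require spotting
    # the problem roughly this far ahead:
    print(10 * speed_ms)                     # ~167 m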


They give you a 10-second window after telling you to take over control, during which they will still be liable for any issues.


I saw the mention of a 10-second reaction window, but I don't think that's universal. As in my example, if there's an orange light blinking ahead and the car is driving at 60 km/h, it's impossible for the car to give the driver 10 seconds to take over.


> Drive Pilot can only engage at speeds under 40 mph

Going below 60 km/h on the Autobahn... this seems to be useful only for the narrow use case of very bad traffic.


Not unheard of on German roads.


> Those are some big caveats that mean that you won't be able to use this in most situations

It still seems like we need to adapt the roads to autonomous vehicles rather than the other way around. In the UK (and there's no indication of this whatsoever at the moment), I hope we never end up with the US's crazy car-centric town streets, where as a pedestrian you can only cross at crossings, and if you get hit by 2 tons of steel, it's your fault.


Huge caveats for the initial rollout. Over here, the speed limit for divided highways with no traffic control systems is 80km/h at a minimum. I guess you need to start somewhere though, even if that is 'traffic jams on highways'.


> Drive Pilot can only engage at speeds under 40 mph (60 km/h in Germany) on limited-access divided highways

Don't all highways in Germany have a minimum speed of 70 km/h? So basically there's no road where Drive Pilot can be used?


> Don't all highways in Germany have a minimum speed of 70 km/h?

No. The only legal limit is that the vehicle has to be type-licensed for going more than 60 km/h, which an S-class clearly is. Actual minimum speed requirements are rare.

Nevertheless, for now this is clearly useful primarily as an assistant when stuck in bad traffic: you can get ticketed for driving slowly enough to hinder traffic flow without a good reason, and whether you can argue a good reason depends on the circumstances. An upgrade to 80 km/h, so you can stick it behind a truck in the right lane, would make it a lot more practical.


That’s the trade-off between level of responsibility and capability at our current level of technological progress. Level 2 systems operate in much broader conditions but have high enough rates of failure that a driver must be attentive at all times. The jump to level 3 is mostly about responsibility. That higher level of responsibility means that the car is not going to drive itself unless it’s absolutely sure it can do it by itself.


It's weird how, to this day, Tesla Autopilot can only really be semi-reliably trusted in exactly the same situations in which this system works

... but because Tesla has the reality distortion field in full effect you never hear these complaints about their solution.

Driving up sidewalks, killing people by slamming into stationary objects, people trying to sleep in their glorified-ADAS-guided cars: all the cost of being able to claim the system works in more situations.


>It is basically only good for stop and go highway traffic

I mean, if you commute regularly in a major metropolitan area, that could easily be 90% of your driving.


I wonder if there will be a long-term unintended consequence where, as self-driving progresses into more difficult use cases, people will have less overall experience and will struggle when they have to take over in more difficult scenarios like heavy rain or snow.


It's already happening with existing functionality. I know several younger drivers who struggle with parking without all the cameras and auto-park features.


Unlike with Tesla, you can rely on it in those circumstances.

In a Tesla you have to keep your hands on the wheel all the time.


You will be able to use it in one of the most annoying situations: heavy highway traffic with stop-and-go.

Nobody likes driving in that. A free-flowing highway can actually be fun, but as soon as the traffic gets dense it becomes the opposite and quite high-stress.


I'd bet anything the restrictions are there for insurance reasons; otherwise the premiums would be too high.


It’s not about insurance, it’s about the definition of level 3 automation. It requires the vehicle to have very high confidence that it can drive without mistakes, because a human is not monitoring the driving.

Level 2 systems can often work in more conditions because they have a lower requirement for reliability. They are permitted to make mistakes without realizing it, because the human driver is required to monitor and override the system.



