Hacker News
A year of digging through code yields “smoking gun” on VW, Fiat diesel cheats (arstechnica.com)
180 points by AndrewDucker on May 28, 2017 | 145 comments



While I certainly don't support the 'rogue engineer' excuse that VW originally gave everyone, the idea does bring up an interesting point.

An engineer did write this code. Almost certainly, many of them, working together. Now, their managers told them it was okay to do this, maybe that legal had given the go-ahead or that they had to do it to keep their jobs... But it's still pretty clear that this was not right. They still did it, and then didn't tell anyone. And that doesn't sit well with me.

In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.


The problem with this is that it localizes the consequences of defiance on the weakest part of the chain, all but guaranteeing failure. You can't even expect engineers to blow the whistle on their soon-to-be-former employer.

Ethics is about protecting the weak from the strong, not offering up the weak to get eaten by the strong. Only heroes put the good of the many over their own needs, and you can't expect everyone to be a hero.

Only within the safe, protecting bosom of a strong professional organization can we place moral obligations on workers. But it will never happen while the idea of a guild / union for software engineers is still anathema.


> Only heroes put the good of the many over their own needs, and you can't expect everyone to be a hero.

> ...

> it will never happen while the idea of a guild / union for software engineers is still anathema.

Sounds like you're waiting for a union and that's the only solution.

In the interim, we can and should encourage people to be heroes. Let's face it, there are other programming jobs out there. Do you truly need to work for an unethical boss or company?

Everyone may not come up with the same answer depending on their individual situation. We should at least support potential whistleblowers as a community. That's one first step towards becoming a unified group.


I'm not waiting for anything. I'm simply recognizing that "encouraging" engineers to be heroes is the lazy way out and is not a real solution to anything. This is a political problem, it can only be met by an appropriately collective effort.

> Let's face it, there are other programming jobs out there.

OK, let's run with that for a moment. Where else would you have expected the VW whistleblowers to get jobs? Not every engineering job is a fungible web dev position. Those skills are useful for that one industry only, and that worker would have to uproot his family and move to get a less-desirable position.

No matter how you shake it, leaving a job is a major life decision for most people not in Silicon Valley, yes, even most engineers. And even in the fungible sectors like finance and the web, word gets around.

I have wondered if a distributed whistleblowing network à la WikiLeaks, but focused on putting public pressure on the most egregious offenders, would be a worthwhile thing to build. Essentially we'd run outreach campaigns for high-impact sectors that employ lots of software engineers and encourage them to spill the beans, with the leaks funneled to a select group of news reporters / lawyers who are positioned to take action.

I have a feeling that's not quite the right approach, but I think with a bit of refinement we could come up with something better.


Maybe, maybe not. This is speculation, unless you've done a study to sample a statistically significant portion of the population. I agree, your speculation sounds reasonable. But neither of us knows, and both of us have certain biases that this speculation confirms - and therefore it should be mistrusted until confirmed.

Frankly, I think there is a bigger problem than job security itself. It's that the whistle will be ignored by the general population. Even if you are willing to sacrifice to do what is right, it may not matter, because no one wants to believe it, or the spin is so good that no one believes it anyway.


Everything is speculation until someone makes it real.


> I have wondered if a distributed whistleblowing network à la WikiLeaks, but focused on putting public pressure on the most egregious offenders, would be a worthwhile thing to build.

Perhaps WikiLeaks could be rebuilt. But I think we should try to learn something from v1.0, which became politicized very quickly. I think the takeaway is that you want someone who is very trusted within our community to run it: an entrepreneur with a good track record of being honest, and perhaps a law degree as well.


> In the interim, we can and should encourage people to be heroes.

In reality it's just not black and white like that. I expect a lot of engineers aren't sure if they should be blowing the whistle about a particular thing. And they have no one to turn to for advice, especially not with the NDAs they will typically sign.

So the position they'll find themselves in is this: blow the whistle on a morally gray issue which may or may not justify doing so, and guarantee that you will be fired while not at all guaranteeing that the morally gray project will stop. A probable outcome is that the engineer gets fired, the ethics issues don't get escalated, and someone else is hired to continue the project.

Everything is stacked against the engineer. Telling them to fall on their swords for a maybe outcome is neither realistic nor ethical in itself.


> In reality it's just not black and white like that.

I made it clear in the last paragraph it isn't black and white. Everyone has their own situation. I can still support people to come out before they actually do it.


If you're a bit older than a fresh graduate, you have much more to lose and nothing to gain. Mortgages, kids, and ageism when looking for a new job are not your allies.

So support - OK - but support as in how? A "like" on Facebook? That is not enough to make it work.


Support as in donate to non profits that do this kind of work such as the ACLU, and suggest this opportunity to help to others.

Raising awareness can and does help make it work.

If you don't believe communication is effective then I don't know the point of this conversation.


> In the interim, we can and should encourage people to be heroes. Let's face it, there are other programming jobs out there. Do you truly need to work for an unethical boss or company?

Assuming that the only consequence is needing a new job is naive. Whistleblowers find it very difficult to get hired again.


Encourage as in create a fund for whistleblowers which would cover their legal and life expenses until they get hired again?


Sure, I think that's a great idea. Lots of ways to encourage.

Note there may be existing support groups for this such as the ACLU. I'm not exactly sure which non-profits employ a legal defense staff and cover tech issues.


Every person in the chain has their own ethical responsibilities, both management and engineers. If you knowingly participate in something like that, you share responsibility. I know that not participating has a cost, and sometimes it is too high. However, plenty of times it is not that high and you actually can get a different job. Take people who write spyware, or who write the fishy parts of Uber's code (not everybody working there!) - they have a choice.

I guarantee you that there were some engineers who recognized the situation and left, or were let go sooner. It is not just magical heroes. Normal people, men and women, make decisions like "less cool company, less cool project, but it is fishy here so I leave" all the time. Maybe we should praise and celebrate these sorts of decisions more, instead of automatically considering such people losers. As of now, neither engineers nor the Valley really value doing the right thing, nor are they willing to discuss what the right thing is - they value winners, no matter what path they took.


Whenever I've raised the issue with other developers they've been open to the idea, but to be perfectly honest I have no idea how you'd even go about starting something like that.


As with everything else hackers do, the way to learn would be to just do. If I were to do it, I'd start with a website and an outreach campaign. Sign people up to a mailing list, then create a membership structure and dues. Set some goals, and put governing procedures in place. Some endorsements from existing outfits like YCombinator itself and more traditional unions would help.

Eventually once you have enough money, you can create professional certification bodies. This is when you can really start ramping up your dues because then you'll be solving an actual industry problem. You could probably take a lot of money from big tech companies to help you out here once you're established.


A professional organization can only be strong, safe, and protecting if it has enough teeth to put moral obligations on workers. That seems unlikely.


Ford did that by paying his workers very well; imagine Foxconn paying their workers enough to buy iPhones! On the odd side, he had a team of moral police who would show up at your home unannounced to make sure you were living like a good American.


If I were an evil VP tasked with creating this loophole, I'd tell my project manager we needed a configurable system that let us control emissions for our internal testing environments, or some other plausible and legal requirement. I'd keep knowledge of the system obscure and communication between dev, testing, and deployment sparse. I'd then have only one person change the configuration before deployment. I'm sure it could be arranged so that no single engineer would realize that the evil configuration was released, and if they did, it could be excused as a mistake.


If I were an evil VP, I would give my engineers an incentive to do something without giving them a direct order. So, you might tell them: if you can improve gas mileage by X%, you get a huge bonus, and I don't care how you do it.


This is how modern companies break the rules. Management doesn't break the rules, either overtly or covertly. Management scrupulously follows the rules, while placing requirements on their workers that can only be met by breaking the rules.

Want your workers to work more, but you don't want to pay overtime, or you run into trouble with regulations about consecutive hours on the job? Just bump up how much work they have to get done, threaten to fire low performers, and make it clear that under no circumstances is anyone allowed to work overtime. Your workers will start working off the clock, and better yet they'll hide it from you, so you can legitimately plead ignorance if the law comes after you.

Want to cut corners on safety to save money? Tell your people that safety is the top priority but you need to see an X% reduction in costs, and it works itself out. They may fudge or falsify metrics, but if you're really lucky they'll find loopholes in the metrics instead.

(I have a friend who worked at a warehouse and fell victim to this. They were officially big on safety, which included bonuses for everyone if they went a certain period of time without any safety incidents. Unofficially, this meant that incidents wouldn't be reported unless it was unavoidable. The one way to ensure that an incident had to be reported was to see a doctor for your injury, so people were heavily encouraged to wait to see if they got better on their own before they got medical attention, which often made things much worse. I'm sure upper management's metrics looked great, though.)

As an added bonus, this sort of thing gives you a lot more control over workers. If you want to get rid of a troublemaker, do a little digging and you'll surely discover that they're violating safety rules or working off the clock or whatever. Everybody is, but selective enforcement is a wonderful thing.


This is also how we end up with "crazy" regulations. Case in point: nuclear waste handling. Engineers create complex six-sigma safety plans, which are then slowly eroded because the regulations and safety protocols are obviously wacky overkill, right?

The WIPP nuclear isolation site had run cleanly for 15 years, so management began to cut corners. Then three unlikely events lined up and nearly got people killed. It started with a truck catching fire, which prompted operators to bypass the HVAC's filtration system. They stopped the bypass for a few days to perform maintenance on the only underground radiation detection unit. That unit gave a false alarm during testing, but was fixed and placed back into service, so they started ventilating again. Then a cask was breached at around midnight, because someone upstream had used organic kitty litter instead of clay kitty litter. The operator assumed it was a false alarm, given the previous false positive, and kept things running. It wasn't until the next morning that they realized they were blowing radioactive particles above ground.

Had management kept up maintenance, enforced protocol, or done more than the absolute bare minimum (e.g. installed multiple underground radiation detection units), US taxpayers could have avoided paying $500 million. And this isn't a one-off thing; there are dozens of instances just like this where the US dodged a bullet.

0: https://lessonslearned.lbl.gov/Docs/2091/OES_2015-02%20-%20R...


What a fascinating DB you linked to. I'd never known these types of things were public or common.


Seems like the "organic" fad was the problem


The problem was substituting an unsuitable material because of mistakes when revising procedures and insufficient review of the revisions. The fact that the mistake involved an "organic" product is coincidental.


You have made some very strong points - thanks for putting it all together.

I would like to ask: what do you think an ordinary employee should do when they see behavior like you describe from their management? How can they efficiently protect themselves (and their colleagues) from this kind of treatment?


I've just observed this, not experienced it, so I'm not sure. It's far easier to see the problem than figure out a solution!

Much will depend on your job prospects and financial position. If you're a fancy programmer type who's constantly bugged by recruiters, move on until you find an ethical company. If you need this job to eat, you'll have to be a lot more careful.

In general, I'd say:

1. Point out the impossibility of the requirements to management. Gently if need be. It's possible they don't realize what they're doing.

2. Contact the local department of labor or whatever regulatory agency would be interested in what's going on. They may be able to take action if management is pushing violations in a quiet way like this. If not, they may be able to at least take action against the workplace if people have started breaking the rules.

3. If you can afford to risk the consequences, follow the rules as much as you can. Don't work off the clock, don't break safety rules, etc. If being fired will make you homeless then maybe this isn't an option.

4. Document everything. If regulators weren't interested originally, they may be interested once you can show a pattern. Upper management may be blissfully ignorant, and you may be able to get them involved once you can show them what's going on. Whatever happens, if things come to a head then it will probably be useful to be able to demonstrate that this wasn't your own doing.


The list of 4 items is excellent. Under no circumstances should you act in an insubordinate manner until you've exhausted other channels of communication. Acting before things get too far is the easiest remedy.

I'd recommend the following order of operations:

Convey concern over the associated risk to your immediate manager - verbally at first, in a meeting, then switch to written form (email, a paper trail of opposition) if no action is taken.

When documenting the paper trail, simply reference the meetings at which you voiced opposition. Your notepads should also be able to back up the talking points you're referencing.

Being asked to briefly switch hours or work late is often covered in your job description, so suddenly stopping at 5 p.m. can be considered insubordination. The time should be compensated promptly through time off or a paid-OT arrangement. If your verbal requests go without action, again, switch to email.

Simply documenting events as they occur makes it easier when you need to go above your immediate supervisor (more senior manager, corporate HQ, department of labor) for help.

The key is to be polite in all interactions. Innocent mistakes happen; managers are under deadlines too. The paper trail should be maintained regardless of action or inaction.


This is why the financial regulators in the UK have changed to consider the corporate culture when dealing with breaches. A company whose senior staff "live a compliance culture" will be penalised less for the same breach than one operating as described above.

Now you can argue efficacy, being able to pull the wool over the eyes of regulators etc but I think it is a good direction to head in.

All that is a long way beyond the industry described here, with its institutionalised cheating on tests; it sounds more like the graphics card industry than one with regulators.


Just imagine how Facebook must have (initially) pushed the datr cookie on its engineers: "We only need to track everyone like this to check against DDoS attacks."

Still, even such an excuse should have raised alarm bells, but I assume most developers would just shrug their shoulders and develop the feature anyway, as they would've liked to keep their nice-paying job and juicy stock options.

In reality, Facebook only recently used the DDoS protection excuse for its datr cookie, well after it announced that the cookie would be used for advertising purposes, which also happened a few years after the cookie was introduced.

I imagine whatever Facebook told developers then was even less subtle than "using it for security purposes", and that most of the developers figured it out right then that the datr cookie would be one day used to track users across the web for advertising purposes.


I'm pretty sure people who work for Facebook are completely aware of the company's methodologies and are completely OK with it.

The conversation would be more along the lines of "We need to track people who aren't logged in, suggestions?"


Yeah; in weapons, it might go like this:

* Govt research agency awards contract to study smallpox, including stockpiling smallpox. Researchers are happy, they're protecting the world.

* Govt weapons agency gets notice from govt research agency that "it's ready now."

* Govt weapons agency confiscates smallpox stockpile and data, makes more, spoons it into the tips of missiles.

The original researchers are more or less legitimate victims, whose goal of helping humanity was used to obscure the govt weapons agency's goal of killing humanity.


Great point, it all depends on how the problem is framed.


There's a great New Yorker article [1] which addresses this -- particularly that "pretty clear that this was not right" might not be that clear after all. Excerpt:

"...sociologist Diane Vaughan described a phenomenon inside engineering organizations that she called the “normalization of deviance.” In such cultures, she argued, there can be a tendency to slowly and progressively create rationales that justify ever-riskier behaviors... If the same pattern proves to have played out at Volkswagen, then the scandal may well have begun with a few lines of engine-tuning software. Perhaps it started with tweaks that optimized some aspect of diesel performance and then evolved over time: detect this, change that, optimize something else. At every step, the software changes might have seemed to be a slight “improvement” on what came before, but at no one step would it necessarily have felt like a vast, emissions-fixing conspiracy by Volkswagen engineers, or been identified by Volkswagen executives. Instead, it would have slowly and insidiously led to the development of the defeat device and its inclusion in cars that were sold to consumers."

"But assuming all of this is true, how could it have persisted for so long, and without more people stepping forward and speaking out?... Volkswagen was heavily invested, both financially and culturally, in producing a clean-diesel engine. That the company was failing to meet the standard required by American emissions tests would have been embarrassing and frustrating to its German engineers. Some may have seen those tests as arbitrary, and felt justified in “tuning” the engine software to perform differently during them—even as it now looks, to the outside world, like an obvious scandal."

[1] http://www.newyorker.com/business/currency/an-engineering-th...


Similar to graphics cards driver developers facilitating "cheating" on benchmarks. The major vendors get caught cheating all the time. Back in the nineties, it was as simple and naive as detecting whether the running program was a benchmark or a real game: If a game was running, use the normal driver path. If a benchmark was running, you switch over to a completely different code path that basically rendered at shit quality but did it very quickly and made your rendering speed benchmark number look great. I worked at a graphics company as a 3D driver developer at the time and was asked to help implement such a system but politely declined. No problem, I just got assigned to a different part of the driver. There were plenty of other developers who had no ethical problem with cheating a benchmark.
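The trick described above can be sketched in a few lines. This is purely illustrative: the benchmark names are invented, and real drivers did this in native code by inspecting the running executable rather than in Python.

```python
# Hypothetical sketch of the benchmark-detection cheat described above.
# Benchmark names are invented; real drivers inspected the executable
# in native code.

KNOWN_BENCHMARKS = {"3dmark.exe", "glperf.exe"}  # invented names

def select_render_path(executable_name: str) -> str:
    """Choose a rendering code path based on which program is running."""
    if executable_name.lower() in KNOWN_BENCHMARKS:
        # Fast but low-quality path: inflates the benchmark score.
        return "benchmark_path"
    # Full-quality path used for real games.
    return "normal_path"

print(select_render_path("3DMark.exe"))  # benchmark_path
print(select_render_path("Quake2.exe"))  # normal_path
```

The whole cheat hinges on that one branch: the benchmark never exercises the code path that real users actually see.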


Who was it that asked you to do that? Your mgr? Marketing?


> In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.

And, IMO, yours is absolutely the correct one.

As engineers, we are in a unique position to understand the intricacies and associated risks of the functional requirements we satisfy. True, we are under marching orders from a PM who may (or may not) have provided adequate backstory to understand the goals of the company, but it's our duty to express any concern about the risk it may present to the company or society.

When verbal warnings don't work, go ahead and put it in writing.

In the US, part of my engineering curriculum included courses on engineering ethics case studies to drive this point home.

I've written exactly 2 of these letters in my 10-year professional career. Both times, the PM sharply changed course, demonstrating that they knew the request failed the ethical litmus test. Putting written responsibility upon them to act encouraged further discussion and ultimately a better resolution for all.


You ignore the fact that 99% of the cheat's code probably has, or had, a good reason to be there for one reason or another. The actual responsibility is probably multi-layered, with the majority of contributors unknowingly participating in something fishy.

I remember working on a system that allowed a financial firm to change an account value outside of all the accounting routines, plus a way to transfer money from a shared account to any random external account. Sounds fishy, but the explanation was reasonable: this was the way to model dividends. The actual dividend money came into one account, the payment/reinvestment went out of another, and a reconciliation was done later. That reconciliation was done by the department's operational floor, from data coming from other departments, with systems upstream and downstream. I may have unknowingly created the final piece of a massive international fraud system, but I have no idea, and all the people involved had a valid business reason and nothing to gain.

Because that's the point: if I knew and was willing to actually participate in fraud, I would want a cut, and a serious one, since I would in effect be supplying a gun bought under my real ID from a regular shop, knowing you would use it for a bank robbery.


The problem is that nobody wrote a specific defeat device. You have to understand, Bosch has one basic (extremely complex) piece of software that goes in pretty much every single engine controller. There are some specific functions that are developed at the request of specific customers, but the basics are always the same.

When this general piece of software is set up to control a new engine from an OEM, a multi-year process of calibration starts. The engine starts off failing the emissions tests drastically, and over time the calibration engineers tweak tens of thousands of parameters to try to get the emissions to be legally compliant.

This is not an easy process, and the calibration engineers have a target: they have to get the vehicle certified for a certain emission class, which in turn is determined by the FTP-75 driving cycle. The news keeps talking about "defeat devices" because the law says they are not allowed, but reality is much messier. There is nobody sitting there thinking about how to make crazy obfuscated defeat devices; rather, there are a bunch of people trying to figure out how to pass the extremely demanding emissions requirement tests without any defeat devices.


This. A thousand times this.

They didn't write a defeat device to "fake" emissions, they wrote legitimate code aiming to make the car truly compliant with the requirements.

Then someone made this code not the default and, except when the car is under testing, loaded an alternate mapping of some key parameters which is more tuned for performance and mileage, at the expense of emissions. This is the criminal part, not the first.


I don't doubt it's this big mess you are describing, but if the engine is actively looking for test/not test and does things like turning off particle filtering after 30 minutes of driving because it knows any test will be over, it seems to me the intention is in fact to lie about the real emissions.


The thing is that detecting test situations is good, certainly useful, perhaps necessary.

E.g. detecting you are on a test stand, you might

* deactivate plausibility checks which would normally limit engine power

* deactivate air bags and other inherently dangerous systems

* ensure correct system behaviour even if sensors give conflicting information

* suppress error logs for certain failed plausibility checks

The only illegal thing is switching to a "massaged" parameter map under these conditions. And presumably a single person controlled this switch.


But somebody had to write code that recognizes that an emissions test is being performed, no? I mean that sounds like it would be more involved than just loading some different set of parameters.
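It needn't be very involved, though. As an illustrative sketch (every signal name and threshold here is invented, not VW's actual logic), a handful of sensor readings the ECU already has can distinguish a dynamometer from the road:

```python
# Illustrative only: invented signals and thresholds, not VW's actual logic.
# On a test stand the drive wheels turn while the steering wheel stays
# centered, and tests run under controlled lab conditions.

def looks_like_dyno_test(wheel_speed_kmh: float,
                         steering_angle_deg: float,
                         ambient_temp_c: float) -> bool:
    wheels_turning = wheel_speed_kmh > 5.0
    steering_untouched = abs(steering_angle_deg) < 0.5
    lab_conditions = 20.0 <= ambient_temp_c <= 30.0
    return wheels_turning and steering_untouched and lab_conditions

print(looks_like_dyno_test(50.0, 0.0, 25.0))   # True: wheels spin, no steering
print(looks_like_dyno_test(50.0, 12.0, 25.0))  # False: someone is steering
```

Each individual check also has an innocent-sounding justification, which is exactly what makes this kind of code hard to flag in review.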


> In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.

Support Ticket 57: Assigned to: John

Hey, can you change this limit field here, currently it is a 5, can you make it a 9.


While I agree with you, it's a real easy thing to say and a much harder thing to actually do. Having been in a similar situation myself (not quite as drastic—no lives were in danger and no laws were broken), we (the engineers) argued vehemently with the CEO (there may have been yelling involved), but eventually capitulated with a "ok, but this is on your head" attitude.

Maybe if it were more important I would have fought harder. But I honestly don't know.


It appears that the firmware was written by Bosch and not VW themselves. I can't believe that only VW engineers were involved in writing the spec that was sent to Bosch and that only one of those engineers understood that "acoustic" really meant "emissions".

I would imagine that Robert Bosch GmbH and co. are quietly building up a war chest of cash right now. They appear to have been caught writing deliberately naughty code to order.


The firmware is ultra-generic but highly configurable. The defeat device isn't in the firmware itself.

Many different engine types will use the exact same firmware; the manufacturer supplies a huge array of configuration values to make the ECU run the engine correctly.

All the firmware knows is that under some manufacturer-configurable condition, the engine will switch from one manufacturer-configurable mode of operation to another.
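A minimal sketch of that configuration-driven design (the data layout is an assumption for illustration, not Bosch's actual scheme): the generic firmware just evaluates whichever trigger conditions and parameter maps the manufacturer supplies, without knowing what any of them mean.

```python
# Sketch only: an assumed data layout, not Bosch's actual scheme.
# The firmware is generic; the manufacturer's calibration data decides
# when to switch between parameter maps and what those maps contain.

def active_map(calibration: dict, flags: dict) -> dict:
    """Return the first parameter map whose trigger conditions all hold."""
    for mode in calibration["modes"]:
        if all(flags.get(k) == v for k, v in mode["when"].items()):
            return mode["params"]
    return calibration["default"]

# Manufacturer-supplied calibration (invented values). Note the firmware
# has no idea what "condition_2" means in the real world.
calibration = {
    "modes": [
        {"when": {"condition_2": True},
         "params": {"egr_rate": 0.35}},
    ],
    "default": {"egr_rate": 0.10},
}

print(active_map(calibration, {"condition_2": True}))   # {'egr_rate': 0.35}
print(active_map(calibration, {"condition_2": False}))  # {'egr_rate': 0.1}
```

In a design like this, the intent lives entirely in the calibration data, which is why responsibility is so hard to pin on any one programmer.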

This presentation from 32C3 explains it in much more detail: https://media.ccc.de/v/32c3-7331-the_exhaust_emissions_scand...


> their managers told them it was okay to do this, maybe that legal had given the go-ahead or that they had to do it to keep their jobs

I prefer to think about it this way: management gets what it wants; you can always find someone to carry it out. I think we should be very careful about moving to blame engineers first.


While engineers may have an obligation to public safety in other fields, this is, strangely, not the case when writing software that ends up in automobiles.

If we had a good way to blow the whistle, without ending in jail, we would see less unethical code being written and less shit storms from greedy companies.


What is unethical? Engineers make guns, missiles, drones to bomb people, warheads, nuclear submarines, ... .


How do you know engineers were involved in writing the software? If engineers were involved, they should have said something if it endangered the public. After all, it is their professional responsibility to uphold the safety of the public.


Did it actually endanger the public? A silly question, however, what was the actual measurable consequence of this cheat? Did air quality become measurably worse during the period of this? Was anyone actually and measurably endangered?

An older car on the road puts out more emissions than the modified VWs, yet older cars are still allowed on the roads.

Laws should be followed, so I am not excusing the behavior, but making it sound like the safety of the public was materially compromised is a bit of a stretch.


When the news originally hit, it was estimated that a number of air-quality-related deaths were attributable to the additional pollution.


It is true that people who are employed are obligated not to do unethical or criminal acts on the say so of their employer, but the employer is also obligated not to command their employees do unethical or criminal acts. And the employer is the one who holds the power in the relationship.


Maybe the code was written by Russian contractors? The ones that normally write bots for Twitter to manipulate elections?

The sad truth is that many programmers have no ethics at all. If we all did, there would be no spam, no malware, no Windows 10. (And no systemd!)


This is basically the independent contractor on the death star issue. It boils down to the fact that they are morally responsible because they do the work and take the money fully aware of the end result of their work.


You are correct. "Just following orders," is sometimes an understandable action (in the sense that I understand why someone might not want to refuse an order and risk their job), but that doesn't make it an excusable one.


I agree. Engineers have the same ethical obligations as anyone else, and that means you don't delegate your ethical decisions to management.


It depends how drastic it is. You are talking about people risking their careers. We all have our own moral standards to live by. Maybe they justified it by telling themselves all the other car companies were doing the same thing (and, by the way, there are some indications they were).


Don't forget this exact same company literally made weapons for the Nazis. Lying about emissions seems like nothing by comparison.


IBM made computers for the Nazis too; are we going to hold today's IBM culpable for the Nazis?


> In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.

I respectfully disagree. Engineers have an obligation to do whatever is asked of them by management. If that means building nuke-hand-grenades so be it.

Being an engineer (be it software or mechanical) should be a morally neutral thing. Do the job and keep your morals out of it. This is the only way we can have some semblance of sanity in the field.


'Once the rockets go up, who cares where they come down? That's not my department! says Wernher Von Braun.' https://www.youtube.com/watch?v=kTKn1aSOyOs

I really really really strongly disagree with this. As the ones who are most knowledgable about the systems they're building, I definitely think engineers have a moral responsibility to make sure they're at the very least following applicable laws and regulations - I would argue that we have a moral responsibility to act ethically even in cases where it's not covered by laws or regulations, but I'll admit that's a bit more controversial.


The problem with "moral obligations" is a person's moral axioms are fundamentally arbitrary. Would you still want engineers to be guided by their moral convictions when you fundamentally disagree with them yourself?


Inasmuch as I want everyone to be guided by their moral convictions, yes. I mean, even if I don't agree with them, I'd much rather an engineer's decisions be guided by 'what they feel is right' rather than guided by nothing at all, which seems to be what parent was implying.

Also, this is why we have engineering codes of ethics, at least in Canada and the US (and while I'm not familiar with elsewhere in the world, I would assume similar things hold in most first-world countries). We don't necessarily have to agree on everything, but there is a baseline for what we consider ethical, and engineers are expected to uphold that baseline, otherwise they are not permitted to practice engineering. Unfortunately the line between 'engineers' and other practitioners isn't as well-defined for software engineering as it is for most engineering fields - but that doesn't mean we should ignore it completely.


This is stupid. There is no such thing as working without morals, and you aren't going to have 'sanity' by asking engineers to adopt someone else's (also arbitrary) morals rather than their own.


Real engineering disagrees: http://www.onlineethics.org/2959.aspx


You are still a human when you are at work.


eww


It's imho not so clear cut whether this code is unethical or not. There is a legitimate argument that increasing NOx to reduce CO2 is better in the long term. Remember that the two are inversely related: the hotter and leaner your engine runs, the more NOx you produce, but your fuel efficiency also goes up.


In the long term, making their cars cause accidents that killed many people would reduce the environmental impact of humanity, so would also be arguably ethical by that logic.


> In 2015, regulators realized that diesel Volkswagens and Audis were emitting several times the legal limit of nitrogen oxides (NOx) during real-world driving tests. But one problem regulators confronted was that they couldn’t point to specific code that allowed the cars to do this. They could prove the symptom (high emissions on the road), but they didn’t have concrete evidence of the cause (code that circumvented US and EU standards).

I don't understand this from a regulator's point of view: as a regulator, all you have to do is test for symptoms. You don't have to explain root causes. You drive the vehicle in conditions as close as possible to real ones, measure emissions, and decide whether or not they're above the norms.

Why would regulators do this in a lab? It's like health inspections that would ask restaurants to send food to be tested, instead of showing up anytime, unannounced.

Regulators should pick up real cars from real owners and test them on the road, at regular intervals.

Or, modern technology should allow testing a car all the time, reporting emissions, fuel efficiency, etc. during its lifetime.

People cheat, and if cheating is easy they cheat more. The one thing a regulator cannot do is trust the industry.


This kind of thing seems to be especially rampant in the auto industry. It's probably the result of a powerful lobby and widespread corruption.

When tires are tested, apparently the manufacturers are asked to send in the tires to be used in the tests. Why on earth would a tire testing entity do that, unless they are receiving bribes from the tire manufacturers? [1]

[1]: http://www.autoblog.com/2016/02/26/nokian-tire-test-cheat-re...


> Why would regulators do this in a lab? It's like health inspections that would ask restaurants to send food to be tested, instead of showing up anytime, unannounced.

Because the law on how regulators work was written by the auto lobby. (At least, that's how it is in the EU. Don't know about the US.)


Funny you should mention the lobby. The US auto lobby was the reason the US NOx diesel emission requirements are more stringent than EU in the first place, specifically disadvantaging European diesel cars (which are much cleaner overall).


>The US auto lobby was the reason the US NOx diesel emission requirements are more stringent than EU in the first place

No, that was a result of the Clean Air Act Amendments of 1990 for the reduction of acid rain. Though it was targeted at industrial emissions of SO2 & NOx, stricter regulation for vehicles was an additional effect.

>specifically disadvantaging European diesel cars

That's a weird argument given that diesel passenger vehicles in the US are held to the same standard as gasoline ones, but to a separate standard from their petrol counterparts in the EU. I mean, one could argue the opposite, that an EU emissions policy favorable to diesels amounted to an equivalent 13-16% import tariff. [1] Several domestic rather than just foreign diesel engine manufacturers were also penalized for using defeat devices in 1998.[2]

> (which are much cleaner overall)

That's quite arguable, trading lower CO2 & CO for increased NOx & PM.

[1]: http://www.eugeniomiravete.com/papers/MMT-Diesel.pdf

[2]: http://articles.chicagotribune.com/1998-10-23/news/981023011...


>but stricter regulation for vehicles were an additional effect

Surely car manufacturers didn't have a say, which is why US and EU emission standards look like this https://longtailpipe.com/wp-content/uploads/2015/10/us-europ... .

>same standard as gasoline ones

Well duh, let's keep diesel cars to petrol standards so that their benefits don't matter and their disadvantages are prohibitive!


>Surely car manufacturers didn't have a say, which is why US and EU emission standards look like this

As the source of the image says, "On the other hand, American regulators are focused on smog and health impacts of air pollution." Which the graphic you provided well indicates.

Look, California was probably the first governmental entity to regulate tailpipe emissions. So much so that it's written into the Clean Air Act by name to run its own regulatory scheme to enact stricter regulation (with federal waivers, but that's another issue). The reason being that LA's unique geography makes smog worse. Heck, in the 1940s, they had an episode severe enough they thought they were under chemical attack by the Japanese. As such, CARB's emission standards were focused on reducing the more directly harmful pollutants like hydrocarbons, ozone, NOx & PM. So, given California's influence on the original 1970 Clean Air Act and the 1988 California Clean Air Act's influence on the subsequent amendment in 1990, I don't see how that graphic would support your argument. I mean, had they such hypothetical power, they could have also blocked the banning of leaded gasoline that was in the same amendment.

>Well duh, let's keep diesel cars to petrol standards so that their benefits don't matter and their disadvantages are prohibitive!

Emissions vs fuel economy. You're being facetious, but if that argument was true, why bother importing diesel passenger vehicles into the states? They didn't even start reintroducing diesels in America until they thought they could harmonize emissions from Euro 5 with Tier II Bin 5.


I think part of it is proving the malicious intent. You could have a bug that does that, or a commit that says "we have to do it to bypass regulations". I assume the law says different things about malicious intent and unmalicious-yet-still-harmful code.


You have to prove malicious intent to bring criminal charges, but not to decide whether a given car is road-worthy.


The standards are very tight. If they aren't well defined and tested in a controlled environment it'd be basically impossible for manufacturers to know if their cars will pass or fail.


That's a valid argument in theory. In practice though it should be possible to agree on a set of conditions strict enough to be "fair" and loose enough that the regulator has some wiggle room to adjust the tests as they see fit.

And there could be an appeals process whereby, when a car fails, it can be tested again, with the tests monitored by a 3rd party, etc.


> Or, modern technology should allow to test a car all the time and report emissions and fuel efficiency, etc. during its lifetime.

This can become a privacy issue though. Suddenly, you're able to associate licenses with a set of persons with some probability. And you're able to collect side channel information about the vehicle being in motion, and the speed of the vehicle. And who knows what more you can read from the sensors of a modern vehicle. I know of a couple of vehicles with GPS sensors.

Overall, this discussion is very interesting to read, because this is the discussion about unit testing to a dot. We are unit-testing cars with a mock road - and unit testing fails with malicious and/or stupid workers building the unit. Every set of unit tests can be satisfied by a lookup table - this is happening right now in cars.

So now, the question is: Which of the bigger system control tools do we use. integration tests, so, taking cars on a race track? Property based tests - randomize the length of the test, define ranges of acceptable pollution. Live system monitoring? Maybe a pipeline of tests?
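The "every set of unit tests can be satisfied by a lookup table" point can be made concrete with a toy Python sketch (all names, numbers, and the emissions model here are illustrative, not from any real ECU or test cycle):

```python
# A "cheating" implementation that passes a fixed unit-test suite
# by recognizing the exact inputs the tests use.

def honest_emissions(rpm, load):
    """Placeholder model: emissions grow with rpm and load."""
    return 0.001 * rpm + 2.0 * load

# The regulator's test cycle uses known, fixed operating points.
TEST_CYCLE = {(1500, 0.2), (2500, 0.5), (3500, 0.8)}
LEGAL_LIMIT = 5.0

def cheating_emissions(rpm, load):
    # Lookup table: return a compliant number for every known test input...
    if (rpm, load) in TEST_CYCLE:
        return LEGAL_LIMIT - 0.1
    # ...and the real (dirty) behavior everywhere else.
    return honest_emissions(rpm, load) * 10

# Every "unit test" drawn from the fixed cycle passes:
assert all(cheating_emissions(r, l) <= LEGAL_LIMIT for r, l in TEST_CYCLE)
# Off-cycle, the same "car" is far over the limit:
assert cheating_emissions(3000, 0.9) > LEGAL_LIMIT
```

No amount of re-running the same fixed test points will catch this, which is exactly why the property-based and live-monitoring analogies above are apt.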


Someone finally said it. Basically, the test is to be blamed too. Like any other test - you should randomly pick up cars from the road and test for emissions. I really don't see why the regulator can't do something like it.


That way, we can complain about their tests being unscientific.


The problem is the wording of the emissions regulation: you need to respect the limits during the testing cycle, and you can't have defeating measures to cheat. If something isn't intentional, it's a gray area.

A piece of code whose specific purpose is defeating the testing cycle, by contrast, runs directly afoul of the regulations; no arguing against a multibillion-dollar company needed.


I may be missing something here, but if you take a step back this just seems like a tragedy of misaligned economic incentives.

Both the US & EU would have wanted this information from day one, and this whole fiasco cost VW billions in fines.

There would have been any number of engineers at VW and Bosch that knew exactly how this worked, but there was nothing in it for them to come clear about it. They were never going to get charged for writing that code, and they would have likely been out of a job or destroyed their career at those companies if they volunteered to authorities how this worked.

So why don't investigators just offer huge cash prizes to engineers at those companies who can provide details about exactly how this worked, along with immunity as long as they're forthcoming with information about who instructed them to implement this?

You'd have an army of engineers overnight willing to spill the beans, and you'd save millions in investigative costs, and quickly get to the real root cause of the corruption.

Instead some independent team of investigators is left digging through old firmware images posted on forums to reverse engineer how the defeat device worked.


I don't know for certain whether the concept of a bounty was considered, and if so why it was rejected. But I can think of a few possibly good reasons to not do this. For one thing, it could be thought of as rewarding the engineers who wrote the evil code, maybe even creating an incentive for engineers to add such code in the future. There are precedents for enforcement actions having the unintended consequence of causing more incidents. Another thing is that you may actually wish to prosecute the people who wrote the code, to serve as a disincentive to future engineers considering following management orders on a similar defeat device.


You may wish to prosecute them either way, and you might offer them immunity either way.

Doing this would just guarantee that if they were eligible for lenient handling or immunity that they'd get a handsome cash payout by ratting on their management or giving investigators details relevant to the case, but which they didn't think to look for.


>For one thing, it could be thought of as rewarding the engineers who wrote the evil code, maybe even creating an incentive for engineers to add such code in the future.

In the development model used here, it's not like engineers just independently decide to add a feature. An autogenerated two-line procedure is going to have two pages of documentation.


Why do think the notion of whistleblowers or witness protection exists?


I'm pretty sure whistleblower protections are not a get out of jail free card. If you blow the whistle on actions that you took part in, you can still be prosecuted.


There would have been any number of engineers at VW and Bosch that knew exactly how this worked, but there was nothing in it for them to come clear about it.

The USA already has a very workable solution. It's called the "prisoner's dilemma".

They were never going to get charged for writing that code

Wrong. That's exactly what you threaten them with. Engineers aren't hardened criminals. When faced with the possibility of jail, 99% will instantly sing like canaries.

The problem is of jurisdiction. The engineers are in Germany and no fucking way will Germany extradite them. So the usual threats don't work.

Well, we did catch one moron who decided to travel to the USA for vacation even though he was a big player in the mess.

Edit: I'm presenting the current general situation for the USA. Your idea of bounties is not bad. It has worked spectacularly well in rewarding employees of Swiss Banks. Somewhat ironically, it's something that Germany has done: http://www.spiegel.de/international/germany/german-city-find...


> Firmware images were gleaned from car-tuning forums and from an online portal maintained by Volkswagen for car repair shops. Documentation, in the form of so-called “function sheets,” was harder to come by. The function sheets were necessary to give the binary context, but the sheets are copyrighted by Bosch and generally not shared with the public. The research team ended up turning to the auto-performance tuning community again. These hard-core hobbyists and professionals share leaked function sheets so they can make aftermarket modifications to their cars.

This is crazy. And, as we move toward self-driving cars, dangerous.

Can someone point me to a well-regarded source in the industry that makes a cogent and convincing argument for why all the software running in a car (save maybe for the entertainment panel) should not be required to be open source?

To be clear: I'm not asking for speculation, or even an explanation from an expert based on years of experience. I'm asking for a publicly accessible reference based on research data that explains the security benefits of not releasing the source code for life-critical software in the car.

Edit: clarification


More common is submitting their code to a regulatory agency, as in the medical and gaming (slot machine) industries. They can also confirm that the code submitted is what is actually running on the device.


I would be very surprised if such a thing exists. Are there any industries, safety critical or otherwise, where the default is open source? If not, why would anyone think to justify the decision to not open source code, and especially why would anyone think to make such a memo or piece of research public?


>Are there any industries, safety critical or otherwise, where the default is open source?

Cryptography.[0] When trustworthiness is paramount, as in crypto (and IMO in safety-critical applications like this), being able to inspect the code helps a lot.

[0] https://www.schneier.com/crypto-gram/archives/1999/0915.html...


FWIW, airplane or medical software is also closed.


Right, and you can see how well that's worked out: https://www.engadget.com/2017/04/21/pacemaker-security-is-te...


I would say that this argues for the opposite - making firmware closed source and as hard to reverse engineer as possible.

Let's say that the software is open source, and you can buy your own microcontroller and turn it into an ECU.

There will be people who make their own modifications to it to improve performance. Some of those modifications make the car fail an emissions test, so they make it so that the car acts like it's at stock settings during those tests.


What you describe already happens in the extremely-closed-source world of today. For most cars you can buy an ECU remap that can be switched from tuned to standard by the flick of a switch.

An ECU is not actually that complicated. It reads a few tens of inputs (temperatures, air mass flow, lambda sensors, crank position, etc.), and controls a few tens of actuators (throttle body, fuel injectors, spark plugs, maybe variable valve timing), with time resolution measured in tens of microseconds. An Arduino doesn't break a sweat - see the Speeduino project.
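To illustrate how conceptually simple that core loop is, here is a toy simulation in Python (this is not Speeduino code; the sensor names, fueling formula, and all constants are made up for illustration):

```python
# Toy ECU loop: read sensors, compute a fuel injector pulse width,
# return actuator commands. Constants are illustrative only.

def fuel_pulse_ms(air_mass_g, target_afr=14.7, injector_g_per_ms=0.02):
    """Pulse width needed to match the measured air charge at the
    target air-fuel ratio, given an injector flow rate."""
    fuel_g = air_mass_g / target_afr
    return fuel_g / injector_g_per_ms

def ecu_step(sensors):
    """One iteration: a handful of inputs in, actuator commands out."""
    pulse = fuel_pulse_ms(sensors["air_mass_g"])
    # Enrich the mixture when the engine is cold (open-loop running).
    if sensors["coolant_c"] < 60:
        pulse *= 1.2
    return {"injector_ms": pulse, "spark_advance_deg": 12}

commands = ecu_step({"air_mass_g": 0.5, "coolant_c": 90})
```

A real ECU layers hundreds of calibration tables and fault handling on top of this, but the read-compute-actuate skeleton is the same, which is the parent's point about microcontroller-class hardware being sufficient.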


Crap. There will be more programmers in the future who will write defeat code for fun while learning, or pursue it as a business. That's obviously not great for the environment.

Does anyone else feel our current government is not prepared to handle the changes pervasive software will create in the future?


If you think defeat code/devices are the worst of it, you've obviously not seen the people on YouTube "rolling coal"...


Haha, you're right that I am out of touch with American youth. Just this last year I realized I've aged and I'm only 35. I won't attempt to understand people who are staunchly anti-environment. I will continue to believe they are on the fringe until I'm convinced otherwise.

I do wonder though if our way of achieving civilization will require significant change in the near future, since regulating code and encrypted code is going to be damned near impossible.


> The researchers also say that it’s high-time regulators dispense with the kind of lab tests that US and EU governments have required for years. Instead, some kind of active scan for illegal code needs to be developed.

Alternatively, combine the lab tests with imprecise real-world tests as a sanity check, or keep the exact nature of the lab tests a secret and vary them over time. Really, the first of these seems like a good idea regardless of what else you do.


The real world sanity checks for cheating make sense. But the nature of the regular tests can't be secret. Carmakers need to know the bar they are supposed to hit. Building something on the speculation that it might pass, maybe, is a whole different business model.


"Build a car that doesn't emit more than X of exhaust Y when a typical driver drives it a) ten kilometers in the city, b) a hundred kilometers on the highway."

That seems like a good enough requirement to me. Define "a typical driver" as "out of a hundred random test drivers, no more than 20 exceed the thresholds, no more than 5 exceed the thresholds by more than a factor two." That forces the car manufacturers to have a sufficient safety margin in their emissions.

I don't know how difficult the actual measurement is, but maybe you could pay a couple thousand people a reasonable amount of money to have some devices attached to their cars for a month or two and collect data. Or make the car makers pay for the procedure.
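The statistical acceptance rule proposed above could be sketched as follows (thresholds taken from the comment itself; the sample readings and the mg/km unit are made up):

```python
# Acceptance rule from the proposal: out of N random test drivers,
# no more than 20% may exceed the limit, and no more than 5% may
# exceed it by more than a factor of two.

def fleet_passes(measured, limit):
    n = len(measured)
    over = sum(1 for m in measured if m > limit)
    far_over = sum(1 for m in measured if m > 2 * limit)
    return over <= 0.20 * n and far_over <= 0.05 * n

# 100 drivers, limit 80 mg/km: 15 mildly over, 3 far over -> passes.
assert fleet_passes([70] * 82 + [90] * 15 + [200] * 3, 80)

# Ten gross polluters out of 100 breach the 5% bound -> fails.
assert not fleet_passes([70] * 90 + [200] * 10, 80)
```

The attraction of a rule like this is that it is defined over real-world driving, so there is no fixed cycle to detect and defeat; the cost is that the manufacturer can no longer verify compliance deterministically before shipping.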


So this is going to happen once they've built the car + engine, and started selling it? Is 'car tester' going to become a side-hustle?

I agree that a degree of randomness is likely a good idea to avoid defeat devices, but one also has to consider that it could have two unintended consequences: 1) more expensive cars, due to more stringent QA procedures 2) relaxing of standards to ensure that companies can still practically make cars that conform to 'standards.'

As ever, it's important to consider that a layman's "seems reasonable to me" is another experts "that's not how things work."


"The average driver" is unlikely to change their driving style very quickly, so the tests would be reasonably reproducible from year to year.


Driving style has a profound effect on emissions. As does ambient temperature, uphill vs downhill, time at stoplight. Short trip vs longer (catalytic converters don't work till they are hot).

I'm sure they can find some way to address the cheating, but it's likely not the case that they are just missing some obvious simple solution.


Modern cars have enough sensors that they could store all necessary information about trips, have the service center read it out each time the car is serviced, anonymise, and send back to the manufacturer. That way the manufacturer would have accurate, real-world data about their 'average driver'. Could even make it country-specific.


These tests, though, are the ones you have to pass before you can sell that model of car, administered by government agencies. The agencies dictate the tests and measures.


Why anonymize it? Could be a new profit center.


Are you a software developer? Could you write the code for an application specified that loosely?

(At this point, every one of us is saying, yeah, I have. It just took a long time, a lot of change requests, and many round trips with a customer. Which is the point.)


Would it be feasible to have two tests? One public, and one where the details are hidden. The hidden test has a significantly lower pass bar, so that the only way you would fail it whilst passing the public test is if you are manipulating the outcome of the public test.
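The two-test idea could work roughly like this sketch (limits and units are made up; the key property is that the hidden test is deliberately more lenient, so an honest car that passes the strict public test should never fail it):

```python
# Two-test scheme: a public test at the official limit under known
# conditions, and a hidden test at a much more generous limit under
# secret conditions. Passing the strict test while failing the
# lenient one suggests the public test is being gamed.

PUBLIC_LIMIT = 80     # official limit, test conditions known to makers
HIDDEN_LIMIT = 160    # deliberately generous, conditions kept secret

def verdict(public_reading, hidden_reading):
    if public_reading <= PUBLIC_LIMIT and hidden_reading > HIDDEN_LIMIT:
        return "suspected defeat device"
    return "pass" if public_reading <= PUBLIC_LIMIT else "fail"

assert verdict(75, 90) == "pass"                      # honest car
assert verdict(75, 400) == "suspected defeat device"  # clean on-cycle only
```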


OK, but there are test details that are known to the manufacturers but seem irrelevant to the thing being tested. Like the bit about the angle of the steering wheel. Surely there are multiple ways one can test for the same standard?


The cheats are based around detecting when a car is strapped down to a dynamometer, indoors. That's why the cheaters are looking at ambient temperature, steering wheel angle, etc.
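A detection heuristic along those lines might look like the following sketch (entirely hypothetical; the real firmware reportedly matched the car's behavior against the official test cycle's time/distance profile, and none of these thresholds come from it):

```python
# Hypothetical dyno-detection heuristic: on a dynamometer the drive
# wheels spin while the steering wheel never moves and conditions
# are lab-like. No threshold here comes from real firmware.

def looks_like_dyno(wheel_speed_kmh, steering_angle_deg,
                    ambient_temp_c, elapsed_s):
    moving = wheel_speed_kmh > 10
    steering_untouched = abs(steering_angle_deg) < 1.0
    lab_conditions = 20 <= ambient_temp_c <= 30
    # Several minutes of "driving" with a perfectly still steering
    # wheel is nearly impossible on a real road.
    return moving and steering_untouched and lab_conditions and elapsed_s > 120

assert looks_like_dyno(50, 0.2, 24, 600)       # test-cycle-like run
assert not looks_like_dyno(50, 15.0, 24, 600)  # real driving: wheel moves
```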


Doing a sanity check helps but the best thing that can be done to avoid a repeat of this is to make sure there are sufficiently heavy fines and hopefully also jail time from this case. If the penalty is heavy enough the detection probability doesn't need to be as high.


> In 2015, regulators realized that diesel Volkswagens and Audis were emitting several times the legal limit of nitrogen oxides (NOx) during real-world driving tests.

This means that they can detect emissions levels during real world driving tests. What's wrong with just making those tests the actual regulatory ones? So whatever ingenuity automakers can use will be put to minimizing emissions in the exact same scenarios that will be used in real life.


> What's wrong with just making those tests the actual regulatory ones?

Nothing. But Germany has a strong position in the EU and to a large degree does what VW/BMW/Mercedes want and they wanted to avoid stronger or better tests. This topic is quite often in the news here and it's clear that there is no political will to establish real word tests.

There are also a lot of other cheats in this firmware:

- Below 14°C? Just blast the emissions out. It's not like these cars are driven in the winter.

- Autobahn? Go blast out the emissions!

- and so on...

This should be a far bigger scandal than it is now.


Extensive PDF from the Umweltbundesamt (comparable to the EPA - it's an independent agency) - http://www.umweltbundesamt.de/sites/default/files/medien/254...


> Below 14°C? [...] in the winter

It doesn't take winter for temperatures to drop below 14 °C (57.2 °F). The morning commute takes place between 6 and 9 AM. Even in the summer, temperatures are probably below 14 °C more often than above it at these times.


Indeed the Umweltbundesamt PDF (see my other comment) states that it's at least 50% of the time the case! 50%!


Are the tests that the hidden code is supposed to defeat performed only a few times on specific cars by regulators, or are they the regular smog checks that everyone has to pass each time they renew their registration [1]? If the latter, then all those smog check shops will have to get new mobile equipment (how much more does that cost?) and drive each car around town for a long time. [2]

This article [3] has a mention and photo of "A portable emissions measurement system at the Engine and Emissions Research Laboratory at West Virginia University."

"Mr. Carder and his team drew on their experience testing trucks when they got the contract to test cars in 2013. One challenge was to fit what amounts to a mobile laboratory in the car. At the time, the equipment available for such emissions testing had enough battery power only for short trips.

To make long hauls possible, the West Virginia University researchers bought portable gasoline generators at regular hardware stores and bolted them to the rear ends of the test cars. The generators made a terrible racket and frequently broke down because they were not designed to be bumped around."

[1] http://www.dmv.org/ca-california/smog-check.php

[2] https://www.youtube.com/watch?v=YZmCp7NocMA

[3] Researchers Who Exposed VW Gain Little Reward From Success: https://www.nytimes.com/2016/07/25/business/vw-wvu-diesel-vo...


One problem with that approach is that it would make testing for compliance much, much more expensive.

One thing would be all the equipment and manpower every test facility would need to invest in (as DonHopkins points out below) - but also, in order to gain initial approval, you couldn't just rely on one or even a few random drives - in order for the numbers to make any sense, you'd need a large sample size - lots of different drivers, driving the cars under different conditions - until you had enough data points to come up with a meaningful figure.


So in order to get a model on the market, you need to first get in on the market, right?


Can you make those real world tests repeatable without increasing the costs by several orders of magnitude?

Back up real world tests are a very good idea, but the legal limits need to be defined under standardized, laboratory conditions.


>What's wrong with just making those tests the actual regulatory ones?

It's easier for automakers to throw resources at lobbying.


Weird conclusion. This kind of activity should just be deemed illegal instead of treating it as an arms race between the manufacturers and regulators. Have 3rd party code auditors look through the code and flag any shenanigans. Instead of treating the code as an unregulated black box.


What puzzles me about all this is why the investigators did not compel the companies to produce the exact mechanisms by which they cheated emissions as part of the settlements. I guess it doesn't really matter, since what you're chiefly interested in is the manufacturer ceasing the behavior and not repeating the offense, so that might be why they didn't.


Why were they analyzing time/distance curves, steering wheel angles, temperature, etc., instead of just using an accelerometer to check whether the car is really moving?


I don't think there are any accessible to the software. They are used for airbags, but I think that's not on the CAN bus. If one were, it might be too dead a giveaway to poll it, since you already have speed via rotational sensors.


It smells more like a "smoking tailpipe".


An interesting twist to this would be to change liability. Make the driver liable (not the manufacturer) for driving a car breaking the emission standards. This will set in motion the following steps:

- Car users will be offered insurance against breaking emission standards

- Insurance companies will hire specialists to lower insurance rates of above insurance

- Consumers are disincentivized to modify their ECU in a non-compliant way

- Consumers are incentivized to buy cars offering more transparency in firmware (and car manufacturers will offer more transparency)

I'm a firm believer in: you buy it, then you're responsible for ascertaining its safety. If you can't do this, hire someone to do it for you.


> I'm a firm believer in: you buy it, then you're responsible for ascertaining its safety. If you can't do this, hire someone to do it for you

People elect government to take care of this for them. We can't all be experts in everything.

I firmly believe that when we work together on solving problems, we are better off.

Putting the onus on the individual is exactly what big businesses would like. It gives them even more of a free pass towards short term profits. Get caught doing something wrong? Close up shop and reopen another, earning profit while fooling the public again under another name. No thank you!!!


> I'm a firm believer in: you buy it, then you're responsible for ascertaining its safety. If you can't do this, hire someone to do it for you.

This seems like one of those plans that goes

1. Deregulate everything

2. ...

3. All irresponsible companies go out of business when intelligent consumers patronize responsible ones instead

Skipping over the "decades of death/disruption" in step 2, and being awfully optimistic about consumer intelligence in step 3.


I don't want to deregulate everything, especially not all security regulations on cars. What is needed is removing distance between regulators and users.

Would you rather buy a locked down laptop which is factory protected against viruses (with a known set of them) but no chance to modify nor understand the virus protection system. Or, choose for a regulating system that punishes those who help spread viruses because no scanner was installed?

The latter teaches the users more and keeps them closer to the product. It creates a market where more parties can enter and provide security advice.

Lastly, the 'stepping over death and destruction' is obviously a straw man. We have seen many deregulations that we in dire need and worked out well. It all depends on transparency and communication.


Problem is that the US Government guarantees that vehicles sold here have met minimum safety requirements. The public relies on the government to have done that legwork.

Beyond that you would immediately get into jargon and confusing packaging for insurance and vehicle emissions. I was just buying dental insurance and the 4 different plans provided by a single provider had 30+ different fields each of which was subtly different. How am I, as a standard consumer, supposed to make sense of all of that... Much less compare it with the 3 other networks. Then add onto that another factor (the type of car I drive) into the same bag and it becomes just absurd.

I don't have a solution to the cheating but I feel like mandating open source software is a good step.


Simple solutions: buy sensors to detect emissions or regularly check your car yourself. Spend money on experts, optionally working together with other owners of similar brand (community). Insist on buying models with open software.

The point is, regulation here is failing because it is not community led. We need to take matters in our own hands to understand the difficulties, simplify and source it back to the government, if needed.


Downvote why?


Who knows? Maybe it was by mistake. It really doesn't matter. Everyone gets them. Try to let it go and focus on making your next comment a good one :)


I feel like this would have the opposite effect.

No one will offer insurance against this liability at a reasonable cost.

Instead, car manufactures will offer a guarantee that the car will not break emission standards when they sell the car to the consumer.

This guarantee will be void the second you do anything the manufacture doesn't approve of, such as missing a scheduled servicing appointment, which of course must take place at the manufacture certified servicing center.

Car manufactures will take the opportunity to lock down the firmware even further and block 3rd party servicing centers.


- Create a market for road side emissions testing equipment. Your traffic stops just got a bit more thorough.

- Reduce emissions standards to levels that can be tested easily in nonstandard conditions.


Just what we need, more economic hocus pocus. People need to understand that buying toxic cars is wrong. No amount of tax or financial liability makes it OK. This is moral issue.



