While I certainly don't support the 'rogue engineer' excuse that VW originally gave everyone, the idea does bring up an interesting point.
An engineer did write this code. Almost certainly, many of them, working together. Now, their managers told them it was okay to do this, maybe that legal had given the go-ahead or that they had to do it to keep their jobs... But it's still pretty clear that this was not right. They still did it, and then didn't tell anyone. And that doesn't sit well with me.
In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.
The problem with this is that it places the consequences of defiance on the weakest link in the chain, all but guaranteeing failure. You can't even expect engineers to blow the whistle on their soon-to-be-former employer.
Ethics is about protecting the weak from the strong, not offering up the weak to get eaten by the strong. Only heroes put the good of the many over their own needs, and you can't expect everyone to be a hero.
Only within the safe, protective bosom of a strong professional organization can we place moral obligations on workers. But that will never happen while the idea of a guild / union for software engineers is still anathema.
> Only heroes put the good of the many over their own needs, and you can't expect everyone to be a hero.
> ...
> it will never happen while the idea of a guild / union for software engineers is still anathema.
Sounds like you're waiting for a union and that's the only solution.
In the interim, we can and should encourage people to be heroes. Let's face it, there are other programming jobs out there. Do you truly need to work for an unethical boss or company?
Not everyone will come up with the same answer; it depends on their individual situation. We should at least support potential whistleblowers as a community. That's a first step towards becoming a unified group.
I'm not waiting for anything. I'm simply recognizing that "encouraging" engineers to be heroes is the lazy way out and is not a real solution to anything. This is a political problem, it can only be met by an appropriately collective effort.
> Let's face it, there are other programming jobs out there.
OK, let's run with that for a moment. Where else would you have expected the VW whistleblowers to get jobs? Not every engineering job is a fungible web dev position. Those skills are useful for that one industry only, and that worker would have to uproot his family and move to get a less-desirable position.
No matter how you shake it, leaving a job is a major life decision for most people not in Silicon Valley, yes, even most engineers. And even in the fungible sectors like finance and the web, word gets around.
I have wondered if a distributed whistleblowing network à la WikiLeaks, but focused on putting public pressure on the most egregious offenders, would be a worthwhile thing to build. Essentially we'd run outreach campaigns in high-impact sectors that employ lots of software engineers and encourage them to spill the beans, with the tips funneled to a select group of news reporters and lawyers who are positioned to take action.
I have a feeling that's not quite the right approach, but I think with a bit of refinement we could come up with something better.
Maybe, maybe not. This is speculation, unless you've done a study to sample a statistically significant portion of the population. I agree, your speculation sounds reasonable. But neither of us knows, and both of us have certain biases that this speculation confirms - and therefore it should be mistrusted until confirmed.
Frankly, I think there is a bigger problem than job security itself. It's that the whistle will be ignored by the general population. Even if you are willing to sacrifice to do what is right, it may not matter because no-one wants to believe, or the spin is so good that no-one believes it anyway.
> I have wondered if a distributed whistleblowing network à la WikiLeaks, but focused on putting public pressure on the most egregious offenders, would be a worthwhile thing to build.
Perhaps WikiLeaks could be rebuilt. But I think we should try to learn something from v1.0, which became politicized very quickly. The takeaway, I think, is that you want someone who is very trusted within our community to run it: an entrepreneur with a good track record of honesty, and perhaps a law degree as well.
> In the interim, we can and should encourage people to be heroes.
In reality it's just not black and white like that. I expect a lot of engineers aren't sure if they should be blowing the whistle about a particular thing. And they have no one to turn to for advice, especially not with the NDAs they will typically sign.
So the position they'll find themselves in is this: blow the whistle on a morally gray issue that may or may not justify doing so, and guarantee that you will be fired while not at all guaranteeing that the morally gray project will stop. A probable outcome is that the engineer gets fired, the ethics issues don't get escalated, and someone else is hired to continue the project.
Everything is stacked against the engineer. Telling them to fall on their swords for a maybe outcome is neither realistic nor ethical in itself.
> In reality it's just not black and white like that.
I made it clear in my last paragraph that it isn't black and white. Everyone has their own situation. I can still support people who are deciding whether to come forward, before they actually do it.
If you're a bit older than a fresh graduate then you have much more to lose and nothing to gain. Mortgages, kids, and ageism when looking for a new job are not your allies.
So support - OK - but support as in how? A "like" on Facebook? That is not enough to make it work.
> In the interim, we can and should encourage people to be heroes. Let's face it, there are other programming jobs out there. Do you truly need to work for an unethical boss or company?
Assuming that the only consequence is needing a new job is naive. Whistleblowers find it very difficult to get hired again.
Sure, I think that's a great idea. Lots of ways to encourage.
Note there may be existing support groups for this such as the ACLU. I'm not exactly sure which non-profits employ a legal defense staff and cover tech issues.
Every person in the chain has their own ethical responsibilities, both management and engineers. If you knowingly participate in something like that, you share responsibility. I know that not participating has a cost, and sometimes it is too high. Plenty of times, though, it is not that high and you actually can get a different job. Take people who write spyware or the fishy parts of Uber's code (not everybody working there!) - they have a choice.
I guarantee you that there were some engineers who recognized the situation and left, or were let go sooner. It is not just magical heroes; normal people, men and women, make the "less cool company, less cool project, but it's fishy here so I'm leaving" decision all the time. Maybe we should praise and celebrate those decisions more instead of automatically considering the people who make them losers. As of now, neither engineers nor the valley really value doing the right thing, nor are they willing to discuss what the right thing is - they value winners no matter what path they took.
Whenever I've raised the issue with other developers they've been open to the idea, but to be perfectly honest I have no idea how you'd even go about starting something like that.
As with everything else hackers do, the way to learn would be to just do. If I were to do it, I'd start with a website and an outreach campaign. Sign people up to a mailing list, then create a membership structure and dues. Set some goals, and put governing procedures in place. Some endorsements from existing outfits like YCombinator itself and more traditional unions would help.
Eventually once you have enough money, you can create professional certification bodies. This is when you can really start ramping up your dues because then you'll be solving an actual industry problem. You could probably take a lot of money from big tech companies to help you out here once you're established.
Ford did that by paying his workers very well; imagine Foxconn paying their workers enough to buy iPhones! On the odder side, he had a team of moral police who would show up at your home unannounced to make sure you were living like a good American.
If I were an evil VP tasked with creating this loophole, I'd tell my project manager we needed a configurable system that let us control emissions for our internal testing environments, or some other plausible and legal requirement. I'd keep knowledge of the system obscure and communication between dev, testing, and deployment sparse. I'd then have only one person change the configuration before deployment. I'm sure it could be arranged so that no single engineer would realize that the evil configuration was released, and if they did, it could be excused as a mistake.
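To make it concrete, here's a toy sketch of how innocuous each piece could look in isolation. Every name and value below is invented; the point is only that nothing in the code itself mentions cheating, and the incriminating choice lives in one config value flipped at deployment:

    #include <stdio.h>

    /* A configurable operating profile, ostensibly for internal test rigs. */
    typedef struct {
        double egr_rate;       /* exhaust gas recirculation rate */
        double injection_adv;  /* injection timing advance, degrees */
    } OperatingProfile;

    /* Profiles ship as calibration data, not code. The developer writing
     * the loader has no idea which index production will use. */
    static const OperatingProfile profiles[] = {
        { 0.35, 2.0 },  /* profile 0: "test bench" tuning  */
        { 0.10, 6.5 },  /* profile 1: "performance" tuning */
    };

    static OperatingProfile load_profile(int configured_index) {
        return profiles[configured_index];  /* index comes from deployment config */
    }

    int main(void) {
        /* One person flips this value before release; everyone else only
         * ever sees it set to 0 in internal environments. */
        int deployment_config = 1;
        OperatingProfile p = load_profile(deployment_config);
        printf("EGR %.2f, advance %.1f deg\n", p.egr_rate, p.injection_adv);
        return 0;
    }

No individual function here would raise an eyebrow in code review; the evil is entirely in who sets `deployment_config`, and when.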
If I were an evil VP, I would give my engineers an incentive to do something without giving them a direct order. So you might tell them: if you can improve gas mileage by X%, you get a huge bonus, and I don't care how you do it.
This is how modern companies break the rules. Management doesn't break the rules, either overtly or covertly. Management scrupulously follows the rules, while placing requirements on their workers that can only be met by breaking the rules.
Want your workers to work more, but you don't want to pay overtime, or you run into trouble with regulations about consecutive hours on the job? Just bump up how much work they have to get done, threaten to fire low performers, and make it clear that under no circumstances is anyone allowed to work overtime. Your workers will start working off the clock, and better yet they'll hide it from you, so you can legitimately plead ignorance if the law comes after you.
Want to cut corners on safety to save money? Tell your people that safety is the top priority but you need to see an X% reduction in costs, and it works itself out. They may fudge or falsify metrics, but if you're really lucky they'll find loopholes in the metrics instead.
(I have a friend who worked at a warehouse and fell victim to this. They were officially big on safety, which included bonuses for everyone if they went a certain period of time without any safety incidents. Unofficially, this meant that incidents wouldn't be reported unless it was unavoidable. The one way to ensure that an incident had to be reported was to see a doctor for your injury, so people were heavily encouraged to wait to see if they got better on their own before they got medical attention, which often made things much worse. I'm sure upper management's metrics looked great, though.)
As an added bonus, this sort of thing gives you a lot more control over workers. If you want to get rid of a troublemaker, do a little digging and you'll surely discover that they're violating safety rules or working off the clock or whatever. Everybody is, but selective enforcement is a wonderful thing.
This is also how we end up with "crazy" regulations. Case in point: nuclear waste handling. Engineers create complex six-sigma safety plans, which are then slowly eroded because, hey, the regulations and safety protocols are wacky overkill, right?
The WIPP nuclear isolation site had run smoothly for 15 years, so management began to cut corners. Then three unlikely events all lined up and nearly got people killed. It started with a truck catching fire, which prompted operators to bypass the HVAC's filtration system. They stopped the bypass for a few days to perform maintenance on the only underground radiation detection unit. That unit gave a false alarm during testing, but was fixed and placed back into service, so they started ventilating again. Then a cask was breached at around midnight because someone upstream had used organic kitty litter instead of clay kitty litter. The operator assumed it was a false alarm, given the previous false positive, and kept things running. It wasn't until the next morning that they realized they were blowing radioactive particles above ground.
Had management kept up maintenance, enforced protocol, or done more than the absolute bare minimum (i.e., installed multiple underground radiation detection units), US taxpayers could have avoided paying $500 million. And this isn't a one-off thing; there are dozens of instances just like this where the US dodged a bullet.
The problem was substituting an unsuitable material because of mistakes when revising procedures and insufficient review of the revisions. The fact that the mistake involved an "organic" product is coincidental.
You have made some very strong points - thanks for pulling it all together.
I would like to ask: what do you think an ordinary employee should do when they see behavior like you describe from their management? How can they effectively protect themselves (and their colleagues) from this kind of treatment?
I've just observed this, not experienced it, so I'm not sure. It's far easier to see the problem than figure out a solution!
Much will depend on your job prospects and financial position. If you're a fancy programmer type who's constantly bugged by recruiters, move on until you find an ethical company. If you need this job to eat, you'll have to be a lot more careful.
In general, I'd say:
1. Point out the impossibility of the requirements to management. Gently if need be. It's possible they don't realize what they're doing.
2. Contact the local department of labor or whatever regulatory agency would be interested in what's going on. They may be able to take action if management is pushing violations in a quiet way like this. If not, they may be able to at least take action against the workplace if people have started breaking the rules.
3. If you can afford to risk the consequences, follow the rules as much as you can. Don't work off the clock, don't break safety rules, etc. If being fired will make you homeless then maybe this isn't an option.
4. Document everything. If regulators weren't interested originally, they may be interested once you can show a pattern. Upper management may be blissfully ignorant, and you may be able to get them involved once you can show them what's going on. Whatever happens, if things come to a head then it will probably be useful to be able to demonstrate that this wasn't your own doing.
The list of 4 items is excellent. Under no circumstances should you act in an insubordinate manner until you've exhausted other channels of communication. Acting before things get too far is the easiest remedy.
I'd recommend the following order of operations:
Convey concern over the associated risk to your immediate manager. Verbally first, during a meeting; switch to written (email, a paper trail of opposition) if no action is taken.
When documenting the paper trail, simply reference the meetings in which you voiced opposition. Your notepads should also be able to back up the talking points you're referencing.
Being asked to briefly switch hours or work late is often listed in your job description, so stopping suddenly at 5 p.m. can be considered insubordinate. Time should be compensated promptly, either as time off or as paid overtime. If your verbal requests go without action, again, switch to email.
Simply documenting events as they occur makes it easy when you need to go above your immediate supervisor (more senior manager, corporate HQ, department of labor) for help.
The key is to be polite in all interactions. Innocent mistakes happen; managers are under deadlines too. The paper trail should be maintained regardless of action or inaction.
This is why the financial regulators in the UK have changed to consider the corporate culture when dealing with breaches. A company whose senior staff "live a compliance culture" will be penalised less for the same breach than one operating as described above.
Now you can argue efficacy, being able to pull the wool over the eyes of regulators etc but I think it is a good direction to head in.
All that is way beyond the industry described here, with its institutionalised cheating on tests; it sounds more like the graphics card industry than one with regulators.
Just imagine how Facebook must have (initially) pushed the datr cookie on its engineers: "We only need to track everyone like this to check against DDoS attacks."
Still, even such an excuse should have raised alarm bells, but I assume most developers would just shrug their shoulders and develop the feature anyway, as they would've liked to keep their nice-paying job and juicy stock options.
In reality, Facebook only recently used the DDoS protection excuse for its datr cookie, well after it announced that the cookie would be used for advertising purposes, which also happened a few years after the cookie was introduced.
I imagine whatever Facebook told developers then was even less subtle than "using it for security purposes", and that most of the developers figured it out right then that the datr cookie would be one day used to track users across the web for advertising purposes.
* Govt research agency awards contract to study smallpox, including stockpiling smallpox. Researchers are happy, they're protecting the world.
* Govt weapons agency gets notice from govt research agency that "it's ready now."
* Govt weapons agency confiscates smallpox stockpile and data, makes more, spoons it into the tips of missiles.
The original researchers are more or less legitimate victims, whose goal of helping humanity was used to obscure the govt weapons agency's goal of killing humanity.
There's a great New Yorker article [1] which addresses this -- particularly that "pretty clear that this was not right" might not be that clear after all. Excerpt:
"...sociologist Diane Vaughan described a phenomenon inside engineering organizations that she called the “normalization of deviance.” In such cultures, she argued, there can be a tendency to slowly and progressively create rationales that justify ever-riskier behaviors... If the same pattern proves to have played out at Volkswagen, then the scandal may well have begun with a few lines of engine-tuning software. Perhaps it started with tweaks that optimized some aspect of diesel performance and then evolved over time: detect this, change that, optimize something else. At every step, the software changes might have seemed to be a slight “improvement” on what came before, but at no one step would it necessarily have felt like a vast, emissions-fixing conspiracy by Volkswagen engineers, or been identified by Volkswagen executives. Instead, it would have slowly and insidiously led to the development of the defeat device and its inclusion in cars that were sold to consumers."
"But assuming all of this is true, how could it have persisted for so long, and without more people stepping forward and speaking out?... Volkswagen was heavily invested, both financially and culturally, in producing a clean-diesel engine. That the company was failing to meet the standard required by American emissions tests would have been embarrassing and frustrating to its German engineers. Some may have seen those tests as arbitrary, and felt justified in “tuning” the engine software to perform differently during them—even as it now looks, to the outside world, like an obvious scandal."
Similar to graphics card driver developers facilitating "cheating" on benchmarks. The major vendors get caught cheating all the time. Back in the nineties, it was as simple and naive as detecting whether the running program was a benchmark or a real game: if a game was running, use the normal driver path; if a benchmark was running, switch over to a completely different code path that rendered at shit quality but did it very quickly and made your rendering-speed benchmark number look great. I worked at a graphics company as a 3D driver developer at the time and was asked to help implement such a system, but politely declined. No problem, I just got assigned to a different part of the driver. There were plenty of other developers who had no ethical problem with cheating a benchmark.
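For the curious, the mechanism really was about this dumb. A rough sketch of the nineties-era trick, with all executable names and identifiers invented for illustration:

    #include <stdio.h>
    #include <string.h>

    typedef enum { PATH_NORMAL, PATH_FAST_LOW_QUALITY } RenderPath;

    /* Known benchmark executables to watch for (illustrative list only). */
    static const char *benchmarks[] = { "3dbench.exe", "benchmark.exe" };

    static RenderPath select_render_path(const char *exe_name) {
        for (size_t i = 0; i < sizeof benchmarks / sizeof benchmarks[0]; i++) {
            if (strcmp(exe_name, benchmarks[i]) == 0)
                return PATH_FAST_LOW_QUALITY;  /* cheat: fast but ugly */
        }
        return PATH_NORMAL;                    /* games get the honest path */
    }

    int main(void) {
        printf("quake.exe   -> %d\n", select_render_path("quake.exe"));
        printf("3dbench.exe -> %d\n", select_render_path("3dbench.exe"));
        return 0;
    }

The real drivers hooked this into their internal dispatch rather than a string compare at startup, but the moral content of the code was exactly this.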
> In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.
And, IMO, yours is absolutely the correct one.
As engineers, we are in a unique position to understand the intricacies of, and risks associated with, the functional requirements we satisfy. True, we are under marching orders from a PM who may (or may not) have provided adequate backstory to understand the goals of the company, but it's our duty to express any concern about the risk a requirement may present to the company or to society.
When verbal warnings don't work, go ahead and put it in writing.
In the US, part of my engineering curriculum included courses on engineering ethics case studies to drive this point home.
I've written exactly 2 of these letters in my 10-year professional career. Both times, the PM sharply changed course, demonstrating that they knew the request failed the ethical litmus test. Putting written responsibility on them to act encouraged further discussion and ultimately a better resolution for all.
You ignore the fact that 99% of the cheat probably has, or had, a good reason to be there for one purpose or another. The actual responsibility is probably multi-layered, with the majority of contributors unknowingly participating in something fishy.
I remember working on a system that allowed a financial firm to change an account value outside of all the accounting routines, plus a way to transfer money from a shared account to any arbitrary external account. Sounds fishy, but the explanation was reasonable: this was the way to model dividends. The actual dividend money came into one account, the payment/reinvestment from another, with a reconciliation done later on. The reconciliation was done by that department's operational floor from data coming from other departments, with systems upstream and downstream. I may have unknowingly created the final piece of a massive international fraud system, but I have no idea, and all the people involved had a valid business reason and nothing to gain.
Because that's the point: if I had known and had been willing to actually participate in fraud, I would have wanted a cut, and a serious one, since I would effectively have been providing a gun, bought with my real ID from a regular shop, knowing you would use it for a bank robbery.
The problem is that nobody wrote a specific defeat device. You have to understand, Bosch has one basic (extremely complex) piece of software that goes in pretty much every single engine controller. There are some specific functions that are developed at the request of specific customers, but the basics are always the same.
When this general piece of software is set up to control a new engine from an OEM, a multi-year process of calibration starts. The engine starts off failing the emissions tests drastically, and over time the calibration engineers tweak tens of thousands of parameters to try to get the emissions to be legally compliant.
This is not an easy process, and the calibration engineers have a target: they have to get the vehicle certified for a certain emissions class, which in turn is determined by the FTP-75 driving cycle. The news keeps talking about "defeat devices" because the law says they are not allowed, but reality is much messier. Nobody is sitting there thinking about how to make crazy obfuscated defeat devices; rather, a bunch of people are trying to figure out how to pass the extremely demanding emissions tests without any defeat devices.
They didn't write a defeat device to "fake" emissions, they wrote legitimate code aiming to make the car truly compliant with the requirements.
Then someone made this compliant code not the default and, except when the car is under testing, loaded an alternate mapping of some key parameters that is tuned more for performance and mileage, at the expense of emissions. This is the criminal part, not the first.
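To make the distinction concrete, here's a minimal sketch; all names and values are invented, and in reality each "map" holds thousands of parameters rather than two. The compliant calibration exists and genuinely works; the crime is only in which map is active when:

    #include <stdbool.h>
    #include <stdio.h>

    /* Thousands of parameters in reality; two stand in for them here. */
    typedef struct {
        double egr_rate;       /* exhaust gas recirculation rate */
        double rail_pressure;  /* fuel rail pressure, bar */
    } ParamMap;

    static const ParamMap compliant_map   = { 0.35, 1600.0 }; /* passes the test cycle */
    static const ParamMap performance_map = { 0.10, 1800.0 }; /* better mileage/power  */

    /* However the test cycle is detected, the criminal part reduces
     * to this one selection. */
    static const ParamMap *active_map(bool test_cycle_detected) {
        return test_cycle_detected ? &compliant_map : &performance_map;
    }

    int main(void) {
        const ParamMap *m = active_map(false);  /* ordinary road driving */
        printf("EGR %.2f, rail %.0f bar\n", m->egr_rate, m->rail_pressure);
        return 0;
    }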
I don't doubt it's this big mess you are describing, but if the engine is actively looking for test/not test and does things like turning off particle filtering after 30 minutes of driving because it knows any test will be over, it seems to me the intention is in fact to lie about the real emissions.
The thing is that detecting test situations is good, certainly useful, perhaps necessary.
E.g., on detecting that you are on a test stand, you might:
* deactivate plausibility checks which would normally limit engine power
* deactivate air bags and other inherently dangerous systems
* ensure correct system behaviour even if sensors give conflicting information
* suppress error logs for certain failed plausibility checks
The only illegal thing is switching to a "massaged" parameter map under these conditions. And presumably a single person controlled this switch.
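The detection itself needn't be elaborate, either. Here is a rough sketch of the kind of heuristic that has been publicly reported (steering wheel position, speed, and duration of operation); the thresholds and names below are invented:

    #include <math.h>
    #include <stdbool.h>
    #include <stdio.h>

    /* On a dyno the drive wheels spin while the steering wheel never
     * moves, a combination that essentially never occurs on a real road. */
    static bool test_stand_detected(double speed_kmh, double steering_deg,
                                    double elapsed_s) {
        bool wheels_turning  = speed_kmh > 20.0;
        bool steering_frozen = fabs(steering_deg) < 1.0;
        bool sustained       = elapsed_s > 120.0;  /* longer than a straightaway */
        return wheels_turning && steering_frozen && sustained;
    }

    int main(void) {
        printf("road: %d\n", test_stand_detected(90.0, 12.0, 300.0)); /* 0 */
        printf("dyno: %d\n", test_stand_detected(90.0,  0.2, 300.0)); /* 1 */
        return 0;
    }

And note that each input here is something the ECU legitimately needs anyway, which is why no single signal looks suspicious.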
But somebody had to write code that recognizes that an emissions test is being performed, no? I mean that sounds like it would be more involved than just loading some different set of parameters.
While I agree with you, it's a real easy thing to say and a much harder thing to actually do. Having been in a similar situation myself (not quite as drastic—no lives were in danger and no laws were broken), we (the engineers) argued vehemently with the CEO (there may have been yelling involved), but eventually capitulated with a "ok, but this is on your head" attitude.
Maybe if it were more important I would have fought harder. But I honestly don't know.
It appears that the firmware was written by Bosch and not VW themselves. I can't believe that only VW engineers were involved in writing the spec that was sent to Bosch and that only one of those engineers understood that "acoustic" really meant "emissions".
I would imagine that Robert Bosch GmbH and co. are quietly building a war chest of cash right now. They appear to have been caught writing deliberately naughty code to order.
The firmware is ultra-generic but highly configurable. The defeat device isn't in the firmware itself.
Many different engine types will use the exact same firmware; the manufacturer supplies a huge array of configuration values to make the ECU run the engine correctly.
All the firmware knows is that under some manufacturer-configurable condition, the engine will switch from one manufacturer-configurable mode of operation to another.
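A sketch of that generic pattern, with the structure invented for illustration: the firmware vendor ships only the mechanism, while the manufacturer's calibration data supplies both the trigger and the two modes, so the firmware side looks completely innocent:

    #include <stdio.h>

    typedef struct {
        double threshold;      /* trigger level for some sensor input */
        int    mode_if_true;   /* index into a calibration mode table */
        int    mode_if_false;
    } ModeSwitchConfig;

    /* Firmware-side logic: fully generic; it has no idea what the
     * modes or the trigger "mean". */
    static int select_mode(const ModeSwitchConfig *cfg, double sensor_value) {
        return (sensor_value < cfg->threshold) ? cfg->mode_if_true
                                               : cfg->mode_if_false;
    }

    int main(void) {
        /* The meaning of this switch depends entirely on the data supplied. */
        ModeSwitchConfig cfg = { 1.0, 0, 1 };
        printf("mode: %d\n", select_mode(&cfg, 0.2));
        return 0;
    }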
> their managers told them it was okay to do this, maybe that legal had given the go-ahead or that they had to do it to keep their jobs
I prefer to think about it this way: management gets what it wants; you can always find someone to carry it out. I think we should be very careful about moving to blame engineers first.
While engineers may have an obligation to public safety in other fields, this is, strangely, not the case when writing software that ends up in automobiles.
If we had a good way to blow the whistle without ending up in jail, we would see less unethical code being written and fewer shit storms from greedy companies.
How do you know engineers were involved in writing the software?
If there were engineers involved, they should have said something if it endangered the public. After all, it is their professional responsibility to uphold the safety of the public.
Did it actually endanger the public? A silly question, perhaps, but what was the actual, measurable consequence of this cheat? Did air quality become measurably worse during the period? Was anyone actually and measurably endangered?
An older car on the road puts off more emissions than the modified VWs, yet older cars are still allowed on the roads.
Laws should be followed, so I am not excusing the behavior, but making it sound like the safety of the public was materially compromised is a bit of a stretch.
It is true that people who are employed are obligated not to do unethical or criminal acts on the say so of their employer, but the employer is also obligated not to command their employees do unethical or criminal acts. And the employer is the one who holds the power in the relationship.
This is basically the independent-contractor-on-the-Death-Star issue. It boils down to the fact that they are morally responsible because they do the work and take the money fully aware of the end result of their work.
You are correct. "Just following orders," is sometimes an understandable action (in the sense that I understand why someone might not want to refuse an order and risk their job), but that doesn't make it an excusable one.
It depends how drastic. You are talking about people risking their careers. We all have our own moral standards to live by. Maybe they justified it by saying all the other car companies are doing the same thing (and, by the way, there are some indications they did).
> In my opinion (and I welcome disagreement and debate) engineers have an obligation to say 'no' to writing unethical code.
I respectfully disagree. Engineers have an obligation to do whatever is asked of them by management. If that means building nuke-hand-grenades so be it.
Being an engineer (be it software or mechanical) should be a morally neutral thing. Do the job and keep your morals out of it. This is the only way we can have some semblance of sanity in the field.
I really, really, really strongly disagree with this. As the ones who are most knowledgeable about the systems they're building, engineers definitely have a moral responsibility to make sure they're at the very least following applicable laws and regulations. I would argue that we have a moral responsibility to act ethically even in cases not covered by laws or regulations, but I'll admit that's a bit more controversial.
The problem with "moral obligations" is a person's moral axioms are fundamentally arbitrary. Would you still want engineers to be guided by their moral convictions when you fundamentally disagree with them yourself?
Inasmuch as I want everyone to be guided by their moral convictions, yes. I mean, even if I don't agree with them, I'd much rather an engineer's decisions be guided by 'what they feel is right' rather than guided by nothing at all, which seems to be what parent was implying.
Also, this is why we have engineering codes of ethics, at least in Canada and the US (and while I'm not familiar with elsewhere in the world, I would assume similar things hold in most first-world countries). We don't necessarily have to agree on everything, but there is a baseline for what we consider ethical, and engineers are expected to uphold that baseline, otherwise they are not permitted to practice engineering. Unfortunately the line between 'engineers' and other practitioners isn't as well-defined for software engineering as it is for most engineering fields - but that doesn't mean we should ignore it completely.
This is stupid. There is no such thing as working without morals, and you aren't going to have 'sanity' by asking engineers to adopt someone else's (also arbitrary) morals rather than their own.
It's imho not so clear-cut whether this code is unethical or not. There is a legitimate argument that increasing NOx to reduce CO2 is better in the long term. Remember that the two are inversely related: the hotter and leaner your engine runs, the more NOx you produce, but your fuel efficiency also goes up.
In the long term, making their cars cause accidents that killed many people would reduce the environmental impact of humanity, so would also be arguably ethical by that logic.