> You bring up the practice of software engineering, but I think you need to place it in context of how software engineering was practiced around the time that the electronic throttle system was developed. MISRA, which HN readers are becoming aware of, didn't exist.
DO-178B was created in 1989; Ada was created in the '70s; standards for mission- and safety-critical systems have been around since the '70s. I was reading about Ada and safety-critical programming as a kid in the '80s (I was a weird kid).
Whether you accept that multiple independent experts in safety-critical programming are right or not is up to you; however, the stuff that came out of the trial painted a very clear picture to me of just how bad this software is.
A cynical person might wonder: if Toyota is this poor, how many other manufacturers are just as bad?
EDIT: I'm not saying that the UA was caused by this; without the source code and some proof I can't say that. However, the general setting is pretty terrifying.
> DO-178B was created in 1989; Ada was created in the '70s; standards for mission- and safety-critical systems have been around since the '70s. I was reading about Ada and safety-critical programming as a kid in the '80s (I was a weird kid).
I'll hazard a guess that the HN readership consists of many formerly weird kids. :)
DO-178B was around then, but I'm not aware of any automotive groups using it then or now—I'm not in the automotive sector, so I could easily be mistaken. I would argue that IEC 61508 would have been the best model to follow in lieu of something more specific (especially since ISO 26262 is an adaptation of it), but I don't think even 61508 existed at the time outside of a draft. Ada was around, but it was a different beast than it is now and the tooling wasn't very good at the time (my opinion). I can't think of many that were using Ada outside of the government mandate.
For general consumer products (i.e., outside of aviation and military), I'm having trouble thinking of industry standards for mission- and safety-critical electronic systems that existed at the time that Toyota's electronic throttle system was developed. Maybe SAE had something at the time? A cursory Google search didn't pull up anything.
> Whether you accept that multiple independent experts in safety-critical programming are right or not is up to you; however, the stuff that came out of the trial painted a very clear picture to me of just how bad this software is.
To clarify, I'm not saying that the software isn't poorly written by modern standards, but I am questioning that it was uniquely poor relative to the rest of the industry at the time it was developed. That said, Toyota was/is arrogant, and ignoring their own processes and requirements at the time is unquestionably the wrong thing to do. I remain skeptical that the software component was responsible for the unintended acceleration issues.
> A cynical person might wonder: if Toyota is this poor, how many other manufacturers are just as bad?
Isn't the null hypothesis that they were all equally bad?
I think the relevant standard for comparison, from a safety point of view, is with mechanical throttle linkages that were used before throttle-by-wire. And it's pretty obvious that the mechanical linkages are less reliable: they involve a long cable snaking around the engine bay. A sticky throttle used to be a common complaint. With these systems, it's not. And of course, it's the multi-channel redundant brakes that are the critical backup safety system here.
BTW, I read through Barr's attempted reengineering of Toyota's ECU (his slides are linked in the comments). It's just making me angry. After going on and on about how bad Toyota's code is (spaghetti, blah blah blah), he starts presenting his failure modes: suppose that a random hardware memory error flips some bits in the CPU's task table. Then the task monitoring the pedal angle is going to die. Now to get actual unintended acceleration as described (when the driver is pressing the brake), you also have to suppose that the throttle position variable is corrupted at the same time. Other than the general unlikelihood of this, consider: suppose that Toyota's code did not contain any "spaghetti" or global variables. Suppose in fact that it was beautiful enough to make angels weep tears of joy. Would that make the slightest fucking difference, pardon my Japanese, when you start flipping bits in the task table? Of course fucking not.
His complaint amounts to amateur backseat engineering: you protected variables A and B from corruption by having multiple copies, so why not C? Your watchdog will restart tasks X and Y when they die, so why not Z? And so on. Which is an OK suggestion for the future, but how are they liable for not making an already extremely safe system slightly safer, when it's much safer than previous systems and has a fantastically reliable backup?
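For readers who haven't seen the pattern Barr is alluding to: a common embedded-C mitigation is to store each safety-critical variable alongside its bitwise complement and check the pair on every read, so a random bit flip is detected instead of silently consumed. A minimal sketch with hypothetical names (this is not Toyota's code):

    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical mirrored-storage pattern: the value is kept next to its
     * bitwise complement, so corruption of either copy makes the pair
     * inconsistent and detectable. */
    typedef struct {
        uint16_t value;
        uint16_t inverted;           /* always ~value when healthy */
    } mirrored_u16;

    static void mirrored_write(mirrored_u16 *m, uint16_t v)
    {
        m->value    = v;
        m->inverted = (uint16_t)~v;
    }

    /* Returns false if corruption is detected; the caller then falls back
     * to a safe default such as closing the throttle. */
    static bool mirrored_read(const mirrored_u16 *m, uint16_t *out)
    {
        if ((uint16_t)(m->value ^ m->inverted) != 0xFFFFu) {
            return false;            /* bit flip detected */
        }
        *out = m->value;
        return true;
    }

The watchdog half of the complaint is the same idea applied to tasks: each critical task periodically bumps a heartbeat counter, and a monitor forces a reset or a limp-home mode if any counter stops moving.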
Among other things, what I found interesting was that Barr was giving expert testimony on engineering and "engineering certainty", when as far as I can tell he doesn't have a PE license (if he does, he's the first engineer I've ever heard of that doesn't conspicuously advertise it). I was under the impression that wasn't permitted in any state.
edit: Toyota did plenty wrong. Chiefly, I'd say that ignoring their own documented processes should be at the top. Having a system utilization of >70% would be another, as is using recursion. All are things that, at the time, were no-nos. I don't, however, think that one can fairly argue that it's practical or sensible to implement emerging standards during the multi-year car development process, which many seem to be arguing. I also remain skeptical that the unintended acceleration events are software-related. It seems that for one to buy the bit flip argument from the trial, one also has to assume that the drivers depleted the service brake vacuum, and that combination doesn't seem probable to me.
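For context on the 70% figure: the usual justification is the Liu and Layland bound for rate-monotonic scheduling, which guarantees n periodic tasks are schedulable as long as total CPU utilization stays below n*(2^(1/n) - 1); the bound falls toward ln 2, roughly 69.3%, as n grows, which is where the "keep utilization under ~70%" rule of thumb comes from. A quick, purely illustrative check of the numbers:

    #include <math.h>
    #include <stdio.h>

    /* Liu-Layland bound: n periodic tasks are guaranteed schedulable under
     * rate-monotonic scheduling if utilization U <= n * (2^(1/n) - 1). */
    int main(void)
    {
        for (int n = 1; n <= 5; ++n) {
            double bound = n * (pow(2.0, 1.0 / n) - 1.0);
            printf("n = %d  bound = %.1f%%\n", n, 100.0 * bound);
        }
        printf("limit as n grows: %.1f%%\n", 100.0 * log(2.0));
        return 0;
    }

This prints 100.0%, 82.8%, 78.0%, 75.7%, 74.3%, and a limit of 69.3%, which is why exceeding ~70% is treated as eating into the guaranteed-schedulability margin.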
http://users.ece.cmu.edu/~koopman/pubs/koopman14_toyota_ua_s... has a very good overview.