The article is interesting, but I think it glosses over the very real cost of working like this.
As someone with a career in safety-critical software (aerospace, automotive), it just doesn't make sense to me that anyone else would _want_ to be doing TDD or other similar methodologies. It's _so_ much more effort to do tests before code.
Sure, if you write some bog-standard piece of code you've written a million times before, it's basically a wash. But if you're writing something novel that you have to iterate on, you eat at best a 2x constant factor on all the work you do*.
The above is especially true if you're doing exploratory work. And of course, you don't always know ahead of time that that's where you'll end up. Often you learn something new - about architectural constraints, about an API, about performance, about the hardware, or most often about the true implications of the requirements/specification - and then your entire initial plan goes out the window.
This all said, obviously if you don't have tests your product is broken. But if nobody dies when it breaks, just write the code and then add the tests with the biggest bang for your buck. Spend the time you saved becoming a better programmer, as ultimately that will catch more bugs than anything else per hour spent.
---
*Frankly, I think the truth is that the larger the feature (bug fix, change, work packet, what have you), the more the slow-down factor grows.
>It's _so_ much more effort to do tests before code.
It is?
For me, tests are (or should be) a programmatic representation of the spec. I usually find it's easier to write the spec before the tests and the tests before the code.
I worry about tests written after code, actually, coz they have a greater tendency to be aligned to the behavior of the code rather than the spec.
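To make the spec-vs-behavior distinction concrete, here's a toy sketch (the spec wording, function name, and threshold are all hypothetical, purely for illustration):

```python
# Hypothetical spec: "Orders totalling 100.00 or more get a 10% discount."
# Amounts are in integer cents to avoid floating-point comparison issues.

def discounted_total(cents: int) -> int:
    # Implementation under test (illustrative only).
    if cents >= 10_000:
        return cents * 90 // 100
    return cents

# Spec-first tests: derived directly from the requirement, including the
# boundary case, and writable before any implementation exists.
assert discounted_total(10_000) == 9_000   # exactly at threshold: discounted
assert discounted_total(9_999) == 9_999    # just below threshold: no discount

# Code-first hazard: had the implementation mistakenly used `> 10_000`,
# a test written afterwards by observing its output would happily assert
# discounted_total(10_000) == 10_000 - passing, yet contradicting the spec.
```

The point is that the spec-first tests pin down the boundary the requirement actually names, whereas tests reverse-engineered from the code can encode its bugs as expected behavior.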
> it just doesn't make sense to me that anyone else would _want_ to be doing TDD or other similar methodologies. It's _so_ much more effort to do tests before code.
This statement only makes sense if you can provide figures comparing the costs of defects with the overhead of (A)TDD/BDD.
I'm not disagreeing, I'd just be genuinely interested in the true cost of forgoing TDD/BDD versus TDD/BDD. Unfortunately this would require rigorous documentation of the product lifecycle including number of defects, time to fix, losses due to defects, etc. I'm not sure there's a lot of reliable data available, though.
True, that's a good point, and I pretty much agree with you entirely. I didn't really make a distinction between argument and anecdote. Basically, I think the blatant up-front cost of testing is being glossed over. Write the tests first and you guarantee that you eat this cost. An argument can then be had as to whether the upsides are worth it or not.
I would love to see data on this, but to my knowledge there is none. (And it's unclear how to even design such an experiment/study to get meaningful results.) The statement I made that it's a lot of effort to do testing before code is of course pure anecdote. But that goes both ways - so are the claims in the original article as far as I can tell.