Cognition versus Programming: The Big Two – Why TDD? (danielbmarkham.com)
28 points by todsacerdoti on March 18, 2022 | 45 comments



"We humans suck at programming. We suck at reasoning. Any honest look at a sufficiently large and older piece of code will show you that our coding is atrocious in anything but the super small."

These are the opening sentences, and I think they are completely false. If anything, any sufficiently complicated program that we actively develop and refactor contains millions of lines of code. The statement is just absurd. (Not to mention that those programs were mostly NOT developed using TDD.)

"You can't say that something represents something else unless you have agreed-upon executable tests that validate that representation. Otherwise it's just words."

Facepalm. Ever heard of a thing called the "compiler"?

Sorry, I was not able to read any more of this article.

If you want to read something meaningful instead, look at James Coplien's article "Why Most Unit Testing Is Waste" and its follow-up, "Segue". They're a few years old, but maybe there are some who have not yet heard of them.


Didn't know about this article, thanks!

"Why Most Unit Testing is Waste" https://rbcs-us.com/documents/Why-Most-Unit-Testing-is-Waste...


> Facepalm. Ever heard of a thing called the "compiler"?

If you'd read the very next paragraph you'd have encountered:

> In TDD, we're always looking for the smallest possible thing we can test, starting with zero, i.e., "does it compile".


Life's too short to read bad articles with unsubstantiated claims in their opening sentences.


To be honest I didn't bother reading the article. I used to think TDD was a good idea; I don't anymore. To paraphrase the book A Philosophy of Software Design, TDD leads to tactical programming instead of strategic design.

The premise of doing the simplest thing to make a test go green feels a bit ridiculous to me.


TDD intentionally encourages a bottom-up programming style where you allow the design to emerge instead of consciously designing up-front. The argument is that this results in better solutions, which in my experience is true - https://wiki.c2.com/?WhatIsAnAdvancer kind of describes the feeling.


Yes, I think this is true. There have to be some big-picture strategic design decisions, like "do we use a database or not" and "is this a client/server application or not", but below that very high level, up-front design quickly deteriorates into making the big decisions when you have the least information, which leads to decisions that one is probably going to regret.


No. TDD is as top-down as it can be.

« Test-driven development (TDD) is a software development process relying on software requirements being converted to test cases before software is fully developed, and tracking all software development by repeatedly testing the software against all test cases. This is as opposed to software being developed first and test cases created later. » [1]

[1] https://en.m.wikipedia.org/wiki/Test-driven_development


Then what do you call it where you write programs bottom-up but still write test cases before implementation? Because that's the most common approach that I've seen being done and called TDD.


I call it wasting time trying to fit square pegs in round holes. Many things and approaches have their virtues and their place, be it top-down, bottom-up or tests. Silver bullet techniques and dogma have absolutely none.


Not my experience, shrug.


That's when you adopt other XP practices in addition to TDD.


Perhaps it can be both bottom-up and top-down depending on your mindset.


Then it’s not test « driven ».


Yes. TDD is the micro version of Agile. It works because getting adequately comprehensive requirements out of a client, or for a business process, is incredibly hard. The first version will nearly always be significantly wrong. Everyone then shouts "do better", but that's not within your control.

So both Agile and TDD evolve a solution from "in this case, the software needs to do this". You take the software to the client/users, find out the next layer of requirements when they say it's not quite right, and adapt it.

(Every now and again someone tries to make a sort of requirements document that's also an executable test, directly mapping the given-when-then of user stories to arrange-act-assert. It doesn't work, for the usual 5GL reasons, but that doesn't stop people trying.)
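
For concreteness, the mapping people attempt looks roughly like this; a minimal sketch in Python, where the Account class, the amounts and the test name are invented purely for illustration:

    # Hypothetical sketch: a user story's given-when-then mapped
    # one-to-one onto a test's arrange-act-assert structure.
    class Account:
        def __init__(self, balance=0):
            self.balance = balance

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    def test_withdrawal_reduces_balance():
        # Given an account with a balance of 100
        account = Account(balance=100)   # arrange
        # When the user withdraws 30
        account.withdraw(30)             # act
        # Then the balance is 70
        assert account.balance == 70     # assert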


> TDD intentionally encourages a bottom-up programming style where you allow the design to emerge instead of consciously designing up-front.

In Haskell I do TDD top-down frequently.


How do you TDD before you have anything running end-to-end? Do you do the first chunk of development with only one test? Do you recurse down so you have several layers of failing tests at the start? I don't like either of those options much personally.


You are skipping a step there - it's red -> green -> refactor.

As for "tactical programming instead of strategic design" - that's entirely up to the developer and how they approach the problem. If you are prone to neglecting the design, you'll do it anyway. If anything, (actually) practicing TDD would force you to come up with a better design.
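
As a hedged illustration of that cycle, here is one pass in Python; the slugify function and its behaviour are made up for the example:

    # Hypothetical sketch of one red -> green -> refactor pass.

    # RED: write a small failing test first.
    def test_slugify_replaces_spaces_with_dashes():
        assert slugify("hello world") == "hello-world"

    # GREEN: the simplest thing that makes the test pass.
    def slugify(text):
        return text.replace(" ", "-")

    # REFACTOR: later tests (lowercasing, punctuation, unicode) drive the
    # implementation further, while the existing tests keep the behaviour
    # pinned down as the internals change.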


How does TDD force you to a better design?


By forcing you to think about your code's interface before thinking about its implementation. That means you get larger components that fit together based on the problem requirements, rather than components with interfaces that are only locally optimal.

You can see this same thing in action when watching functional programming live coding sessions: most live coding is bottom-up, but they still write the function signatures before writing any code.
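
A rough sketch of what "signature first, body later" can look like; this uses Python type hints as a stand-in for the function signatures mentioned above, and the names and data are invented:

    # Hypothetical sketch: commit to the interface before the implementation.
    from typing import Iterable

    def top_customers(orders: Iterable[dict], n: int) -> list[str]:
        """Return the names of the n customers with the highest total spend."""
        raise NotImplementedError  # deliberately empty: the first run is red

    def test_top_customers_orders_by_total_spend():
        orders = [
            {"customer": "ada", "amount": 50},
            {"customer": "bob", "amount": 120},
            {"customer": "ada", "amount": 80},
        ]
        # The signature and this test pin down the contract; the body
        # is written afterwards to satisfy them.
        assert top_customers(orders, n=1) == ["ada"]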


This strikes me as tautological. If you're using anything statically typed you have to write the function signature before its body.


Static typing is not synonymous with manifest typing (always requiring an explicit type declaration). Many functional languages can infer the type of an expression from its definition, so explicit type signatures are optional. These languages are still statically typed, though.


It doesn't. It signals a particular point in time to improve the design.

Good design is a creative activity and cannot really be forced, only given space to grow.


A nice clarification that I've heard from Dave Farley is that TDD doesn't necessarily imply not designing your system or architecture ahead of time. You still do that; it's just that with a more traditional approach, you take that design/architecture and begin implementing it assuming that, while you may end up making some changes, it's basically correct. With TDD, you begin implementing your design/architecture just the same, except that from the very beginning you assume that your naive design is probably wrong and leave yourself open to making those large corrections as you encounter the need. He likens it to the scientific method, where you start with a hypothesis and then design experiments that would disprove your hypothesis. In my experience, when you approach TDD this way, it remains extremely strategic.


> With TDD, you begin implementing your design/architecture just the same, except that from the very beginning you assume that your naive design is probably wrong and leave yourself open to making those large corrections as you encounter the need.

I think the assumption that you can test your way out of bad design is false. Design is about assumptions you make, and you cannot test these assumptions, because your tests have to make the same assumptions.

In other words, the problem with the scientific analogy is that in science, the model and the world are never going to be the same thing. But in TDD, the model (specification) and code are ideally the same thing. You should make sure they are the same thing and try NOT to deviate from that in the first place.

What you can do, however, when refactoring, is compare the old implementation (a non-efficient one, for instance) with the new implementation. That is possible, but I rarely see it in TDD approaches (the modern testing focus is on one instance of the code only).


> I think the assumption that you can test your way out of bad design is false. Design is about assumptions you make, and you cannot test these assumptions, because your tests have to make the same assumptions.

I think that's right. But what TDD buys you is that, when the design changes (and it should, if it's a bad design), and you have to modify your component/class/module/whatever, you can run the tests and see if you broke anything. You can then, for each failing test, think through whether the change should have broken that (and therefore the test is now wrong), or whether your change actually broke something that had been working.

This can make it far easier to make changes safely, and lets you make them with far more courage. It means you're not stuck at "we really should change that, but it might break something, so it's not worth it."


> But what TDD buys you is that, when the design changes (and it should, if it's a bad design), and you have to modify your component/class/module/whatever, you can run the tests and see if you broke anything.

That's not exclusive to TDD; it is true for any automated testing, and not only that. Any formalizing of additional assumptions into the code (as I write elsewhere, things like type checks, assertions, property tests) will also surface potential issues when the code changes.

My point is, there are other techniques to solve this problem, more straightforward than TDD.


I agree with the book's criticism of TDD if TDD really is the _driver_. It's not; you are.

But if you’re doing a lightweight version where you do actually design, or explore a design, and use tests to simply run your code in isolation, document usage variations and failure modes, then it’s a very useful and pragmatic tool.

This doesn’t necessarily mean you should hack your code into small bits and all that. Example tests and a REPL are just a nice combination of tools for interactive programming.


Can you talk a little bit more about tactical vs strategic development?

What do you find ridiculous about doing the simplest thing to make a test go green first?


Not OP, but I find TDD ridiculous as well.

Thinking strategically is understanding the trade-offs of your code and coming up with good abstractions that take these chosen trade-offs into account. You want to express the business problem in a language that is close to the domain, and then write a translation (and sometimes the translator) to the language closer to the metal. Coming up with the correct language in which to express it (the language can be as simple as recognizing that your problem is a general one to which you can apply a known algorithm) is precisely the strategic thinking.

Applied top-down, it also means exhaustively understanding the states of the application and the major data structures (for example, the database schema) and designing the functions around that. It can also be applied bottom-up, by building reusable tools and building blocks which are independent of the global state, and then connecting them as simply as possible (this is also sometimes called hexagonal design). Both top-down and bottom-up have pros and cons.

Instead of too much unit testing or TDD, I am much more in favor of building better abstractions that make humans less prone to errors (these can even be DSLs), building assertions into your code that warn you when something is amiss (and can also be used in property tests), and always testing against something (i.e. test code that doesn't share the same assumptions as the program it is supposed to test, which is a big issue I have with unit testing). The best parts of the code are the ones you don't have to test: if the compiler (or your abstraction) can guarantee an assumption about your code, then you don't need to worry about it being wrong in the first place.
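
As a hedged sketch of that "always testing against something" idea in Python, using the Hypothesis library for a property test; both sort functions here are invented for illustration:

    # Hypothetical sketch: check an implementation against an independent
    # reference instead of against its own assumptions.
    from hypothesis import given, strategies as st

    def reference_sort(xs):
        # Deliberately naive but obviously correct: repeated minimum extraction.
        xs = list(xs)
        out = []
        while xs:
            m = min(xs)
            xs.remove(m)
            out.append(m)
        return out

    def fast_sort(xs):
        # The implementation we actually care about.
        return sorted(xs)

    @given(st.lists(st.integers()))
    def test_fast_sort_matches_reference(xs):
        assert fast_sort(xs) == reference_sort(xs)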

And in particular, the problem with "x+y in C++ can launch nuclear missiles" is not syntax, it is a weak type system (ideally you would use something like Haskell, which controls side effects). Again, using checks in a stronger language is a strategic, not a tactical, decision.


I share their opinion. I understand "Do the simplest thing" to mean "ignore any additional insights you may have".

If I know that a class will need to be accessed from multiple threads, I am not allowed to use locks, concurrent data structures or lock-free patterns until I add the first test that checks for multiple thread access. When that happens, I need to throw away almost all of the previous implementation. The same happens for any special data structure or algorithm that I know in advance will be necessary to solve the high-level problem, but it takes tens of tests to reach a point where the "simplest solution" no longer works. As an exercise, consider implementing a Union-Find data structure while constraining yourself to never writing a line of code that is not the simplest thing to solve the current test.

My experience has been that it's better in those cases to construct a unit testing roadmap so that:

- you incrementally build the correct solution

- you throw away as little code as possible

- throughout the process, you minimize the amount of code not covered by a test
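
For instance, a hedged sketch in Python of how such a roadmap might start for Union-Find; the test order and the naive implementation are made up for illustration:

    # Hypothetical sketch: the first few tests of a Union-Find roadmap,
    # ordered so that each one forces a real piece of the final design.

    def test_new_elements_are_their_own_representatives():
        uf = UnionFind()
        assert uf.find("a") == "a"

    def test_union_makes_two_elements_share_a_representative():
        uf = UnionFind()
        uf.union("a", "b")
        assert uf.find("a") == uf.find("b")

    def test_union_is_transitive_across_chains():
        uf = UnionFind()
        uf.union("a", "b")
        uf.union("b", "c")
        assert uf.find("a") == uf.find("c")

    class UnionFind:
        """Naive dict-based version: enough to pass the tests above.
        Later tests on large inputs would motivate path compression and
        union by rank, rather than forbidding them up front."""

        def __init__(self):
            self.parent = {}

        def find(self, x):
            self.parent.setdefault(x, x)
            while self.parent[x] != x:
                x = self.parent[x]
            return x

        def union(self, x, y):
            self.parent[self.find(x)] = self.find(y)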


> If I know that a class will need to be accessed from multiple threads, I am not allowed to use locks, concurrent data structures or lock-free patterns until I add the first test that checks for multiple thread access. When that happens, I need to throw away almost all of the previous implementation. The same happens for any special data structure or algorithm that I know in advance will be necessary to solve the high-level problem, but it takes tens of tests to reach a point where the "simplest solution" no longer works. As an exercise, consider implementing a Union-Find data structure while constraining yourself to never writing a line of code that is not the simplest thing to solve the current test.

Who's imposing that constraint on you? Why? Can't you negotiate your way out of that?


TDD. That's the point of TDD and that's what people are arguing against. If you relax this constraint, you get something different.


I share the same sentiment. TDD is one of those things that look really nice on paper for trivial toy code like Cat extends Animal, but quickly become impractical and fall apart against real-world problems.


I think the article is interesting, but it glosses over the very real cost of working like this.

As someone with a career in safety-critical software (aerospace, automotive), it just doesn't make sense to me that anyone else would _want_ to be doing TDD or other similar methodologies. It's _so_ much more effort to do tests before code.

Sure, if you write some bog standard piece of code you've written a million times before it's basically a wash. But if you're writing something novel that you have to iterate on, you eat at best a 2x constant factor on all the work you do*.

The above is especially true if you're doing exploratory work. Which, of course, you don't always know ahead of time is where you'll end up. Often you learn something new - about architectural constraints, about an API, about performance, about the hardware, or most often about the true implications of the requirements/specification - and then your entire initial plan goes out the window.

This all said, obviously if you don't have tests your product is broken. But if nobody dies when it breaks, just write the code and then add the tests with the biggest bang for your buck. Spend the time you saved becoming a better programmer, as ultimately that will catch more bugs than anything else per hour spent.

---

*Frankly, I think the truth is actually that the larger the feature (bug fix, change, work packet, what have you), the more the slow-down factor grows.


>It's _so_ much more effort to do tests before code.

It is?

For me, tests are (or should be) a programmatic representation of the spec. I usually find it's easier to write the spec before the tests and the tests before the code.

I worry about tests written after code, actually, because they have a greater tendency to be aligned to the behavior of the code rather than the spec.


> it just doesn't make sense to me that anyone else would _want_ to be doing TDD or other similar methodologies. It's _so_ much more effort to do tests before code.

This statement only makes sense if you can provide figures comparing the costs of defects with the overhead of (A)TDD/BDD.

I'm not disagreeing, I'd just be genuinely interested in the true cost of forgoing TDD/BDD versus TDD/BDD. Unfortunately this would require rigorous documentation of the product lifecycle including number of defects, time to fix, losses due to defects, etc. I'm not sure there's a lot of reliable data available, though.


True, that's a good point and I pretty much agree with you entirely. I didn't really make a distinction between argument and anecdote. Basically I think it's being glossed over that there is a blatant up-front cost to testing. Write the tests first and you guarantee that you eat this cost. An argument can then be had as to whether the upsides are worth it or not.

I would love to see data on this, but to my knowledge there is none. (And it's unclear how to even design such an experiment/study to get meaningful results.) The statement I made that it's a lot of effort to do testing before code is of course pure anecdote. But that goes both ways - so are the claims in the original article as far as I can tell.


I am still at an early stage in my career. Anecdotally, I've noticed that the most vocal proponents of TDD are older men who come from Ruby, Python, or vanilla JS backgrounds. All the examples in the article are from liberally typed languages as well. To what extent do you think this approach and these languages correlate with one another?

I'm not saying rigorous typing eliminates the need to test things entirely, but for the fully functional Scala, Haskell, and Typescript I'm tasked to write, TDD seems obnoxiously myopic. I mean, I'm sure it's obnoxiously myopic in unsafe languages as well, but at least its benefits are more obvious there.


(I see you're still in the early stages of your career so I'm assuming life, too)

> most vocal proponents of TDD are older men

Being older means that you've been part of this rodeo longer than other people. Sometimes that means those older people wield useful wisdom unavailable to the not-so-old. Sometimes it means those older people have wisdom that is no longer relevant to the present moment.

As a not-so-old person, it's your job to listen and make up your own mind about your elders. But when these older people were your age, they too looked at their elders with the same caution, and they made their journey to where they are today, handing down to you the wisdom that they believe is valuable. Now, maybe times have changed and their wisdom has fallen from relevancy - that is for you to decide - but, from experience, you're probably going to reinvent the wheel a lot in your life. And you'll pass your wisdom down to those following you, who might end up reinventing it too.

There's a reason the wheel is the invention to avoid reinventing :)

And being men has nothing to do with the price of bread ;)


Yes - to me, types are a form of test, but a potentially exhaustive one. If you can prove a property with types you don't need to do it with tests. Conversely, if you don't have types, you need quite a lot of tests to prove your code doesn't immediately fall over if presented with badly formed data structures.
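
A small hedged illustration of that in Python with type hints, checked by an external tool such as mypy; the Money and Currency types are invented for the example:

    # Hypothetical sketch: encode an invariant in the types so a whole
    # class of "what if someone passes garbage" tests becomes unnecessary.
    from dataclasses import dataclass
    from enum import Enum

    class Currency(Enum):
        EUR = "EUR"
        USD = "USD"

    @dataclass(frozen=True)
    class Money:
        amount_cents: int
        currency: Currency

    def add(a: Money, b: Money) -> Money:
        # The type checker already guarantees both arguments are Money,
        # so only the genuinely interesting case (mixed currencies) is
        # left to check at runtime.
        if a.currency is not b.currency:
            raise ValueError("cannot add different currencies")
        return Money(a.amount_cents + b.amount_cents, a.currency)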


For me it was BDD rather than TDD that helped more; add to that linters and such, and it made many early development hurdles easier.

The other thing that made it easier was convention in naming (the "don't know what the function returns" part mentioned in the text).

BDD helped because it allowed me to change the way I was solving a problem or building a feature. Where before I would focus on fixing or creating a smaller thing and then moving to the next, nowadays I mock and scaffold until I can see the direction the code actually takes rather than the direction I think it takes.

The tests I write I treat as utterly disposable, keeping only those that are tied closely enough to the business logic (and the lack of it) that they represent a living, important part of the documentation.

Only then, as parts solidify, might I opt to write unit tests, but I approach all of those as BDD tests on a different scale/scope.

So, as with everything, doing too much of something will end up creating more problems than not. But it takes time and effort to find that balance, and I haven't even begun to scratch the surface of it.


To me (not a trained programmer, but someone who has picked up a lot of stuff since 1997) programming nearly always feels like route finding.

If I just need a quick solution like yesterday (I needed to extract url parameters and concatenate them in a specific order) I think how I would approach it, try my ideas, refine them and look online for ideas when I get stuck.

When tackling bigger things it feels more like route finding with additional waypoints I need to hit in a specific order. These are the moments where I consider TDD. I can, so to speak, define the waypoints for myself in the form of tests and do my route finding until my test(s) pass. Afterwards I can smooth out the route, finding more efficient, performant and beautiful ways, with the tests as a tool to show me when I break things.


IMVHO it's not a matter of methodologies; it's a matter of knowing what we are good at and valuing that part correctly.

Trying the modern way of robotizing humans, like stereotypical Ford-model workers in chain production, is the least effective way to innovate. It's good for mass production in mechanical industry, mass production of already designed and tested finished products, nothing more. That's why modern crapware is crapware, even when it's mostly "well done".

We humans are not made to reason and act methodically and slavishly like machines. We can do that, of course, but the outcome is not much good; we are far better at free, schema-less thinking. The hard part is plugging such thinking into a timetable, but it was done for centuries in the past, so I'm pretty sure we can do it today. Programming, IMVHO, must not have methods: no TDD, no SCRUM, no $you_name_it. Programming should be thinking, like writing a novel.

Doing so means we can't have closed-source programs; we can't even have programs as a product. We must come back to the classic model where the product is hardware plus an open environment for user programming. That means putting all pre-digested knowledge on the table in the form of functions and letting the user combine them easily in their environment. That can be done, as it was done in the past, in a multi-level design where anyone skilled in a specific aspect develops that aspect and offers the results as functions.

Doing so obviously means hard-to-port and bug-ridden software at first; then a slow refining process is needed. That means things grow more slowly than today, in exchange for far faster general progress. Patterns emerge, much as a school system emerged with human knowledge divided into subjects and, to a certain extent, specific paths depending on where a person wants or needs to go. Patterns change as the world changes. It's an ever-evolving working system where the human is at the center, not the process. The only defect? There is not much room for today's giants, no business for them; however, judging from the past, there is plenty of room for business, just a distributed and variegated kind, with many small and medium-sized competitors in a free market...


Tests are just accessibility.

I don't need them to write correct programs, but I also don't need bigger fonts or higher contrast to read.

All these things just make life easier.



