If you are diligent about documenting AND updating, sure, seems very reasonable. But most people aren't. You know what the best documentation is? Working tests; it's much harder to have wrong tests than wrong documentation. Even some of the largest projects, like Ruby on Rails, have had incorrect internal code docs.
No. Tests are not documentation. Tests are tests, written in code, which must be explained. You should document your tests, too, because I guarantee the next programmer won't understand your tests as well as you (think you) do. Also: the "next programmer" will be you in a year.
This is maybe a close #2 on the list of documentation falsehoods that programmers believe.
Yes. Tests _can_ very much be documentation if you write them as such, although they aren't complete documentation. Tests are the specification, i.e. the how; documentation provides the why.
Tests can be written to tell a story, but most people aren't taught this way (or simply don't buy it, or don't write tests at all).
No, tests are not a specification; they are examples, specifically of the form "do this, and this happens".
Inferring a specification from this is like solving those "what is the next number in this sequence: 1 18 31" puzzles. It simply can't be done; you can guess something, but you will never know if it's the real answer.
They _can_ be examples, but I prefer to write them as specs. I mean, there are entire libraries centred around writing tests as specs. Just because you don't do it that way, or don't buy into the idea, doesn't mean it's not possible or valuable.
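To make the distinction concrete, here is a minimal sketch of the spec style in Python: each test name states one rule of the behaviour, so reading the names top to bottom reads like a specification. The `parse_duration` function and the test names are invented for illustration, not taken from any particular library.

```python
def parse_duration(text: str) -> int:
    """Parse strings like '2h', '30m', '45s' into seconds. Hypothetical example."""
    units = {"h": 3600, "m": 60, "s": 1}
    value, unit = text[:-1], text[-1]
    if unit not in units:
        raise ValueError(f"unknown unit: {unit!r}")
    return int(value) * units[unit]


class DescribeParseDuration:
    """Spec-style grouping: the class names the unit under test,
    and each method names one rule of its behaviour."""

    def it_converts_hours_to_seconds(self):
        assert parse_duration("2h") == 7200

    def it_converts_minutes_to_seconds(self):
        assert parse_duration("30m") == 1800

    def it_rejects_unknown_units(self):
        # An unknown unit is an error, not a silent default.
        try:
            parse_duration("5x")
            assert False, "expected ValueError"
        except ValueError:
            pass
```

Libraries like RSpec (Ruby) or pytest-describe (Python) give this style first-class syntax, but even plain test names can carry the spec if written deliberately.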
Specs are complete. If you create real-world software with complete tests, I will really want to read your Turing Award acceptance speech. It should be great.
You are right about commenting tests, but it seems wrong to categorically reject their value as documentation for users. I use tests all the time to get code examples and understand what's actually supported. One of the great things about open source projects is that you can see the tests.
> You should document your tests, too, because I guarantee the next programmer won't understand your tests as well as you (think you) do. Also: the "next programmer" will be you in a year.
I hear this a lot, but I haven't found the same with myself. I can read and understand old code I wrote.
I attribute it to my IQ being mostly held up by reading and writing comprehension skills. The way I process information makes it easier for me to remember and understand old code (I believe), compared to a lot of programmers whose inherent skills align more closely with things like math and logic.
Documentation can be a rabbit hole for me, and I often don't feel I benefit from it within the code.
I appreciate this is a good tactic for many people, but implying it's for everyone is too dogmatic.
I wouldn't describe it as a "falsehood that programmers believe", just one of those perhaps unfortunate realities for some codebases that have a decent set of unit tests but little else in the way of up-to-date documentation explaining all the edge-case behaviour, etc.
The only documentation you can really trust is that which hundreds of others rely on regularly, but a significant percentage of the code most of us work with isn't going to meet that criterion.
Explaining edge case behavior is one use-case for comments, and not the most valuable one in my estimation. Aside from that, often incorrect edge case handling is in both the code and the unit test because the problem is that the developer didn't understand the requirements. In my experience, in an undocumented and difficult codebase the tests will be as mysterious or unreliable as the code itself, which makes sense since they are usually written by the same people.
You'd be surprised. There are generations of programmers now who think nothing of writing 20+ unit tests with quite clear names demonstrating what the behaviour should be under a variety of conditions, but with virtually no other documentation. This is especially true in dev shops that have high coverage requirements for a successful CI build.
Tests are often a pretty good way to learn how a given piece of code behaves. Particularly, if the author is being nice and provides a "usage example" type test.
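A "usage example" test in the sense above might look like the sketch below: rather than probing one edge case, it walks through a typical end-to-end use of the API so a newcomer can copy it as a starting point. `EventBus` and the test name are hypothetical, invented purely for illustration.

```python
class EventBus:
    """A tiny hypothetical publish/subscribe API to demonstrate the idea."""

    def __init__(self):
        self._handlers = {}

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        for handler in self._handlers.get(topic, []):
            handler(payload)


def test_usage_example():
    # The test doubles as a quick-start: create a bus, subscribe a
    # handler, publish an event, observe the handler being called.
    bus = EventBus()
    received = []
    bus.subscribe("user.created", received.append)
    bus.publish("user.created", {"id": 1})
    assert received == [{"id": 1}]
```

Because it exercises the happy path end to end, this one test tells a reader more about intended usage than a dozen narrow edge-case tests would.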
Tests verify what the code is doing. And I agree: they are a great source of insight when understanding an unfamiliar codebase.
However, comments are typically better at answering the why.
> If you are diligent about documenting AND updating
I don't understand why this is viewed as challenging. Writing a sentence or two here and there is orders of magnitude easier than writing the code itself. And any code review process should help to prevent situations where the code and comments are out of sync.
Lastly, I feel like there's a larger human issue. I write comments to explain certain why's in the code because I care about my teammates and I care about the project.
If others don't do the same, I think it speaks to a lack of care for their fellow engineers and the work itself. I think, "I just spent five hours figuring out something that you could have explained with a single 30-second comment?"
I'm baffled that some engineers think that this is okay.
> You know what the best documentation is? Working tests
Tests do have a slight overlap with documentation, but it's only that: slight.
If a piece of code has some weird non-obvious behavior, the presence of a test for that particular behavior is a signal that it's actually intentional, not a random bug.
But it doesn't tell me anything about why that design choice was made. That's what the documentation is for. So, facing such code, I sure hope it's well documented.
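The two roles can be seen side by side in a small sketch: the test proves the surprising behaviour is intentional, while a comment supplies the why that the test alone can't. `normalize_price` and the stated business reason are hypothetical, made up for illustration.

```python
def normalize_price(cents: int) -> int:
    # Negative prices are clamped to zero rather than raising, because
    # (hypothetically) upstream refunds arrive as negative line items
    # and must not crash the pricing pipeline. <- this is the "why"
    return max(cents, 0)


def test_negative_prices_clamp_to_zero():
    # The test is the signal: this odd-looking behaviour is deliberate,
    # not a bug. But without the comment above, a reader only learns
    # THAT it's intentional, never the reason behind it.
    assert normalize_price(-500) == 0
    assert normalize_price(250) == 250
```

The test pins down the what; only the comment (or external docs) can carry the why.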
Show me a set of tests that I can use to build something more easily than I could with the documentation. Otherwise it sounds like just kicking the can down the road to the next person.