In the late 90s, I worked in two different Lisp companies (one of which went on to become quite successful). They were, in many ways, far more modern than the average C++ shop of the era: They accepted that programming could be exploratory, that unit tests were a great thing, and that you wouldn't know what you wanted until after you'd already built half of it.
And yet…
I remember programmers who disappeared for months at a time to work on "their" modules. Integration phases where we tried to combine wildly incompatible components after 6 months of separate work. The breathless joy of seeing an actual regression test suite for the first time. The endless arguments trying to find the ideal design. And we all thought this was the "right way" to develop software!
I remember Beck's Extreme Programming. For me, it came as a bolt from the sky, obviously radical and brilliant. It turned me from a clueless junior programmer into somebody who took over and turned around a 150,000 line, 14-year-old C++ project, a project which everybody thought had succumbed to terminal bit rot.
Unit tests. Short iterations. The planning game, a.k.a., "Here, let me take your list of features, and tell you what each one costs. Now you decide what order we build them in." Man, does it bring back memories.
Of course, that was pretty much the shining high point. Within just a few years, Beck's idiosyncratic personal vision had been replaced by the "agile manifesto", and then by an entire consulting industry. Good riddance to all that.
But you know, it's been a long time since I heard somebody say "integration phase" or "object-oriented analysis." And I couldn't be happier.
I had a similar epiphany when I read Beck's book around 2000 or so. I don't think the Agile community has added much if anything to the work Beck originally laid out, either. In fact the term "Scrummerfall" is now in common usage here in Minneapolis.
Well, that was somewhat predictable. You've got a lot of managers. Some of them don't like to code, and regardless, all of them certainly need to be employed. What would they do if their shop started doing XP? One or two can become PMs or coaches, but what happens if you have many more managers than that?
At the risk of sounding somewhat loony, I think Marx said something about the middle class (in his definition: the class between the proletariat engaging in production, and the ruling class of capitalists) trying to preserve the benefits it gets from its social role. Can't find a citation though.
> The middle classes maintain themselves to an ever increasing extent directly out of revenue, they are a burden weighing heavily on the working base and increase the social security and power of the upper ten thousand.
It's funny that there is a quote, recently made famous by Civ 4, stating "the bureaucracy is expanding to meet the expanding needs of the bureaucracy," though "management" would be a much better fit here than "bureaucracy" IMO.
Also, in practice civil services are much more efficient in many situations than their market alternatives (possibly because they pretty much epitomise the proletariat you mentioned).
In terms of customer interaction it's obviously a terrible experience, even when you do meet helpful people. But have you actually gone through a process where the civil service was more expensive than the market alternative?
By and large, the empirical data does not support the public perception of government (and remember there are many layers of "government") being hugely wasteful, especially when looking at money wasted on administration compared to the market.
Also, the government is under a market pressure of a sort: can I "sell" this budget to the people who have to vote for me as a politician?
EDIT: Of course, I'm not saying that civil service is good by definition - just that, when done right, there are places where it's a much better solution than a free-market one. Also, corruption can be an issue, of course - as I understand it, one of the big reasons Greece is in trouble is its corrupt civil service, which also caused it to bloat.
> State run liquor stores are more expensive than private.
You mean the liquor stores in countries where liquor is extra heavily taxed in an attempt to curb alcoholism? Yeah, no. In fact, that is exactly what I mean: you are confusing cost-to-the-consumer with cost-to-run.
Amen. I just looked at the employee section of a previous employer's website and saw that they actually have as many managers as developers. Seriously 1 manager for every developer.
Some of the criticisms are still valid: there's a definite "gossip magazine diet plan" vibe to the book, even if the clean cut from waterfall or iterative process horrors (Rational Unified Process anyone?) was sorely needed.
The criticisms about no metrics supporting the book's assertions are valid too (we have more metrics now, though).
That said, a lot of things in this book are now considered good engineering practices, and the methodology guys have come back with a vengeance with the Agile/Scrum/Lean/Whatever waves that filled in the blanks left by XP (and ensured a nice revenue stream for pure process consultants).
In four words you managed to perfectly encapsulate the problem I have with so much writing about development processes and tools.
It's not good enough to claim to be just incrementally better than what you're probably doing, you need to be the best most super awesome EXTREME thing ever!
I've seen a lot of cowboy coders get projects done. (I've probably been one for a good deal of my career.) Can you say that with a straight face about adherents of more rigorous development methodologies?
The real problem with cowboy coding is getting to version 2 (or maybe 1.1). But it should be stated that at least if you've got a successful version 1 you have most of a solid design to start with.
Interestingly, most development methodologies seem to be mainly concerned with getting to version 1 and not with producing a maintainable product or getting a successful but unmaintainable version 1 and evolving it into a maintainable version 3 (say).
The obvious exception is the Software Process Maturity Model or whatever the frack it's called today, which is enormously concerned with maintainability to the point of making initial development (or indeed almost anything outside of pure engineering environments) virtually impossible.
Who needs a maintainable version 3 these days? If your product hasn't been acquired and shut down by that time, you've failed. :-P
In all seriousness, yes, different methodologies are certainly good for different organizations and different projects, and you're right that cowboy coding is effective at getting things done. I've definitely seen organizations that weren't well-suited to traditional methodologies get mired in documentation and fail to get anything out of alpha. Whatever approach you choose, just don't follow it straight off a cliff.
> iterative process horrors (Rational Unified Process anyone?)
I lived in that world on a DoD program for 3 years. I still have nightmares.
We also had a customer who didn't want to "pay for the same code twice". That meant signal processing algorithm research had to be done in the operational real-time code, and that we could have one and only one codebase (including branches but not including engineer/scientist working copies). The "experts" from Mitre and elsewhere who were managing us were so clueless they didn't realize this was costing them more and adding more risk than allowing the scientists to do their algorithm development in Matlab and Fortran, then have the engineers turn the algorithms into an operational system. Of course, we still had to have requirements and design documentation up-front before we could implement code, so...
Wise policy. I keep getting tempted by that dark world, though. The subject matter I was working on was awesome (real-time radar signal processing) and I had a great 60/40 balance of EE/CS knowledge focus. I can't find any jobs like that out in the real world, though. Everyone out there seems to want either a CS expert or an EE expert, not a hybrid.
It happens outside government too. It happens anytime your higher-level boss tells you: "Yes, I'm on board! You're right, a system like that would help us tremendously in several ways! Please estimate the costs for it so I can get us the money in our budget meeting next month." Happens all the time in enterprise businesses and the newer methodologies are extremely difficult to use in this sort of situation. In this case, I need some specs up front, create dev estimates, all to ultimately deliver a rough dollar amount or sometimes a "not to exceed" number for my boss...
Waterfall is only mostly dead. I'm told that the reason why the public sector prefers price-and-scope-up-front is that the alternative of "we'll keep delivering things as long as you keep paying us" sounds to them far too much like ways that they have been ripped off badly in the past when buying tanks, bridges, etc.
That is absolutely the reason. On top of that, the only reason they want to use iterations (along with the dreaded inchstone) is so they can use Microsoft Project as a progress bar. If you try to change your iteration deliverables (that were set three years ago) because of actual project issues, you get a ration of shit from the government and their advisors.
Not only are there projects where waterfall methodologies were applied, but they are sometimes (gasp!) appropriate.
In the majority of cases, iterative development of some sort is necessary, because the requirements keep changing. Waterfall simply doesn't allow for changing requirements, so it's hopeless in those cases.
But where requirements are stable, waterfall is a reasonable approach. Remember, it had its origins in the big iron days, when vast project teams analysed and replaced existing manual systems. For those projects, waterfall worked.
If you're writing the code for a space probe, or a missile, there's a limited opportunity to iterate once the item is deployed... requirements are stable, waterfall can be a rational choice.
Don't know about the formalization by Royce, but I've witnessed and suffered waterfall in practice - let's call it ad-hoc waterfall if you will - and it was no urban legend.
And on moonless nights I still wake up screaming remembering a RUP project I was involved in around 2000.
The guilt and shame is so spot on. Living through such projects illustrated so well the "the beatings will continue until morale improves" school of project management.
This school has its agile proponents too, though. But compared to the common practices of the penultimate decade, XP and Agile at least acknowledged that failure was the default mode of software development projects, and made contingency plans for it.
I was part of a development organization that, driven by guilt and shame, tried to mature and improve their "ad-hoc" processes by fully embracing by-the-book waterfall.
To me it felt like they were saying, "You know all the things you did to make your last project a huge success: the short iterations, tight feedback loops, early and frequent customer involvement, designing your systems to accommodate change? Yeah, don't ever do any of that again."
So I left, and found that dogmatic by-the-book scrum shops can also be horrible places to work. Live and learn.
Absolutely. If the waterfall isn't working, you need to waterfall harder. Because more time spent up front writing paper architectures and guesswork requirements is what's needed!
oh, don't get me wrong, i've worked in that environment, but it's just a natural, naive way of doing things - dignifying it with a name seems to be something that happened later
Some companies did use it. But even at the time (before Beck's book), it was known in the programming literature to be a dumb idea. XP didn't introduce the idea that Waterfall was nonsense; it was just the catalyst for its removal.
my favorite one so far. from my POV today, if I didn't know this was a one-star review I might mistake it for a positive review :-) (except maybe for the obvious "worst practices" comment at the end)
-----------------
This book advocates extreme coding. It's a disguised way to legitimize "dive-into-coding" with no proper analysis and design. This author advocates little if any comments, documentation etc. He actually states that "design is in the code" and documents get out of date and that's an excuse not to do it. How is that in comparison to the "traceability" principle advocated so strongly by Jacobson and Rumbaugh who believe that the final code should be traced back to the analysis documents (use cases in Jacobson book) He advocates programming in pair instead of code reviews!
In short, this book gathers the industry "worst practices" exactly the opposite of OMT, OOSE, RUP etc.
> This author advocates little if any comments, documentation etc.
I'm very much of the opinion that good code self documents, and excessive use of comments and documentation is a crutch for bad code.
Some optimised code might be really incomprehensible without comments, but that's the exception rather than the rule. By far my biggest use of comments is to highlight inadequate or rushed code that should be revisited later (i.e. TODO: or HACK: rather than NOTE:).
Don't get me wrong - design and documentation are useful in practice as well as being the 'theoretical ideal' that I think should be striven towards. Diving in and making a mess to fix stuff, or diverging from an original design, is bad in theory - in practice it is a necessary evil with tight deadlines, or when rescuing a project that is already in a seriously bad state, or when the original design was naive or even impossible because it came from an inexperienced or otherwise inadequate source.
>>I'm very much of the opinion that good code self documents, and excessive use of comments and documentation is a crutch for bad code.
I'm very much of the opinion that you are very much of the wrong opinion.
"Good code self-documents" is a myth. Most programmers have trouble coming back and reading their own code, much less other programmers' codes. Comments and documentation written in spoken language ensure that the code can be understood much more quickly and efficiently and any misunderstandings are prevented.
The fatal problem with the notion that the code is the documentation is the fact that the code only tells you what the code does, as opposed to what it's designed to do. How do you know a behavior you observe by reading the code isn't a bug which is going to be fixed next week? Or an arbitrary implementation detail that isn't guaranteed? That is what docs and interfaces and specs are for. "The code is the documentation" only works if the code is immutable.
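As a rough illustration (the function and its docstring are invented for this example), consider how little the body alone tells you about intent:
def median(xs):
    """Return the median of xs."""  # the stated intent
    xs = sorted(xs)
    # What the code actually does: for an even-length list this returns the
    # upper of the two middle values, not their average. Reading the body
    # alone can't tell you whether that is the design or a bug due to be
    # fixed next week.
    return xs[len(xs) // 2]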
This is a rather extreme opinion. I find tests provide less value than the effort they take to create, most of the time. I think regression tests and integration tests are quite valuable. The former are created after the fact, and the latter are often so difficult to do correctly that they are never made. Of course, this difficulty is the same reason why they are valuable.
Nonsense. When you read the code you need to know what it does, not what the programmer had intended; then, and only then, will you stand any hope of fixing bugs.
well chosen function and variable names tell you the intention, if it's modular enough then it will all make sense when you read it.
People just need to get over this documentation phase that programming went through in the '80s and '90s, driven by languages with obtuse syntax that were so low-level you had to hold a dozen things in your head at once (I'm thinking of you, COM). In this era, with languages that operate at a high level of abstraction, the era of documenting the code is over.
"The code is the documentation only works if the code is immutable."
that's actually backwards. If you want to stand any chance of changing the code then it should be readable and well structured, not well documented.
I'm skeptical. How do you know what's intended behavior and what isn't if you don't know why the code is structured the way it is? Understanding what it does is necessary but not sufficient. You risk introducing more bugs if you don't understand why the code does what it does.
"well chosen function and variable names tell you the intention, if it's modular enough then it will all make sense when you read it."
The two hardest things in computer science are concurrency and naming things. Oh, and off-by-one errors.
If the code doesn't match what's intended, it's most likely because the original implementer misunderstood it, or didn't think of a particular edge case - and thus the comments won't reflect that either.
You look at code to determine what exactly the system does, but in order to determine what exactly the system should do, you need to look at the users' needs or the business process; the code + comments can't tell you that.
Any public API should be documented. I can spend all day reading your code trying to code against your API, or I can read the API docs & be on my merry way. When/if there is a bug, I can fix it so it adheres to the specified purpose/pre/post conditions.
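To make that concrete, here is a minimal sketch (the function, exception, and data shape are all invented for illustration) of an API documented by purpose, precondition, and postcondition so a caller never has to read the body:
class NotEnoughSeats(Exception):
    """Raised when a flight cannot satisfy the requested reservation."""

def reserve_seats(flight, count):
    """Reserve `count` seats on `flight` and return their seat numbers.

    Purpose: hold seats for a customer before payment is taken.
    Precondition: count > 0; flight["free"] lists the currently free seats.
    Postcondition: the returned seats are removed from flight["free"]; on
    failure NotEnoughSeats is raised and flight is left unchanged.
    """
    if count <= 0:
        raise ValueError("count must be positive")
    if len(flight["free"]) < count:
        raise NotEnoughSeats("only %d seats left" % len(flight["free"]))
    held, flight["free"] = flight["free"][:count], flight["free"][count:]
    return held
A caller who hits a bug can then check the observed behaviour against the stated postcondition instead of guessing at the author's intent.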
> When you read the code you need to know what it does, not what the programmer had intended; then, and only then, will you stand any hope of fixing bugs.
I'm talking about being a user/caller of code, not attempting to fix it. Of course you need to know the innards if you're going to fix it, but needing to know all the implementation details in order to even call a piece of code is counter-productive. It's needlessly time-consuming and it violates the concept of separation of concerns.
"Most programmers have trouble coming back and reading their own code"
Hogwash! Maybe if you're writing assembly code or a similarly low-level language, but one of the express benefits of high-level languages like Python, Scala, Ruby, etc. is that they read (to a competent developer) like English (or whatever your native tongue might be).
Maybe you and "most programmers" you know don't write decent code, but I can easily read others' code or my own if it's well written. The last thing I want is a bunch of comments everywhere. I am all for the occasional comment (as an exception), because my eye will be drawn to it, but in general, unless your code sucks, it should be perfectly legible without the need for excessive comments and documentation.
You haven't worked in anything important, anywhere, ever.
Also if you think that you'd always remember why you did something, at some point in time, for all given circumstances, you haven't lived long enough.
> Maybe you and "most programmers" you know don't write decent code, but I can easily read others' code or my own if it's well written
You really aren't as good as you think. Your problem is you grossly overestimate your capabilities and achievements. We all were there; most of us outgrow it. You will too at some point.
I've done a lot of digging through other people's code; in general, the only comments I've found helpful are those that explicitly call out defective code and those that provide links to external references.
Writing clear code is the only hope for the poor maintenance programmer, your carefully written comments probably aren't worth the effort.
playing the experience card is poor form and fallacious - also it wouldn't surprise me if that commenter is speaking from extensive experience.
i don't mention my 20 years of experience or background as a AAA game dev having touched every speciality, with experience in high volume web and database environments, financial data quality software, language compilers and interpreters or anything like that... it's next to worthless vs. what i can do, demonstrate or explain.
Yes it is harsh, in the same style of the parent comment.
Actually you're wrong on the experience card, for only one reason. What I mentioned is experience that everyone will gain, just by staying long enough in the game. Not because I am super cool and worked in positions that few people can work.
It's the same thing as when we were teenagers and we thought that we were right in every freaking thing and our parents just responded "wait until you grow up". Well, they were right in most of the things, weren't they? Were they super smart? Was it relevant if they were? Nope, they just stayed long enough in the "game".
i dunno, i like your example of parents. my experience of that was definitely learning just how naive and unprepared for everything they were...
i don't rate experience highly at all. quality of experience and what you learn from it is what adds value to that 'time served'.
plenty of old men will still dance even though they have been terrible at it for 50 years. they have had a lot of practice, but they still suck. there are ample examples like this
I agree with what you say and I find it quite orthogonal to what I said :)
You also gave an interesting example. Dancing is a specialized activity. Thus specialized talent and/or quality experience is required. But not all people go dancing. So I wouldn't quite include it in the set of common experiences that most people will go through in their lives.
I think your last paragraph is generic enough that we can use it as a reply to your comment:
You really aren't as good as you think. Your problem is you grossly overestimate your capabilities and achievements. We all were there; most of us outgrow it. You will too at some point.
> Most programmers have trouble coming back and reading their own code, much less other programmers' code
Really? I was just poking through some code I wrote several years ago looking for something I may re-use and I didn't have any trouble figuring out the intention by reading the (largely) uncommented code.
Pure coincidence. Like those random unimportant memories that get stuck with us, but we don't know why.
When you work on really large projects, which are not hobby projects, with specs you don't even understand the reason for their existence, it is impossible.
For example, I have many experiences similar to what you described, yet today I couldn't remember why I did something in a code fragment I wrote last week.
That's why you'll see all great programmers sharing similar stories. Otherwise you'd be superhuman.
I guess it could be pure coincidence. But I'm going to chalk it up to the fact that I go out of my way to have well-factored code that uses intention-revealing names and isn't too clever for its own good.
The choices that you make when writing code really do matter.
All the things you mentioned are considered a given for someone who is an above-average programmer.
Yet in large projects, with lots of people, with many interconnected modules, clear code and concise variable naming is not enough.
That's why we have documentation that describes the general architecture and comments for the code parts that aren't so clear cut.
Some problems have many solutions, with different levels of advantages and disadvantages. To think that all people understand or know the solutions and the choices pertaining to them, is of course naive.
To think that you, as a person, would choose the same path at different instances of time, is just entertaining. You are not the same person you were last year, or even last month.
Thus, good comments along with clear and concise code is the only way to go.
I've had this discussion a couple of times, but have failed to identify any pristine examples of essential documentation 'in the wild' myself. I'd like to know of a couple - I'm sure they are out there.
Edit: Not project documentation, strictly code comments.
if tag in tag_list:
    data[u"Tag"] = tag
try:
    if i["DupilcateAssets"] is not None:  # not a typo - misspelled in Dell SOAP response
        data[u"Error"] = "Duplicate assets returned"
But this is largely a comment disguised as a variable name, and is no more or less likely to stay up-to-date than a comment. If the Dell identifier were ever changed to "DuplicateAssets", you're just as likely to end up with a stale variable name as you are with a stale comment.
In python some things evaluate as false in an if or boolean conversion (0, an empty list/tuple/set/dict/user_defined, an empty string "", and more). None evaluates as false as well, but is still distinguishable from the rest.
In ruby you likewise sometimes need to use object.nil? because nil and False are both "false-ish".
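A quick sketch of the Python side of that distinction (the values are just illustrative):
values = [0, "", [], {}, None]
for v in values:
    print(repr(v), "falsy:", not v, "| is None:", v is None)
# Every value above is falsy, but only None answers True to "is None",
# which is why `if i["DupilcateAssets"] is not None:` tests something
# different from `if i["DupilcateAssets"]:`.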
I tend to have the "comments are an apology" mindset, where I will write a comment to apologize if something is partially implemented, counter-intuitive, driven by a byzantine external requirement, or an ugly workaround to someone else's bug.
Those can totally be essential, as they will save someone (your future self, maybe) from attempting to refactor them to something simpler/cleaner that won't actually work.
If only this were the case. Comments are frequently wrong - they were either wrong from the beginning, unhelpful, or never updated when the code was changed. Whenever I find myself writing a comment, it is typically because I just wrote bad code. Sometimes, bad code is necessary; most of the time, it's better to get rid of the comment and the bad code.
Ultimately, comments are not sufficient; you have to read the code anyway. Comments can only be useful in the regard of recording the intent of the author.
it being a myth is massively at odds with my experience. i'm also sure that i have data and experience... also that i can quantify why it is easy to read to some degree with logic (but measurement is king).
i know i'm fortunate to be very good at reading other people's code so my view is probably biased - it's feedback i constantly get - i've never worked on a codebase where i can't contribute meaningfully on day one even when it's of quite a poor quality. i've found myself explaining people's own code to themselves more than once too...
i've worked with lots of programmers and trawled through many code bases old and new... by far the easiest code for me to read is fairly devoid of comments with sensible variable names, lots of whitespace and descriptive function names and using procedural logic or sensible (not excessive) amounts of OO. hungarian notation i can take or leave...
comments don't hurt... documentation doesn't hurt, but more than once i've seen pretty disgusting code with a big comment above it explaining what it does where if they had just named variables properly and split the code into decent functions the comment would have been needless. a classic case is something which looks like:
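(the example below is conjured for illustration; the function names are invented, not from any real codebase:)
void BuildRenderableMesh()
{
    LoadMeshFromFile();
    GenerateIndexBuffer();
    OptimiseIndexOrderingForCacheEfficiency();
    UploadIndexBufferToGpu();
}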
that code requires zero comments or documentation imo, it tells you what it does and how it does it... sure it's a crude example i just conjured and there is some bad practice there if you take it literally (what data am I throwing around? there should be function parameters...), but i'm not afraid to say that if you struggle to read code like this then you are just not very smart. maybe the concepts (index buffer, cache) need explaining, but that's not what comments are for - that's for Google, Wikipedia, hardware and library specifications and generally just knowing enough to be competent in your domain.
comments i like are this:
void OptimiseIndexOrderingForCacheEfficiency()
{
    /// use the k-cache optimisation algorithm, this has been measured to work on test1.mesh only
    // imagine code here...
}
what i never want is a huge blob of comment explaining the algorithm and burying the fact that its effectiveness in this scenario has only been measured in a special case.
it's shocking how many programmers are able to produce difficult-to-read functions thousands of lines long and forget the most basic principles of good, simple procedural programming, despite how familiar they may be with design patterns, OO or functional principles, CS theory, complexity bounds etc.
advice i constantly have to give and feel i never should: use functions. name them properly. use good variable names. don't write huge comments. refactoring is a lot quicker and easier than you imagine (and removes technical debt!). don't be afraid - source control will save you.
don't mistake the plethora of terrible programmers and lack of good example code for a sound principle being mythical...
EDIT: wtf is up with the formatting in this box? i see the help link now... but screw that i'm lazy
Well written code covers the what and the how. The why is what you need to be covering with code comments. This is for the inevitable time when a "simply not so smart" developer has to fix errors in the absentee guru's code.
> I'm very much of the opinion that good code self documents, and excessive use of comments and documentation is a crutch for bad code.
We obviously hear this quite a lot, but I think it's inaccurate - it's bad comments and documentation that are a problem, and comments in particular are often most prolific when they're bad. Comments and documentation often suffer from rot, in that they are not kept current with the code and end up being out of date in short order.
But the fact that comments and documentation can be out of date is a reason to do them better, rather than throw them away. I'm one of those people who typically codes "comment first" - writing a simplified English overview of the steps I expect to take, implementing each of them in real code, and taking a final pass to trim or expand comments as required. So I might start with:
function doit(){
    // Load data from the input
    // Get the correct records
    // Search for the correct field
    // Output the results
}
This is followed by implementation:
function doit(){
    // Load data from the input
    records = Data.load()
    // Get the correct records
    records.uniquify(function(e) { return e.group_id })
    // Search for the correct field
    results = records.map(function(e) { return e.timestamp })
    // Output the results
    print(results)
}
We've now got stupid redundant comments of the type that cause problems. Rule of thumb for me is to decide if the statement in the comment can be trivially deduced from the code immediately following it. If it can, then the comment can go. If there's some additional assumption—perhaps about side effects or properties of data which might not be immediately obvious—then the comment can be expanded to include this information. So we might end up with something like the following:
function doit(){
    records = Data.load()
    // Disambiguate records by looking at the group id - records will
    // always be sorted by time, so we can do this to make sure we
    // only get the first in a given group.
    records.uniquify(function(e) { return e.group_id })
    results = records.map(function(e) { return e.timestamp })
    print(results)
}
Obviously not real code, but you get the point.
In general, I find that working on "self-documenting" codebases is actually rather frustrating - especially when I'm coming in on a new project. There's an absolute load of implicit knowledge about software systems that's contained within the code, and while it should be exposed and made explicit wherever possible, it's sometimes not practical - meaning that "self-documenting" code often seems anything but.
thanks for the detailed reply, this is an example of an 'okay' comment, but it's still not really needed imo. you can fix this with variable and function names:
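(a sketch of the idea, folding the comment into a hypothetical disambiguateRecordsByGroupID helper:)
function doit(){
    records = Data.load()
    disambiguateRecordsByGroupID(records)
    results = records.map(function(e) { return e.timestamp })
    print(results)
}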
ExceptThatNobodyWantsToTypeDisambiguateRecordsByGroupID20Times. (No, an IDE can only help you so much. You still lose valuable screen real estate for browsing source code when names are excessively long.)
Not to mention you only captured half of the parent's comment (the "why" of the function), so it should really be something like disambiguateRecordsByGroupIDBecauseRecordsAreAlwaysSortedByTime.
there should probably be a reasonable limit to the length of a name... the IDEs are great, but even without them copy-paste is pretty universal and safe for identifiers.
To get rid of comments, you've now hard-coded the algorithm into every call to `disambiguateRecordsByGroupID`. What happens if something changes and you need to disambiguate records by something other than group ID? It seems like a short comment is a pretty small price to pay for some additional robustness.
The comment about pair programming resonated with me: What Mr. Beck writes about the paired-programming aspect is contrary to common sense and to anyone with even the slightest inkling of perception into the personality traits of developers.
Pair programming was THE most distasteful, unpleasant thing I have ever done in any employment. And I include in that my very first job which included cleaning grease traps and bathrooms at McDonalds.
Gurdjieff said that the path to spiritual development is to like what it does not like.
If developers do not like pair programming, is it a problem with pair programming or is it a problem with developers, specifically their unconscious, machine-like it?
The whole point of XP is not only to go against management best practice but also against the comfortable habits of most software developers, in an effort to make them better developers.
I found pair programming preferable to code reviews in both efficacy and enjoyment. Because, you know, you can't really skip this part in most larger systems where more than one developer is involved. You have to do one or the other. So you're doing code reviews ... Right?
I'm reading the reviews and I realize that I don't have the background and years of experience to understand the software development world as it was in the year 2000 (I graduated in 2006).
I get some of it, because of war stories. But I know I'm missing a huge piece of it.
Like when did some of the older practices come about? Was it in the 90s? Late 90s? 80s? Are most of the reviewers curmudgeons who watched the SDLC landscape change before their very eyes as their certifications lost value?
It's not just software. All business processes have been undergoing change from top-down to bottom-up processes as people realize those companies which allow good ideas to percolate from the bottom win in the marketplace. The driver here is the internet. It used to be the case that you didn't come into contact with good or bad practices except by working in a place they were applied, but the internet ensures everyone can now read online what works and what doesn't.
I think you're looking at the internet with rose-tinted glasses; business practices go in and out of fashion, and sometimes back into fashion again.
I directly saw this as I worked on the BS5750 standard and saw the rise and fall of TQM.
I think you will find that traditional engineering does stick closer to traditional project planning (PERT/TQM and so on), as it's a lot more embarrassing and costly if a bridge falls down :-)
It's also a lot easier to do in manufacturing: you can measure the tolerances on a piston on an engine line and flag whether it's a defect or not - much harder to do in software.
If you mean Waterfall, the term was adapted from civil engineering, which of course made a lot of sense for software (sarcasm). One of the earliest papers on the process was in the 70s by Winston Royce, where the waterfall model was used to describe software development at its worst.
Like it or not but the waterfall method does make a lot of sense in software development. It may be useless if you're trying to create another billion dollar useless web app, but is very helpful when you have to create software for mission-critical applications.
I think Beck explicitly states mission critical as a bad candidate for XP. In reality very few systems are mission critical. And of many mission critical systems, only a small part of the code is actually mission critical.
I currently work at a company making highly FDA regulated software. We once missed a contract with a big drugmaker due to lack of documentation and QA process standards. The pendulum then swung the other way towards waterfall, to the point where people would write 50-page reports about transferring a database schema from MS SQL to Firebird without mentioning the fact that all the databases had the same username and password sitting in the Registry of the client computers that connected to it for everyone to see (obviously not anymore). To remove a nonfunctional form field I was once asked to fill out 20 pages of documentation (before I threw a hissy fit and some process Nazis left the company).
The "spiral" model was documented in 1986 and in common practice long before Extreme Programming hit the shelves. I never saw a waterfall-like process in use in the 90s except in the DoD. It was never more than a strawman for XP to beat up on. (Edit: On a hunch I just looked up when scrum originated: 1986 as well.)
Some other arguments that show XP wasn't as new as the book makes it out to be:
- I doubt Lisp, Smalltalk, Forth, time sharing Basic and similar tools were invented for the waterfall model, and all were available in the '70s.
- Weinberg advocated egoless programming in 1971.
Also, looking back at the 1960-1975 timeframe, nobody would have accepted XP as a methodology. I can see the discussion "what do you mean by 'if it turns out we need more memory we simply move to a larger computer'? Ordering one will take months, it will cost us thousands of dollars a month, and we will have to build a new computer room. And no, it is not acceptable if that accounting program you are writing turns out to be a glorified checkbook because that is what you can cram into the machine."
Also, it wasn't as easy to get an MVP because there were so few libraries to build on. IBM would happily license you their 'database' software or their time sharing software, but you'd better be sure that you needed it, because you paid for it by the month. But it just wasn't possible to work on the idea "don't worry about the 3D graphics. We'll eventually pick a library, but that can wait."
A devil's advocate would argue that the XP book just was a good summary article with great marketing that due to pure luck appeared at just the right time.
The part I wanted to understand more was the culture of the workplaces that agile would affect in the coming decade. What created cultures that adhered to waterfall? I find asking "how did we get here?" fascinating.
Grady Booch first published the Booch method around '91. He merged his stuff with a few other guys to make UML and the Unified Process a few years later. The rise of big process in software was fairly fast. The push back you see is more a misunderstanding, they see it as a transition from no process -> process -> no process rather than no process -> too much process -> lighter process.
RUP and UML happened around the same time as XP, in 1998, but they were the culmination of a decade of so-called "object oriented methodologies" being discussed between Booch, Rumbaugh, and Jacobson. The latter coined the term "use case" in his method. RUP and UML combined the three approaches and modelling notations (I'd say 70% Rumbaugh, 20% Booch, 10% Jacobson).
Certifications? No, the whole "certified bullshit expert" thing started with XP. Before that people just did their jobs and didn't invent religious practices that had to be adhered to in order to do so.
There has never been an official XP certification. There is a ScrumMaster certification. I'd suggest that you avoid anyone who claims certified expertise in XP.
And no, people didn't "just do their jobs." Management methodology isn't a new idea.
PMP certifications were established in 1984[1], and while I don't think the certification itself is nearly as abused as "Certified ScrumMaster", PMI and Six Sigma are definitely touted as panaceas, especially in the enterprise.
PMI in 1984 was exactly what it says in the name, project management. It didn't become a programming methodology thing until the agile craze. Six sigma was big in the industrial world, not programming.
Though tbh the same could be said of any of the mountains of methodology books we waded through in the 90s. The word 'evidence' appears precisely once in the RUP white paper, and it's not in reference to evidence for RUP itself being effective.
Projects can be canned for all sorts of reasons, but from what I recall the people involved in C3 blamed the Chrysler management. Perhaps the latter were more interested in having a working payroll system than inventing the future of software development.
One of the criticisms I remember some people making of XP when it first came to prominence was that it shifted a lot of the hard work of software development onto the customer, or rather the customer representative; certainly, ever since I heard about what happened to the first XP customer representative (see http://www.coldewey.com/publikationen/conferences/oopsla2001...) I've taken comments about XP being a humane methodology with a pinch of salt.
I don't disagree with what you say about C3; my comment is more to do with the "see one-write one" nature of software books. Even though some books turn out surprisingly well, there's a distinct lack of depth and statistical analysis in the field.
Robert C. Martin (known as "Uncle Bob" to many) wrote a blog post somewhat related to this recently. The gist is that Extreme Programming the name is dead because Extreme Programming the principles have been (mostly) widely adopted. It's a great read and I recommend checking it out.
No. In college, I followed the c2 wiki where XP was crafted, out of the ashes of the Smalltalk-based Chrysler payroll system.
It was a bunch of practitioners sick of being told by MBA-types (or modelling "architects" that couldn't code) of the proper way to write software.
Almost everyone in modern software development has been impacted by XP, if you think through some of the practices that almost no one in the mainstream had heard of or was discussing until XP arrived in 1999:
- Customer is on the team (Often the biggest factor in IT projects)
- Test driven development (now BDD)
- Continuous integration (now Continuous delivery and Devops)
- Design improvement (refactoring)
- Small releases (now "Lean Startups")
- Planning game (or estimation without Gantt charts and WBS)
Detractors focus on the controversial practices like pair programming as if they were lunacy. Or on consultants making money flogging a crap methodology. I don't think consultants could make money on XP back in the day - it was too foreign and extreme.
Agile was born off the fumes of XP, to give people methodological freedom while agreeing to the same principles. This unfortunately enabled legions of consultants to invent their own whatever-the-hell method for their client and subject employees to it. But it also helped some teams to deliver faster and higher quality than they would have otherwise.
XP - to actually be XP - is no less prescriptive. Either you follow the 12-step program, or you're a heretic. Now small-a agile is certainly a good thing - but like Nietzsche I am suspicious of any attempt to systematize what should be organic.
For the curious [1]. Which brings up a point that I try to make with colleagues about any method/process. If you apply them blindly you might get lucky, but you'll probably get terrible results. They have to be modified to suit the personalities and capabilities and needs of the team, the company, the industry.
But the very core of XP was that it did have to be followed exactly. That there were 12 specific things you had to do. All of those things were existing things that people already did. It was precisely the "doing all 12 of them exactly as we say" that was XP. And it was total BS.
XP compared to the things that came before it, or to the things that came after it?
If you compare the opposite poles of XP and the things advocated in those 1-star reviews, the way a lot of successful businesses develop software is a lot closer to XP than to "pre-XP" software methods.
It was a watershed for many. In fact, I would agree that the XP and TDD side of the argument won the intellectual debate.
I do. I might be biased as my first work experience was using it extensively (and we were quite successful doing it).
So far, it is not rare that I find engagements where people, once the mess in their current environment has been cleaned up a bit and trust established, are willing to give it a try, progressively and under good conditions.
The general feedback I got is quite positive.
Hopefully we've evolved to the point where we can reject any dogmatic approach and realize that there exist certain good practices that can be applied - but it depends on the environment we find ourselves in, the size and makeup of the available team, the timeframe, existing rules we have to adhere to for the particular project, etc. No silver bullet, but we rehash these debates endlessly.
And today most people I know who mention it keep saying XP when they mean pair programming. Not as if it's one of many practices in XP, but as if the two were synonymous. Please tell me this isn't common.
People latch on to what aspect they don't like of a thing and it becomes synonymous with the thing itself.
With XP, sadly, this is true. Most times you bring up the XP practices individually, and they're "good ideas". You say XP, and they say "fuck pairing!"