Why don’t software development methodologies work? (typicalprogrammer.com)
289 points by gregjor on Feb 10, 2014 | 197 comments



It's fun to watch observations like the following re-discovered again and again. From the NATO Software Engineering Conference in 1968:

Ross: The most deadly thing in software is the concept, which almost universally seems to be followed, that you are going to specify what you are going to do, and then do it. And that is where most of our troubles come from.

Fraser: One of the problems that is central to the software production process is to identify the nature of progress and to find some way of measuring it. Only one thing seems to be clear just now. It is that program construction is not always a simple progression in which each act of assembly represents a distinct forward step and that the final product can be described simply as the sum of many sub-assemblies.

(Full transcript of the 1968 conference here: http://homepages.cs.ncl.ac.uk/brian.randell/NATO/nato1968.PD... It's a really fun read!)


Imagine you are a large company and there are 10 ideas for things you could build next, and you can't do them all. You gather the people who build such things and ask them, "How much will it cost to make each of these, and how long will it take?" Do you think it is an acceptable answer if the builders reply, "I have no idea how much it will cost, I don't know how long it will take, and I don't know what you will have at the end of the process"? How should a business handle this problem?


I can't give you an exact cost and time, but what I can do is give you relative cost and time: "This is easy; this is easy but will take a while (low variance); this is hard (relatively long time, high variance)." So that's giving you your estimate and your tolerances.
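To make that concrete, here's a minimal sketch of the idea (the size categories, medians, and spreads are all invented for illustration; in practice you'd calibrate them from your own delivery history):

    # Relative estimates: each idea gets a size category instead of a date.
    # Medians and spreads are illustrative; calibrate from past projects.
    SIZES = {
        "easy":          (2, 0.25),   # (median weeks, relative spread)
        "easy-but-long": (6, 0.25),   # low variance, just a while
        "hard":          (10, 0.75),  # long time, high variance
    }

    def estimate(category):
        """Return a (low, likely, high) range in weeks for a size category."""
        median, spread = SIZES[category]
        return (median * (1 - spread), median, median * (1 + spread))

    for name in SIZES:
        low, likely, high = estimate(name)
        print(f"{name:>14}: {low:.1f}-{high:.1f} weeks (likely {likely})")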

Further, I can work with you to figure out what portion of each project will give you the biggest bang for your buck. So you can say, "I want to spend x00,000 dollars." -- and I come back and give you a demo each week that you can play around with. At any point, you can say, "This is enough to generate value for me, let's move to the next thing for a while."

Further still, I can build it with enough test coverage and good design practices that it will be possible to extend the design at a later point without scrapping everything.

On the other hand, you can spend half of that x00,000 developing a specification. If it is detailed enough, I can give you a very low-variance estimate of how long it will take. However, you won't know whether it actually meets your needs until you see it. You won't find out about problems in what you really need until it's too late, and you'll end up spending more money in the end.

That's the message that should go out.


You have successfully solved the problem posed, but it was not well-stated. In a real-world scenario, it is important not just to identify which of the 10 ideas is most promising, but to defeat the null hypothesis that the programmers should be fired and no idea should be pursued. After all, to choose the best of 10 bad ideas is to have failed as an economic enterprise.

In order for a business to be investable, it is necessary to demonstrate that it will be profitable to an investor. (And of course all businesses need investment, whether that is VC money or just a lone developer's part-time effort.)

We have avoided this problem because in the early days of software, most software was profitable. But as the software industry has grown, this is proportionally less true. And I seriously question, in 2014, whether more LOC are committed each day to "black" (profitable) projects or to "red" ones.


> to defeat the null hypothesis that the programmers should be fired and no idea should be pursued

Now I'm imagining a website like oDesk or ELance, but which requires employers to "defeat the null hypothesis" for any job contract they want to post. What a wonderful world that would be.


There are two potential problems posed in the scenario:

The first, as I believe you presume, is that the business is looking to develop a revenue-generating product. I agree that continuous delivery, quick iterations, late-binding requirements (à la Behavior-Driven Development), and rigorous testing are not strong candidates for this problem within the context of big business, and I can point to a handful of systems where this approach did not work well.

However, the second interpretation is that a big business has ten internal projects to increase their efficiency. In this case, return on investment is relatively easy to calculate, but requirements are usually nebulous and thus estimates are by necessity high-variance.

I think the subset of agile techniques I described (as well as cross-functional teams, high customer involvement, and exploratory testing) is well suited to this problem because those techniques, as a collection, allow a business to receive value quickly and to "fail fast," in lean-startup parlance.

Are you building embedded software? The agile approaches aren't, collectively, well suited to that domain (though I'd argue pair programming and cross-functional teams aren't a bad idea here). It's not going to work for games. But for internal software and software with fewer than ten clients, I argue that this subset reduces risk and increases the potential for high ROI.

___

Now, as for a few things you mention.

* The programmers are not responsible for calculating return on investment. But is an extensive requirements-gathering mission with months of meetings going to be cheaper once the cost of those meetings (in lost productivity and wages) is counted? In my experience, these documents end up being works of fiction, from which aggressive estimates are produced under pressure from management. Then the costs overrun (especially since nobody accounted for the customer's lost productivity) because the estimate was based on business need rather than reality. Then features are dropped and technical debt accrues, leading to a system that is hard to maintain and is scrapped after five years of frustration, when the customer starts again. At least this time they know what they don't want. I don't think that works.

* I think that is the situation that leads to the "red" projects you mention. You speak about the early days of software, when most software was profitable. But I've seen and heard the horror stories of multi-million (and billion!) dollar projects from the 80s and 90s. While http://calleam.com/WTPF/?page_id=1445 mentions recent studies, I remember seeing studies showing that two-thirds of projects failed in the 80s and 90s as well.

So most software wasn't profitable then (at least in terms of internal projects), and it isn't now.

However, I'd argue that the collection of techniques above allows a project to fail faster and cheaper if managed correctly.

If we were to put it into a methodology, here it is:

1. Determine the return on investment if X can be automated. How many hours are saved? How many fewer people are needed to do the same job? What percentage-better forecast can this project create (estimated)?

2. Use some sort of behavior-driven development to get a basic sense of the complexity of the system needed (a sketch of such a scenario follows this list). If the complexity and ROI don't match up, stop here. You've lost a minimal amount of money.

3. Start prioritizing which pieces would provide the most business value for the least complexity cost. Build the first of these pieces, elaborating on the BDD done in step 2. Write this with good tests, so that the system can be extended with a reduced fear of later regression.

4. Demo to customer. Does this fit their needs? Is additional complexity exposed? Does this provide value? This is the next cutoff point. If the system appears to be more complex than thought, or if the team is unable to provide the business value anticipated, stop the project and re-evaluate or shelve.

Everyone gets together and does a retrospective, with a question of whether to continue, an honest look at what worked and what didn't, and a set of things to try in order to fix the problems of the first iteration.

5. Repeat steps 3 and 4 until the software either sufficiently meets the business need or the project is ended early due to a fatal flaw. Always be willing to ask if this is the appropriate time to declare the system "done for now."
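As a toy illustration of the kind of scenario step 2 produces (the domain, the names, and the in-line test harness are all invented; any given/when/then tooling would do), written before any real implementation exists:

    # A given/when/then scenario sketched before implementation. Counting
    # and sizing scenarios like this gives a cheap first read on complexity.
    class Timesheet:
        def __init__(self, employee, hours, status):
            self.employee, self.hours, self.status = employee, hours, status
            self.locked = False

        def approve(self, manager):
            self.status, self.locked = "approved", True

    def test_manager_approves_timesheet():
        # Given a submitted timesheet for 40 hours
        sheet = Timesheet(employee="alice", hours=40, status="submitted")
        # When the manager approves it
        sheet.approve(manager="bob")
        # Then it is locked and queued for payroll
        assert sheet.status == "approved" and sheet.locked

    test_manager_approves_timesheet()

___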

Now, this won't work in most major corporations for a variety of political reasons, but to me this seems a better system than the traditional corporate project structure, based on my experience with both successful and unsuccessful IT projects.


Isn't that one of those methodologies? ;)


Yeah, that was a bit optimistic about the powers of Agile for me, but I think the thing Agile gets right is that we all have to operate under uncertainty about the scope of software projects. In that sense it's not a methodology for estimating the complexity of a software project but a way to start working without such an estimate.

Perhaps the biggest reason this keeps causing problems is that companies have no good way of dealing with change. If you expect to finish any project that you start then you need to know more about that project than is realistic at the time you start it. A canceled project is a big failure for most employees and not something they want to happen to their careers.


It's not so much avoiding estimating complexity as moving from absolute estimates to relative estimates. Arlo Belshee has the best article I've seen on the subject: http://arlobelshee.com/post/planning-with-any-hope-of-accura...

However, until you establish reality, all the estimates in the world aren't going to help much. And most customers cannot estimate reality until they are actually in the process.

I'm not saying Agile is a silver bullet -- it can go wrong in many ways, and it's not appropriate for every situation. However, it's the best we have for its niche.


> Do you think it is an acceptable answer if the builders reply, "I have no idea how much it will cost, I don't know how long it will take, and I don't know what you will have at the end of the process."

If that's the actual answer, then anything else is just obfuscating the truth.

Why should we encourage lying to try to cover for the fundamental lack of knowledge? If you don't know, trying to make up numbers never makes things better.


Making up numbers doesn't make things better for you or for delivery of the project. But it makes you liable to the boss, so they can make your head roll if they need to.

A surprising number of people in business think this way: not that they need to make sure of something, but just that they need someone to tell them everything will be OK -- and then someone to put the blame on if anyone is unhappy. That someone will naturally be whoever is at the bottom of the totem pole.


This reminds me that business is pure gambling, and business people are bad gamblers. They want to play but do not want to lose, and if they lose they want to blame somebody.

Managing egos and hearts is harder than managing a project, I think.


The Second Law of Consulting: No matter how it looks at first, it's always a people problem.

Gerald Weinberg

http://en.wikiquote.org/wiki/Gerald_Weinberg#The_secrets_of_...

http://www.codinghorror.com/blog/2008/01/no-matter-what-they...


Haha, I love this guy...

Asking for efficiency and adaptability in the same program is like asking for a beautiful and modest wife. Although beauty and modesty have been known to occur in the same woman, we'll probably have to settle for one or the other. At least that's better than neither.


No haha, this is just plain sexist and wrong. One of those wonky generalizations, and part of why we don't have real software engineering yet.


"Make up numbers" glosses over a bunch of actually-useful techniques, though. You might not have any data to begin with, but you can certainly get some.

For example, since a time estimate can trivially be converted into a probability estimate for something happening on time, all of this is relevant: http://lesswrong.com/lw/3m6/techniques_for_probability_estim....

The best technique from that set, I usually find, is betting (with actual money) on results: setting up an intra-company prediction market for when your software will ship can tell you more than you ever wanted to know about how much time it will truly take.
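As a crude example of turning estimates into an on-time probability (the task ranges and deadline below are made up; a real prediction market aggregates this from people's bets instead):

    import random

    # Hypothetical per-task (optimistic, likely, pessimistic) estimates, in days.
    tasks = [(2, 4, 10), (1, 3, 8), (5, 8, 20), (3, 5, 12)]
    deadline = 30  # promised ship date, in days from now

    def one_run():
        # Triangular distributions: a common, if crude, model of task time.
        return sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)

    runs = 10_000
    on_time = sum(one_run() <= deadline for _ in range(runs))
    print(f"P(ship within {deadline} days) ~ {on_time / runs:.0%}")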


> If that's the actual answer, then anything else is just obfuscating the truth.

There are three possible reasons for an answer of "I don't know":

1. Something is unknowable.

2. The person being questioned has insufficient expertise to discover the answer.

3. The person being questioned hasn't really thought about it.

You appear to be assuming that 1 is always the reason.


There's a lot more than those three possibilities. I don't really believe that anything is unknowable, aside from semantic arguments about Gödel's incompleteness and things like that.

What I'm saying is that at the moment someone asks you "How long will X take to bring through the software development process?", "I don't know" has to be a possible answer, or your whole method is corrupt.


To know how long something takes, you must have done it already.

And that's the problem with software: no two projects are alike. And there are always "surprises."

A car manufacturer knows how long it takes to build a car, because it does that thousands of times for the same model. Now, designing the car, that's a different issue. (And don't give me the BS enterprise "model" of architects throwing some UML down from a high tower and "developers" filling in the voids, because that's the worst way you can build software.)

OK, it's one thing to build a simple ACID application; it's another thing to build something that "has never been tried before."


You can answer with order of magnitude approximations (days, months, years, might be impossible).

The order of magnitude approximation is good enough, and for ones in which you can't make an order of magnitude approximation you should be able to back it up.

E.g., ca. 2002 a PMM said, "All we need for this idea to work is speaker-independent voice recognition in a noisy environment; how long will that take?"

The correct answer to that question isn't a time estimate or "I don't know"; it's pointing out that this is an R&D problem, not an engineering problem.


A business should clearly handle this problem by maintaining the delusion that they control all these factors, and firing underlings if they interfere with the maintenance of the delusion.


Businesses have never had the stomach for basic research. That's why progress has traditionally been made under one form or another of governmentally sanctioned monopoly.


The business should handle this "problem" by recognizing it shouldn't be in the business of creating or building new things, because it hasn't got the talent or will "at the top" to understand or stomach all that building or creating new things entails.


What about the other half of the uncertainty?

The company does not know how much revenue each of those ideas will deliver.


Do you expect to get estimates of outcomes from stock market traders? They do their thing, and you gain and lose according to external factors.

But seriously, Scrum, with its concept of velocity and estimating the size of tasks, is the best approach I've seen. That's if you want to ask programmers at all. The preferable way would be to predict the duration of the project scientifically, from metrics gathered in the past (not only about the programmers and projects but also about the environment they operate in).

How can a programmer estimate when a project will get built if he doesn't even know whether anyone cares enough about it being built to actually decide what the software should do?


Beats me. But I have every confidence that you'll figure it out. You're smart, that's why they put you in charge.


Having read "the mythical man-month" I came to conclusion that majority of the problems we have today were discovered an described by the 1970-ies.


Great book. He anticipates a lot of the problems, and "adding idiots at the end" rarely fixes things. (Look at the Obamacare website.)

It's been a while since I've read it, but I don't think he captured the benefits from agility.


He mentions the importance of automated testing once or twice, I think, but it isn't hammered on the way a modern Agilist would. But remember, 1970 was a different world. You may literally not have had room to include unit tests in your code, or literally have been unable to afford the sort of abstractions that would enable unit testing, for instance. Other parts of that book talk about the difficulty of squeezing bytes out of pages. A lot of modern Agile practices were not possible back then, or were so different as to be entirely different things. TDD would probably have been laughed at by all, for instance. "What, you want me to waste my precious timeshare time running tests I know are going to fail?"


I don't have the reference, but I read somewhere several years ago when digging through old journals that regular testing was part of the Project Mercury? Gemini? software development process.


http://www.testingreferences.com/testinghistory.php, via Google, leads me to Google: http://books.google.nl/books?id=76rnV5Exs50C&lpg=PA81&ots=o8...

"Project Mercury was done with short half-day iterations. [...] Tests were planned and written in advance of each micro-increment"


I would not be surprised if someday we understand why agile works for some projects, and see that it basically comes back to the "vision" observation in The Mythical Man-Month.

Of course, this really just betrays that I want that to happen. So... take it for what it's worth.


The majority of everything in software was discovered by the 70s. Since then we have been reinventing the wheel every 5-10 years on better hardware.


This is so true. And not even amusing: it's like we can't come up with anything new; we even go backwards.


It's the politics that change. Sometimes we use a different tool to prove a point. What we lose in productivity we sometimes make back in social progress (ex: yanking control away from certain corporations).

I do hate the reinvention of the wheel too, but it's not all pointless.


I don't think hardware has anything to do with it. I think the most important factor in this evolutionary cycle is that every 4-6 years, new computer science/software development STUDENTS enter the market, spend a few years catching up, and then try to re-invent everything, rediscovering territory they'd never really learned, which the industry had already moved through at a much greater pace than academia.


No, I mean hardware is genuinely progressing; software is going in circles. It's a culture thing. Legacy code should mean "battle-tested code," not "boring, let's replace it!" We are so busy replacing perfectly working code with the latest fashion that we never advance.


Yeah - and I mean that it's not being driven by the hardware folks, but is in fact a cultural phenomenon derived from academia and industry running in different gears.

Yes, hardware is progressing (though we've reached a few limits) at an amazing pace - and we've pushed technology into quite a lot of cracks and crevices in the last 40 years (at least, that's as long as I've been in the computer scene)... but what I observe is that, every 3-6 years, there are cycles of entropy and creation which are a result of the endless churn of 1. Industry -> 2. Academia -> GOTO 1.

Today's Rubyist is tomorrow's Cobol'er. It goes on and on, and this is - in my opinion - a people thing. Not a hardware thing.


The 1968 NATO SE conference is also the earliest reference I know of discussing software reuse (page 79 of the PDF, "Mass Produced Software Components"). Interesting to see how some of the problems then are still problems today.


Some?


Things which are a LOT easier:

1. Interactive editing
2. Interactive compilation (these two are probably the biggest reasons programmers are more productive now)
3. Having enough storage for most problems
4. Having enough computation to solve most problems interactively
5. Handling numbers larger than one machine word
6. Handling floating-point numbers
7. Handling precise decimal numbers (except for the lucky few using mainframes / COBOL)
8. Handling text using characters which wouldn't have worked in a telegram
9. Exchanging text with other systems (you couldn't even assume ASCII, with other encodings (e.g. EBCDIC) being common)
10. Tracking changes to source code
11. Distributing shared code and tracking new releases
12. Communicating with other systems
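Several of those (5, 7, 8, and 9 in particular) are now one-liners; a quick Python illustration:

    from decimal import Decimal

    # 5. Numbers larger than one machine word, with no special handling:
    print(2 ** 200)

    # 7. Precise decimal arithmetic, no binary floating-point surprises:
    print(Decimal("0.10") + Decimal("0.20"))  # exactly 0.30

    # 8. Text that would never have survived a telegram:
    greeting = "Grüß dich, 世界"
    print(greeting, len(greeting))

    # 9. ...and exchanging it with other systems via one agreed encoding:
    print(greeting.encode("utf-8"))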

We've gotten a lot better on various technical details. The parts which remain difficult are generally different aspects of human limitations reasoning about complex systems.


"The only things that really changed over the last 30 plus years is computers are a lot faster and people are dumber." -Old Coder


I think the success of open-source libraries shows how software reuse can work rather well. However, when designing a new library it is still (and may always be) difficult to "parameterize" (to use McIlroy's term) in the way most useful to your users.


May not be a problem for them anymore...


Alan Kay suggested we have these problems because computer science is more like pop culture than science. http://www.youtube.com/watch?v=FvmTSpJU-Xc


Ironically, it is military software projects that are some of the worst examples of what Ross and Fraser warned about.


And the answer is ... ? :-)


In my experience, these methodologies are best described as cultish. The form of religion but not the power.

Here's my methodology.

* Figure out what the software does, what the business cares about, and what the users care about.

* Figure out what it kinda sorta needs to be under the covers

* Write prototypes of the technically gnarly bits.

* Flesh out the prototype and start writing integration/functional tests for the system.

* Ensure the CI server is live, running integration/functional/unit tests and building documents on push.

* Primary work phase: code review, write tests, write code as you go along. Figure out if features are needed or not. Don't overdesign or underdesign.

* Document subsystems as they stabilize using comments and in-repo text files.

* End of project: everything is tested, documented, and working.

Rinse, repeat.


That's a great solution when you understand the problem. Process is great when nobody can understand the problem. Windows is a great example of this: nobody at Microsoft really understands even half of what Windows does internally, down at the code level. Now, how do you ensure a useful update comes out the door in 2015?

PS: Granted, if you deal with fewer than 30 people, process tends to be overrated.


I have no beef with process; I just find the hooting and hollering over software methodology fairly asinine. Worse, the most stupid rules wind up getting embedded into company policy if you're not careful.

So, let's assume that different approaches work for different numbers of people (shocking).

How would you break down work for a > 1 MLoC codebase with lots of people on it? How do you ensure it gets done?

I would do it by having one person operate as a system integrator with responsibility for designing interfaces at the top level, then farming out the subcomponents to each subteam, down to the point of implementation. Lots of these things can be specified up front (front end! back end! database! integration with other systems! tooling! algorithms! shared utilities!). As development ensued and new knowledge was found, responsibilities would shift and new teams would be formed / older teams reassigned.

Several key thoughts here: one person owns the vision of the system (call them the systems engineer); interfaces define interactions with other parts of the system; teams are responsible for tightly cohesive components with loose coupling to other teams. Mind you, that's sort of Fred Brooks-ish, but why not? We haven't gotten too far from The Psychology of Computer Programming in actual practice.
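A minimal sketch of what "interfaces define interactions" can look like in code (the names and the Python Protocol mechanism are just illustrative stand-ins for whatever contract form the integrator actually publishes):

    from typing import Protocol

    # The contract the system integrator owns; subteams code to it,
    # not to each other's internals.
    class OrderStore(Protocol):
        def save(self, order_id: str, payload: dict) -> None: ...
        def load(self, order_id: str) -> dict: ...

    # The storage subteam ships one implementation...
    class InMemoryOrderStore:
        def __init__(self) -> None:
            self._rows: dict = {}

        def save(self, order_id: str, payload: dict) -> None:
            self._rows[order_id] = payload

        def load(self, order_id: str) -> dict:
            return self._rows[order_id]

    # ...while the front-end subteam depends only on the interface.
    def checkout(store: OrderStore, order_id: str) -> dict:
        store.save(order_id, {"status": "paid"})
        return store.load(order_id)

    print(checkout(InMemoryOrderStore(), "A-1001"))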


It's refreshing to see someone suggesting systems engineering as the solution to managing complex software projects. I have felt, and occasionally advocated, this for years. I'm actually surprised how little systems engineering methodology has penetrated the tech industry. Have CMMI and similar initiatives soured commercial enterprise on the idea?

Systems engineering is especially important when software is part of a larger multidiscipline project. All discussions here seem to focus on pure software projects.


> I'm actually surprised how little systems engineering methodology has penetrated the tech industry. Have CMMI and similar initiatives soured commercial enterprise on the idea?

I do not know. IMO, the CMMI-esque approach reeks of paperwork and micromanagement, and Agile approaches reek of a lack of planning. Both seem to be a bit problematic in their own way.

At any rate, would you mind providing some references & resources to read over for what you term systems engineering methodology? I am keen to understand other disciplines' approaches for getting engineering work done.


I'll start by directing you to the website of the membership society for systems engineers: http://www.incose.org Their products and publications section has some publicly-available documents discussing various topics.

The Systems Engineering Body of Knowledge (http://www.sebokwiki.org/) would also be a good place to start. Probably quite a bit of overlap with the INCOSE site, but better organized.

OCW has a systems engineering course posted: http://ocw.mit.edu/courses/engineering-systems-division/esd-.... There are some lecture notes available, but unfortunately the materials are a little thin.


Too pragmatic. Where are the meetings?!


Snarky, but frankly, there will be meetings very regularly with people who care & fund the project, as well as meetings within the engineering team to keep everyone abreast of what's going on.

The trick is not to have pointless meetings. Someone should get something out of the meeting besides warm fuzzies and new doodles.


> The trick is not to have pointless meetings. Someone should get something out of the meeting besides warm fuzzies and new doodles.

Agreed. I should have been more specifically snarky.


Where's the foosball and beer?


In the 'figure out X' parts.


If it's a web-based project: involve your operations team as early as possible.

Carry on.


Deployment plans & team should be part of initial technical planning. They are the linchpin to ensuring happy customers. No?


The move from labor-intensive, big-bang, monolithic deployments to fully automated, tested, non-event deployments has been a bigger positive change for me than any defined methodology or specific technology.


Continuous testing/building is a major component of a good project. I know it'd been done for years by some groups before Agile became a Thing.


I tried to track the history of CI once[1] -- a similar practice may date back as far as the 60s at IBM. There are multiple sources indicating daily builds with tests at Microsoft from at least the early 90s (not quite CI, but at the scale of those projects probably as close as was practical at the time). Long before Agile became trendy in the '00s.

[1]: http://www.alittlemadness.com/2009/01/09/continuous-integrat...


Pardon me prying, but what exactly is your experience?

I'm only asking because I've seen "in my experience" sermons delivered by fresh-out-of-college kids one too many times, and have become wary of the phrase.


IMO it's all about working with smart, motivated people who can work independently without the raging ego to cross borders to expand their computational Lebensraum. I've run remote projects with old friends over 3-5 years and it's gone spectacularly well.

In contrast, I've been on SCRUMed and Agiled projects where the resulting overhead on those who can already get work done without SCRUM Master Jar Jar's constant interruptions reduces them to people who quit within a few months.

But hey, acqui-hires rock and I can't imagine a better way to drive the talent out of a stable Fortune 500 company in order to take a chance at a startup than by driving them $%!@ing crazy and reducing their productivity to jack diddly. Ah the cycle of life...

That said, I see the use of methodologies where the talent is both junior and mediocre. But the former is better addressed with good mentoring (ha ha just joking - mentoring is for wimps, am I right? am I right?) and the latter by not hiring the mediocre in the first place (which means HR needs to be disrupted stat and since they hold all the cards they won't be - cycle of life again).


Assuming I won't be the only one: http://en.wikipedia.org/wiki/Lebensraum


"Surprisingly, left to themselves programmers don’t revert to cowboy coding—they adopt or create methodologies stricter and more filled with ritual than anything I experienced in 1980. In fact programmers today can be much more rigid and religious about their methodologies than they imagine a 1970s-era COBOL shop was. I now routinely get involved with projects developed by one or two people burdened with so much process and 'best practices' that almost nothing of real value is produced."

Sing it, brother! Can I get an "Amen"?

I know of one project with a technical lead who, to my knowledge, has never done anything other than support the vendor program the project is replacing; the team has developed a process built primarily around avoiding Subversion merges (which is not necessarily a bad thing given that they seem to actively resist learning anything about how to use Subversion) and secondarily around adopting anything anyone has ever described as a "best practice", including inventing a few new ones. So far, I do not know of anything their project actually does, although there is a great deal of it.

Oh, and their scrum meetings seem to involve the scrum master reading a fair amount of text from PowerPoint slides (which have far too much text on them).

[Edit] Apparently, I can. http://typicalprogrammer.com/why-dont-software-development-m...


I like the Kafkaesque idea of inventing new best practices specifically so you can avoid using them.


Because it tries to fix human problems logically. Humans are not very logical creatures. We love to think we are, but most of us are very emotionally driven. Our emotions come into play in drive, motivation, hard work, creativity, and organization. Creating software requires motivation, creativity, organization, etc. Our emotions are behind software, and we simply try to manage them with software development methodologies, but that's not enough.

How many people are overweight who know that all you have to do is eat less and work out, and then you will get in shape? It's that simple. Yet because of emotions -- eating due to emotion, happy eating, sad eating, or poor self-image -- many people don't get in shape. Likewise, many students understand that all you have to do to get great grades is avoid procrastinating: start studying on time, study hard, study more, do most of the exercises in your textbook, use multiple books, and you are most likely to pass with high grades. Yet lots of students put it off -- poor willpower, delayed gratification, it's more fun to goof off.

The same applies to software: a lot of us know exactly what it takes to make good software -- to spec it out, to plan a good architecture, to write good unit tests, to comment and document the code, to organize the process, to avoid over-optimization, to avoid changing and adding lots of new features in the middle.

Yet a lot of us don't do that. We start writing code before we even spec it out, because it's exciting! Our emotions are in play; we don't practice the delayed gratification of holding off and writing specs. We plan to throw away the prototype, but then somehow it ends up being what everything is built on. We plan to refactor one day, but feature creep never allows for that. We know we ought to comment, but we understand the code now and don't, and then six months later we are cussing and kicking the wall. Unit tests are boring, so we write as few of them as possible.

Until we take into account that programmers are humans, with emotions and different levels of discipline, motivation, and ability, our software development methodologies will keep failing us on most projects.

Just my opinion.


I agree with your fundamental underlying opinion, but unfortunately it can't be reduced to a "programmer's" problem. More often than not, it is the whole organization to which you have to show "something" that is horribly coded but will convince someone in upper management that it is doable in a few months. And then you'll spend years debugging the horrible mess you created in the first place just to "show something." Pretty much everyone agrees this is bad, but you can't sell a project by showing test coverage statistics.


The success of dieting programs comes from people changing their habits from unhealthy behaviors to healthy ones. The most successful programs are done in groups that offer emotional support in addition to health education.

The point of these methodologies is to help replace unhealthy development behaviors with healthier ones and to use groups that offer reinforcement of the healthy behavior as part of the process.

Of course, much like diets, not everyone is really going to "buy in" to the new "healthy" habits. They will drag their heels, whine, and generally continue their unhealthy behavior.


I think diets are just a poor metaphor here.

A big part of the lack of success of most diets is that the diets themselves change your emotional state. Not to mention that most actually encourage less healthy eating habits.

The American Heart Association for years recommended a low fat diet for heart health, when the studies they were using to justify this decision showed a higher overall mortality for people on low fat diets; they just died of fewer heart-related problems. (If I'm remembering correctly, there were a LOT more suicides in the low-fat group.)

The people who say "just eat less and exercise more!" are also completely full of it. It simply doesn't always work -- handled wrong it can lead to really bad emotional states brought on by the lack of food. Like the above example, you could end up with a lot more suicides, which are pretty bad for your health, no matter how you slice it. Presumably most people who make this claim are not fat or have never tried it, but it's frustrating to see it repeated over and over by people who exercise less than and eat more than a lot of "fat" people.

On the other hand, maybe the metaphor does work: We're recommending or enforcing practices when we really have no clue as to their efficacy. We don't always know what is healthy and what is unhealthy; we have guesses, but then we latch onto them like religion.


Because it tries to fix human problems logically. Humans are not very logical creatures. We love to think we are, but most of us are very emotionally driven.

But there is nothing stopping you from incorporating emotion into your logical models.


Temporal and modal logic, E. Allen Emerson: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.92.5...

(I've been told that I have been using "yeah, good luck with that" too frequently lately, so I'm trying to branch out into different phrasing.)


Because: "if you feel sad take the day. if you feel stressed take a break" probably doesn't sound great to employers.


But people don't...


In my own anecdotal experience I would say the team matters way more than the methodology. I've been on teams that could have been (and were) repeatedly successful on projects that ranged widely from strict agile to strict waterfall. Every single time that team pumped out good quality software reasonably on time.

I've also been on teams that were not successful no matter what methodology they used. The team just wasn't cohesive, didn't have enough talent, and/or had poor leadership.


One of the best features of good teams is that they are able to throw away the unnecessary rituals of their chosen methodologies.


I've never been on a team that didn't have some manner of unnecessary ritual put on them by management or the client. I agree with your comment, and I would add that a good team can go with the flow and work well within the context of most silly rituals.


A couple of observations. First, we tend to push to maximize potential output. That means demanding more than could actually ever be done. I've come to the conclusion that if you're writing code that meets all requirements, on time and under budget, you're simply not ambitious enough. Pristine, perfect code is a sign of customer laziness.

The software we create today is tremendously more complex than the software we created back when I started in the 1990s. Part of that is standing on the shoulders of giants (and the libraries of giants), but part of that is also process.

The amount of process required for clear, effective communication increases with the number of parties involved. Creating good interfaces between components and the teams responsible for them is really a very difficult problem. If you're working on a three person team, it's easy. If there are 30, it's much harder. If it's 100, you probably can't even know every single person involved, much less avoid stepping on each other, not without a lot of process.

So what process does is increase the potential scope of a problem beyond what a small team can do (mostly by breaking it down into several interacting subprojects). That's difficult, important work, even if it's outside the realm of the average HN startup's imagination and experience.


An important realization I made a while back was that design methodologies do little to address program correctness, which is almost always the wildcard on deliverables; buggy software means missed deadlines and blown budgets. Some, such as TDD, address rapidly building tools to a particular spec, but often fail to promote static guarantees, especially in languages and environments where such provability is largely impossible. Dynamic languages' penchant for monkey (guerrilla) patching further exacerbates the problem.

Solutions to this are tough. My first suggestion would be to use languages which facilitate correctness, although that usually comes at the expense of developer availability: the pool of engineers with experience and know-how in true FP is orders of magnitude smaller than for more pervasive languages. My second thought is to further embrace math as the building block for non-trivial applications: mathematical proofs have real, quantifiable value for correctness. I find it no surprise that the larger companies have made foundational maths, such as category theory and abstract algebra, the underlying abstraction for their general frameworks. This is an even tougher pitch than the first, since most engineers don't recognize what they're doing as math at all - a big part of the problem. So many of us are doing by feel what has already been formally codified in other disciplines.

I'm aware that both require more (not necessarily formal) education than most engineers have pursued, which makes this a difficult short-term pitch for any company, but I think if we're serious about eliminating sources of non-determinism from projects, it's important we address them directly.


I think that's one aspect of the problem, but the migration from make-do to mathematically rigorous code can be equally fraught with peril. While the code itself can be made predictable, and easy to reason around, the time estimates and project planning often cannot. There is never enough time to factor out all of the commonality, remove all of the unnecessary use of state, codify all the assumptions into data types, etc., so you have to pick your battles.

A programmer needs intuition of what the biggest, most effective improvements are on the code base, which allow them to get the most work done. They also need some ability to guess how much time it will take, so that they don't miss deadlines. No amount of static type analysis will fix that.


Ah, but that's a matter of design - except now we have strong constructs from which to consider our problem. We will never get away from developing the architecture of our system, which is cost dependent on how well understood the domain is. Ideally, that's where we should aim to move: problem specifications that render implementations rote. A lofty goal, I know, but within closer reach every day and possible in many environments already.

Part of the beauty of proof is that so long as it is correct, the individual lemmas are largely irrelevant: we don't necessarily need to remove commonality or statefulness. Of course, I'm purposefully glossing over extra-functional requirements on which, outside of big-O, we don't have a firm grasp.

I'm attempting to stay away from "effective" or "most work done" since they're ill defined and highly subjective but rather focus on measurable changes. I'd argue that, amortized, the upfront costs of better understanding the problem definition results in cost savings down the line, especially as it recedes into maintenance.


The dark side of proof is that most mathematical problems are incredibly contrived. Taking a business requirement and translating it into known mathematical problems is harder than it sounds. Once you can do that, you're 90% of the way there, and people who can do that reliably are regarded as geniuses.

But most of the time, development is done without full understanding of the problem space. Usually, the problem doesn't become fully understood until you've already spent a good amount of time coding up your solution. If you wait until you fully understand the problem before starting, then you'll never start, because your brain simply can't comprehend the entire scope.

So instead, you get code bases full of sort-of well-factored code, but with lots of unwittingly reinvented wheels. This status is occasionally improved when somebody really smart happens to notice the commonality, and remembers a classic problem that it resembles, and manages to refactor the entire thing using the general solution. However, this almost never becomes apparent during the first revision.


"I find it no surprise that the larger companies have made foundational maths, such as category theory and abstract algebra, the underlying abstraction for their general frameworks."

I would like to learn more; do you have a specific example?

Having worked with real "engineer" engineers, I've found that they have, and value, a considerable amount of mathematical education, but that education is all in continuous mathematics; abstract algebra and formal logic have about the same amount of respect as basket weaving. Unfortunately, continuous math isn't particularly useful for software.


As an electrical engineer who briefly flirted with computer engineering, I had 4 semesters of continuous math and 1 semester of discrete math[1]. I also had classes like linear systems and electromagnetics where I had to actually use continuous math heavily.

While I do not think continuous math is particularly useful for general software development, I think it is very valuable for specific problem domains. I have found my Calculus/DiffEq foundation to be very valuable for my work with radar signal processing and NLP code, more so in the former because it was basically translating electrical engineering algorithms, formulas, and concepts into C. It is also important for any type of development that makes heavy use of geometry.

As a side note, I saw some of the bias you describe out of the more "pure" EEs I worked with when I first started. There was a strong bias against software engineers, particularly those who went the CS route, because they didn't understand the math and physics behind the hardware. Admittedly, some were clueless and probably should not have been writing software for multi-million dollar hardware[2]. Most were competent, though, and able to pick up the basics they needed to know when tutored for a bit.

[1] Which was actually titled "Discrete Mathematics," and just covered basic set theory, combinatorics, and linear algebra.
[2] Like the one who added a reset routine that blindly opened power contacts on the UUT without verifying that the power supplies were off first. Fortunately, that was caught before they actually opened with hundreds of amps going through the bus.


As you say, continuous math (to my mind, calculus, diffeq, and linear algebra; anything involving reals) is necessary for some problem domains. But accounting is necessary for some problem domains as well. And molecular biology.[1]

But if I get worked up into a good froth, I can make a case that software development is applied formal logic or applied abstract algebra (or both). I don't believe you can do professional software development (in Weinberg's sense) without some serious discrete math, in the same way you can't do signal processing without calculus.

[1] If you've got something that mixes the three, let me know. It's probably something I should stay away from.


I must admit to ignorance of Weinberg's books and other writings. My interest is piqued now, though.

That said, based on your second statement, nearly all scientific and engineering programming would not qualify as "professional software development." The code I worked on had little to no discrete math or formal logic in it. There was not an integer to be found save loop counters and array indices. Do you not consider an (electrical engineer | mechanical engineer | physicist | molecular biologist) who can code and spends the vast majority of their time writing production code like this a professional software developer?


Two good examples from the Scala community would be Algebird[1] from Twitter which uses Monoids as a main abstraction and Spire[2], which might be the best numerical library out there and heavily rooted in Abstract Algebra.

1. https://github.com/twitter/algebird
2. https://github.com/non/spire
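For anyone wondering why a monoid (an associative combine operation with an identity element) earns "main abstraction" status: associativity is what lets a framework split an aggregation across machines and merge partial results in any grouping. A toy Python rendering of the idea (not Algebird's actual API):

    from functools import reduce

    # A monoid is (zero, plus) where plus is associative and zero is its identity.
    class Monoid:
        def __init__(self, zero, plus):
            self.zero, self.plus = zero, plus

        def sum(self, xs):
            return reduce(self.plus, xs, self.zero)

    ints = Monoid(0, lambda a, b: a + b)
    maxes = Monoid(float("-inf"), max)

    # Partial sums computed on different machines can be merged in any
    # order and still give the same answer -- that's the whole point.
    chunks = [[1, 5, 3], [2, 2], [9]]
    partials = [ints.sum(c) for c in chunks]
    print(ints.sum(partials) == ints.sum([x for c in chunks for x in c]))  # True
    print(maxes.sum([3.0, 7.0, 2.0]))  # 7.0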


I considered editing my other comment but decided instead to break it out.

There are a couple of complexities that your comment illustrates well:

First, continuous math _is_ available and immediately applicable today. The problem is that we often reason in, and program to, the implementation, not the abstraction - a subtle difference, but an important one. Not only that, but by reasoning in a flawed representation, we often miss important derivations that would result in dramatic simplifications and reductions of the problem domain. I would also argue that we already use continuous math regularly - for example, linear algebra, combinatorics, and set theory: most of us only know them as arrays, random, and SQL.

Secondly, not enough effort is made in formal education for applying 'pure' math to computer science. Some branches, such as linear algebra, have obvious implementations and analogies already available but others are quite a bit less clear - I fault this more on curriculum silos than an engineer's innate abilities. It's a learned skill that just isn't often taught.


This might be beside the point, but combinatorics, set theory and algebra are prime examples of discrete mathematics, not continuous.


Just twisting an old joke: Good software development methodology is like a UFO - everybody has heard of it, nobody has seen it.

Restricting a creative process to some pre-baked confines of thinking doesn't make sense to me. Some rules are vital, though those should not get too much in the way.

I think only organically growing your rules makes sense, and you've got to accept there will be situations where you have to break them, or else they won't allow you to adjust your strategy. Imagine a huge company that standardized on some well-tested processes/methodologies that served well in the past and were adjusted as it was growing. Staying with the same set of methodologies might kill the company's future if the structure of its workforce changes, technology moves forward, a competitor has a more efficient methodology for achieving returns, etc. Often methodologies are used internally for political gain, pursuing promotions and power, and "change agents" don't really care that much about improving efficiency, instead riding the wave of whatever methodology upper management currently recognizes as cool.


A software methodology can be thought of as a cattle herding dog. The dog's sole purpose is to get the herd from point A to point B generally at the same time and in one piece. The methodology doesn't really matter, the results do. Leadership/management gets to pick the dog though and this is where people complain.... Everyone wants their own dog.


"Maybe social skills come harder to programmers than to other people (I’m not convinced that’s true), but developing those skills will certainly pay off a lot more than trying yet another development methodology."

Like almost any other skill, "Social Skills" are developed through practice, and they also degrade when they aren't used. People that spend a majority of their time on a computer may struggle when communicating face-to-face. (I've also seen it happen with stay-at-home parents with young children.)


As the author mentions, he subscribes to the "no silver bullet" school of thought. Methodologies don't ensure project success because they're insignificant compared to the individuals involved and the scope of the product. The focus is on process because process is easy to implement and change, giving the illusion of control.

What good project managers do is much harder than implementing a Scrum method. It's finding good people, making the hard decisions to remove the bad people, and getting/creating a clear understanding of the product's scope and goals.

The last point in particular should be a "well, duh," but in reality it's ridiculously common to encounter projects where the PM (and hence even the customer) has only a very vague understanding of what's actually being built. Those projects are guaranteed to go massively over time and budget, or to be cancelled outright.

Good process is just the icing on the cake. It should be used, but process alone does not make a project a success.


In a nutshell: because by the time software companies start to look for some methodology silver bullet they are already hip-deep in trouble and no amount of 'process' will fix multiple years of bad hiring decisions, bad engineering and unrealistic expectations.


Our process is untenable! What additional process can save us from the process quagmire? Quick, implement it.


I have a buddy who went through six-sigma and Toyota Production System training in his work as a Process Engineer in a large manufacturing company. When we talked about the application of methodology to an organization, he made a keen observation on human nature: "people don't like to be told what to do." To me, this goes a long way toward explaining why "conceptual integrity" matters more than a methodology, and why it is difficult to apply a methodology to a new team or company unless they discover it for themselves organically.


"Everyone has a plan until they get punched in the face."

--Mike Tyson


They work, but you still need very capable people (developers, managers, product owners, etc).

Furthermore, if you have a very capable team, you may not even need a formal "Methodology".


In other words, they work, but only when you don't need them? Funny definition of "work", that.

(This actually perfectly matches my experience. A good team can work in spite of a methodology; a bad team won't work regardless of the methodology.)


Depends on the size of the team. If you have a 5 or even 10 person team, then maybe. If you have 100? You need methodology.

Once you break a project down into subcomponents that interact, you need to coordinate release of any changes to interfaces, so the work of one team doesn't break the work of another.


I still think it is the capability of the people that makes the difference.

I actually have had the experience of leading a team of 50 developers on a 120-person project. We were all new to the project and didn't know each other. I basically picked the seven most qualified developers and formed teams around them. We broke the project into modules and looked for smart ways to work without stepping on each other.


Exactly. Lots of things "work" for small projects. Try to scale some of those strategies, however, and things break down.


This is one of the many examples of the anti-intellectualism that is so pervasive in the software development scene. Are the inflexible, detail-prescribing methodologies he describes any good? Of course not, but I haven't seen a single seriously used methodology be like that in real life unless it fell into the hands of negligent/incompetent management and/or engineers.


You are lucky if you've never seen a methodology misused or misapplied. I have seen it many times, in my own work experience and in the failed projects I've taken over.

I'm not sure how you got the idea that I'm anti-intellectual. My intention was to express my own opinions and experience and start a discussion. I wasn't trying to wave anything away.


There's a difference between enterprise IT and other environments. For example, if you are a government tax department, your agenda is driven by legislative changes, and you want a fairly rigid process to ensure that things are done accurately and correctly.


>anti-intellectualism that is so pervasive in the software development scene

Errrr....


How else do you characterize an industry that seems content to re-learn lessons of the past over and over?


1) Industries don't learn; people do. We have a lot of turnover and growth in this industry, so a lot of new people.
2) Although there are high points of genius we may never reach again (Turing, etc.), we are making progress. Software today can do things hardly imagined a few decades ago.
3) Part of how we make progress is to try "unlearning" things; throw out conventional wisdom and try something "crazy". Maybe it will fail the same way as before, or maybe this time it will work. Maybe the constraints that produced the conventional wisdom have changed.
4) People problems will always be hard, in every industry. Productivity is a people problem.


I am against a general ignorance of where the industry's been, which manifests as a refusal to learn basic software engineering concepts that are laid out in a book like the Mythical Man Month. Seems like every new medium (such as the web) starts with masses rallying around a figurehead who proclaims "this time it's different! We don't need any of that engineering stuff!" A few years later, they do.

This trend of anti-intellectualism is worrying. I doubt it's new -- Dijkstra had similar sentiments -- but the worst part is the developers who seem to enjoy being ignorant.


It seems like almost all professions have a body of knowledge. Software Engineering doesn't.

Imagine getting open heart surgery, your chest is open, and doctors start arguing over the best methodology to do it.


This is not a fair analogy. Open heart surgery is a repetitive and well-defined procedure with clear goals and context. Software Engineering is a complex and creative endeavour where individual talent makes a big difference. There is no methodology that will help a mediocre writer produce a great novel. It takes skill and talent.


This is exactly why software engineering needs to professionalize.

I don't see developers always having more leverage than their business counterparts. Really, I'd love to see developers wholly accountable to their peers, along with some sort of entrance exam. I think the world at large would take developers a bit more seriously.


It does, and it has for quite some time: http://www.computer.org/portal/web/swebok


OMG. I just hope you are being sarcastic.

That's everything we are complaining about right there on the index.


No sarcasm intended. You don't have to agree with SEI or IEEE, but the complaint was that there wasn't an organized body of knowledge. My response was to show that there is.

I'd be interested in knowing what you find so objectionable about SWEBOK.


Actually, there are surprisingly large and persistent differences between hospitals in their complication rates for various surgeries. It is fueling a debate about how to evaluate them and how to promote the methodologies of the more effective places. Not really all that different from the software industry.


The problem here is that he defines "works" as "Deliver a predictable or repeatable software development process in and of themselves."

And that's crazy. Software delivery is at least 50% design/problem solving - and those are neither completely predictable nor repeatable.

To me a software methodology works if it _improves_ delivery. If the result with the methodology is more predictable and less likely to go horribly off the rails than if you'd used a different methodology (or none at all), then it's a good methodology.

Looking for something that can perfectly transform any human requirement into code, in a predictable manner, is just silly.


Software DEVELOPMENT is a lot of design and problem solving. It also usually involves changing requirements and discovering limitations of the tools. I agree that there's a lot of uncertainty in the development process.

Software DELIVERY is what management, customers, and users care about. If a software system does not meet requirements, or if it's so late or over budget that it's a net loss to the customer, it matters little how well-designed it is or what problems were solved.

By predictable and repeatable I mean the entire process. Obviously not every aspect of the development process can be predicted, nor are projects enough alike to be produced by an assembly line. What management and customers want is a reliable prediction that the system will meet requirements and be finished on schedule and within budget. And they want to believe that a process that can deliver a system this year can do it again.

An architect can reliably predict that a house or an office tower can be made from their plans. A chip designer can predict that a working CPU can be designed and built. An automobile designer can predict that a working car can be made. And these processes are repeatable. Any designer/builder will encounter problems to solve and changing requirements. Failures occur in these realms, too, but not nearly as often as with software projects.

I've heard the argument that software development is different: it's a creative process, the people are idiosyncratic, the problems are unpredictable. I think big-budget movies face the same challenges, but even in that domain there are predictable and repeatable processes that are reasonably good at delivering.

I agree that "Looking for something that can perfectly transform any human requirement into code, in a predictable manner, is just silly." I didn't write anything like that, nor is that what I think anyone means by software development methodologies.


"An architect can reliably predict that a house or an office tower can be made from their plans."

Yes, but that's because the plans are nailed down and fixed - the creative design part is _done_.

"A chip designer can predict that a working CPU can be designed and built. An automobile designer can predict that a working car can be made."

Neither of these are true until _after_ the designs are finished. Intel has gone massively over-budget on chip designs before, and automobile designs have been abandoned after spending a fortune trying to get them right.


I think I'm not making my argument clear. Software development methodologies are PROMOTED as ways to improve the predictability and repeatability of software development. They are adopted because management (or customers) are scared by the usual chaos of software development.

I think I clearly communicated my opinion that no development methodology in and of itself guarantees predictability or repeatability. They fail to meet the expectations of management; in other words they don't work.

I also think I clearly expressed my surprise that even without management pressure programmers will adopt methodologies and then force themselves to follow the rituals. Maybe this happens because programmers don't know better. I think it has more to do with creating a firewall of process and so-called best practices to avoid oversight and criticism. Sometimes there's an element of showing off -- look at how Agile/OOP/TDD/whatever I am, and no one told me to do it!


In the end it's people that have to do the work. No methodology can magically fix stuff for you; the best it can do is help people focus on what's important. What's important depends on your situation. Pick a methodology that fits your needs.

To me, Scrum really helps me focus on what's important now, and not waste too much time on potential future problems.


Because software is inherently complex -- one of the most complex things humans can try to make. And whenever we successfully make something easier, we push the borders out, so that software is always stretching to the limits of the complexity we can handle. Ultimately, it's just hard. Sometimes methodologies help, but there is No Silver Bullet.


"I know the feeling working on a team where everyone clicks and things just get done. What I don’t understand is why I had that feeling a lot more in the bad old days of BDUF and business analysts than I do now."

I suspect it's the author that has changed, not everyone else. It's probably a combination of nostalgic bias and the author's increasing age making it harder to get on with a typical team of young whippersnappers.

I say this as someone who is over forty and regularly falling into the "things were better in my day" trap (although hopefully self-aware enough to see it).


I'm strongly against what he wants to revert back to, but I think he has a point. Many engineers are completely lacking in leadership and strategic capability, and fall prey to a cargo-cult, new-must-be-better, buzzword-chasing careerism no different from what we accuse managers of.

He's probably seen a few teams ruined by young megalomaniacs who want to put terms like Scrum and Kanban all over their resumes because they (being Clueless, in the MacLeod / Gervais sense of the word) think it's the ticket to the brass ring.


I agree, unfortunately (or fortunately), having stuff like that on your resume can translate to more dollars, and happy wife, happy life. As I get older I wonder if Social Security will be there. In my lifetime pensions have gone away, so it wouldn't surprise me to see SS go away. So I'd say load 'er up! Get all the terms and buzzwords you can on there.


So what this article is really saying (although reluctantly, it seems) is that methodologies really do work; it's just that they're not the be-all and end-all. You need to have a common vision and communication that works around the team's goals and personalities. Seems like pretty common sense to me.


I absolutely agree that no software development methodology is guaranteed to work. However, there are certain things that one can do to reduce the chances of embarrassing development stalls, such as proper documentation, continuous unit testing, talking to the end user, and team communication.


Try this thought experiment: Imagine two teams of programmers, working with identical requirements, schedules, and budgets, in the same environment, with the same language and development tools. One team uses waterfall/BDUF, the other uses agile techniques.

Problem is that with agile (as I understand and mildly practice it at least) you can't have "identical requirements", because you don't have full, complete and detailed requirements upfront.


I think there's a difference between having an initial set of requirements, and believing that those requirements are the requirements forever and ever, amen. Waterfall relies on the latter, while Agile assumes that they can and will change.

Both teams would be given the same set of initial requirements. After all, if you don't have requirements, what the hell are you even doing? It's what happens when those requirements change that's the fun part.


There are a lot of issues layered in here, a bunch of complexity that might not be obvious just by scanning the article.

Does a good team deliver solid, quality code? What if it's code the user doesn't want? Conversely, what if the team excels at delivering what the user wants, and they love it, but the code is so buggy that it only works 50% of the time? Would that be better than a solid app the user hates? (Probably yes)

Should a team working inside a large company deliver faster than the organization is able to accept change?

Just what is a good project, anyway? One in which we all had a good time and thought we did a great job? Or one in which the person paying us thought we were awesome?

No matter the criteria, everybody seems to agree that having good people is something like 70-80% of the secret. The big debate is what goes into that other 20-30%.

ADD: An interesting thought experiment to play here is to posit that the team sucks -- wrong guys, wrong personalities, whatever. In that case, what would you want to happen? The answer to that should be an important part of whatever you want your process to be. Explicitly define your failure mode. (Because failure is still more often the norm than success in software development)


> rigorous studies of software development methodologies haven’t been done

A simple search proves this is false. Paired with the author's other unsupported and stupendous claims, I have to doubt the validity and worth of anything else the author says.


I did do a "simple search," and quite a few that weren't so simple. I've also read books on software development methodologies since the 1970s. I did link to the Cockburn paper, which surveys some studies and comes to the same conclusions I have. In the interest of brevity I didn't get into the well-known (anecdotal) studies from Weinberg, DeMarco, Yourdon, et al.

I know it's part of the fun to toss a firecracker over the wall, but if you are going to say that there ARE rigorous controlled studies please link to them, or at least post your "simple" search terms. If you want to point out which of my "unsupported" and "stupendous" claims are invalid and worthless I'm happy to learn from your experience. I tried to make it very clear that I am relating my own experience and drawing my own conclusions.


Could you please link to some of these studies that adequately control for the constituent programmer ability & personalities?


Let me recommend the book "Lean Software Strategies". It refers to many software development experiments and observations, some of them from Lockheed Martin. While the book does not claim to present the ultimate methodology, it draws on progress in other engineering disciplines to create a better, continuously improving process.

http://www.amazon.com/Lean-Software-Strategies-Techniques-De...


I'm a fan of repeatability and process. Here's my take on why they don't work:

1) You can't get around the fact that things don't go as planned, and too many methodologies assume that they will.

2) It is better to have a PM with A-level content and B-level process than the other way around. Professional PMs don't have the intuition to solve the daily project problems, but they're the ones selling the methodologies.

3) Too much focus on tools. Using MS Project to keep the plan doesn't work when 90% of the project team doesn't know how to use it.

4) Too much change. Rather than settling on a decent methodology that everyone agrees to, companies keep re-inventing their process every few years.

5) Wrong methodology for the wrong purpose. Waterfall methodologies don't work in research. Agile methodologies aren't the "fits every project" panacea.

I can go on, but you can't enforce excellence with formality. The formality can help outstanding people achieve excellence though.


#1 and #3 sound like they're right out of http://agilemanifesto.org/


Typed from memory w/out referencing it. (I'm not agile certified, or anything to that effect)


From my personal experience, the article nails it.

"It’s common now for me to get involved in a project that seems to have no adult supervision". But my bigger problem is that the youth of today seem to have a much bigger ego than the youth of yesterday.

"Once a programming team has adopted a methodology it’s almost inevitable that a few members of the team, or maybe just one bully, will demand strict adherence and turn it into a religion". Recently working with an off shore team, they called their tech lead literally "God". And he was pontificating procedures and styles left and right without understanding the core of the software first. He stopped when the product start to have big performances problems. One of the ones that I liked the most was his prohibition to use class indexers in C#, backed up by an example in Java.


This is a fantastic post and I agree with it fully.

I get why process exists: management sees it as a way to make software development less lumpy, to bring below-average teams up to average productivity. But it isn't a one-way lift. Extensive process might raise the below average toward the average, but it can also lower the above average toward the average.

Twice I've seen a highly productive team of better-than-average developers with no really well-defined process (though, not surprisingly, a sort of home-grown process evolved to suit everyone's needs -- agile, as opposed to Agile, if you will) suddenly have process dropped on them from on high in an undeniably productivity-killing way.

One time it was because higher-ups at the company randomly decided they needed to standardize on the Rational tools (ClearCase, Rose, etc... still have nightmares about that stuff) and out of that insane decision we ended up with some stupidly strictly defined RUP-based process to tie everything together.

The other was when a team was transitioned from one company to another and the other was one of those stupidly-strict "Agile" shops that fully drank the ritual kool-aid and adopted basically every suggested "Agile" strategy they could slap together without giving much thought to the actual original ideals of why "Agile" came about.

I guess the takeaway is be really selective about introducing new process. It might entirely make sense if your project is building some mostly throwaway CRUD app for an internal company department and all of your developers are, well, the kind of developers you can find who will work on such things and the project is off-track. Or even if your developers are all good, but the project is failing for other reasons like lack of cohesive vision. A little bit of process introduced sparingly might help.

But if you have a highly productive team already, don't fool yourself into thinking that because process "improves" software development that adding more of it will make your highly productive team even more productive. Because it very likely will do the opposite.


Extensive process might raise the below average toward the average, but it can also lower the above average toward the average

The way that I see it is that at best, rigid processes set both a ceiling and a floor. It's sort of like going to (a real) Starbucks. 99 times out of a hundred it's not going to be terrible, nor is it going to be great. It's going to be just OK.

If you're the kind of organization that looks at software as a necessary evil instead of a strategic advantage, "just OK" is an appealing thing to strive for.


> Individuals and interactions over processes and tools


A short comment, but the one I was looking for in this thread. People often forget the "people over process" part of agile, which implies no methodology is going to be the answer. All you can hope for is to create the correct process (methodology) for the people you have at the time.

Trying to fit the people to the process is why software methodologies fail.


This would rather much be like a carpentry shop owner lamenting "why don't either electric circular saws OR handsaws work to make my mediocre workforce produce beautiful furniture without all of this wasted wood?"

First of all, creating beautiful furniture takes LOTS of time, as it rests in attention to detail and an uncompromising position towards quality. When you're uncompromising about quality, you require yourself to throw out, to waste, things that are of lesser quality.

Second of all, beginner, poorly trained, and dispassionate employees are never going to produce beautiful furniture. Mostly because of the above: they either lack the ability or the caring to have the attention to detail necessary to do great work.

And the best carpenters aren't going to come work for you unless you're willing to make it worth their time. Coming to work for you means they will have to do a lot more work in a much less comfortable environment than they are used to. Because the best carpenters have their own shop, their own tools, and work on their own time, because they love it. Going anywhere that is not their own setup is automatically worse for them.

It's the same thing in software: great software takes patience, it takes time, and it takes money to convince the best programmers that they should be spending their time on you rather than on themselves.

But at no point does any of that mean that, because Tool X can't magically turn your mediocre workforce into a stellar one, Tool X is not worth studying. That is completely, 100% backwards. The master practitioner studies all tools, even the ones he or she doesn't like, at the very least to understand what is wrong with them and why they don't produce the results they want. A master carpenter might prefer a hand saw over an electric one because the electric saw chips the corners of the board too easily, or something. But he doesn't know that unless he's studied the electric saw.

You can't have "an absence of methodology" any more than a wood shop can have "an absence of tools". It has to be there. But they all have their pros and cons, and none of them has a pro that includes "makes men out of mice."


Nice analogy. Waste is normal in manufacturing. Cost-effective manufacturing doesn't necessarily mean that no wood is wasted. If you are focusing completely on wasted wood you might lose track of overall profitability or quality. (Or, you know, people not wanting to leave your shop ASAP)


Methodologies and process aren't created to be silver bullets. It's human nature to try to turn things into silver bullets; I would guess this is mostly due to our laziness about creating better process on our own. Methodologies are created to solve the problem that not every team will be a star-studded team with excellent communication and cohesiveness. Methodologies address the very real problem of less-than-perfect teams that still have goals to reach. It's about elevating the below average to the average.


I guess to answer that question, you first have to define what a working methodology is supposed to accomplish.

How do you know something is developed in a timely fashion, for example? All we really have is a gut feeling.

We could implement the same project multiple times and use a different methodology each time. Then compare. But you'd want to use people who understand the methodology and care about winning the competition.

Even then, I bet if you ran that experiment with 5 different projects, there'd probably be a different outcome every time.


I actually found that rapid prototyping works best for me, and I was able to deliver fairly good results using it. In fact I am starting a consultancy in a month or two that will use rapid prototyping and other best practices I have observed to be effective.

My goal is to provide an environment where clients feel their input is steering the project and they are getting the results they want, and where developers and others working on the project get clear direction and the ability to show their best skills. Sounds like a lot, I know.


> rapid-prototyping works best for me

> provide environment where clients would feel their input is steering project

> others working on a project would like clear direction

What tools would you/do you use?


There are a lot of different tools and alternatives you could use; here are my preferred ones:

MiddleMan as a static site generator. I also use Foundation by default, or can use Bootstrap if someone prefers it. Slim as the templating engine, as it is very easy to read.

ConceptBoard or InvisionApp are good for iterating over feedback.

Trello or any of the similar tools can help give you visibility into what is happening.

It is very important to do things at the right time. First develop wireframes that capture the requirements and the original idea. Then design the interface based on the wireframes, and model the data (a very important step). This is when you can start writing tests, not before. Then code the rest of the interface.

BTW, this also works really well for developing APIs.


Well, his post is a bit inflammatory at first, but I sort of agree that processes sometimes get in the way, even the almighty Agile (big A) that some people like to subscribe to.

Ultimately, it's more about the people involved. Bad teams are going to find a way to screw themselves over no matter what process they use. Good teams don't need a rigid process since they'll just find a way to get stuff done no matter what.


http://en.wikipedia.org/wiki/The_Mythical_Man-Month

http://www.joelonsoftware.com/articles/fog0000000069.html

I've been a web developer for 13 years. Following a specific methodology doesn't matter as much as focusing on the important aspects of the project: the customers, and what difficult problem the product solves.

The funny thing is that even if a team does this, it is no guarantee of success. If the customer base is too small, or there are lots of competitors, even the best software will fail. Conversely, if the customer base is massive, and there are few competitors, even poor-quality software can succeed. Luckily for the software industry, and not for consumers, most teams do not follow these guidelines, with the net effect being a TON of poor-quality software.

Obviously, here and on Stackoverflow and various other programmer-focused sites, it's common to focus only on the coding side of the equation. The reality is that the best software comes from the collaboration between customers, designers, programmers, testers, usability experts, and sales. When each group brings their strengths to the table and focuses on a common goal and solution, everyone wins. However, it's very rare, since often the programmers are seen as the builders, the designers as the painters, testers as a nice-to-have, usability as a fad, and sales as helpless.

To clarify a bit: design != usability, and functional != usable. A beautiful design does not mean usable; look at the very confusing swipe-action-based calculators on iPhone. Functional is an auditorium with 20+ projectors; usable is having the specific video cable dongle.


Maybe they all don't "work" because people suck and get lazy? Things can be very pleasing when one works with developers who use the planning tools and has a leader who never relents at keeping the team on track with a given methodology, all without being a bully of course. It's rare in the real world, but it does happen.


I left my last job because my time had degenerated from doing cool stuff to fixing the stupid errors that my colleagues were making because they were lazy. Allowing some coders to be lazy in terms of adherence to coding standards and methodology has to be one of the best ways to get rid of your good programmers.

For example, we had coding standards requiring a comment explaining the contract for each method written, and this was enforced by version control on check-in. My colleagues were putting in comments like this:

  /**
   * Comment
   *
   * @param foo Parameter
   * @param bar Another parameter
   * @returns Something
   */
to avoid triggering the automated coding standards check, without actually revealing anything about the method they had written. I complained to the manager a few times, who did nothing, so I left.
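
For the curious, here's a minimal sketch (in C++ for illustration) of the kind of presence-only gate that placeholder comments defeat. This is not our actual hook -- the heuristic and the Frobnicate example are made up -- but in my experience such checks are rarely smarter about *content* than this:

  #include <iostream>
  #include <sstream>
  #include <string>

  // Crude heuristic for "this line declares a method": it contains "(" and ");".
  bool LooksLikeMethodDeclaration(const std::string& line) {
      return line.find('(') != std::string::npos &&
             line.find(");") != std::string::npos;
  }

  // Rejects the check-in if any method declaration is not immediately preceded
  // by the end of a /** ... */ block. It never reads the comment's content --
  // which is exactly why "@param foo Parameter" placeholders sail through.
  bool PassesDocCheck(const std::string& source) {
      std::istringstream in(source);
      std::string line, previous;
      while (std::getline(in, line)) {
          if (LooksLikeMethodDeclaration(line) &&
              previous.find("*/") == std::string::npos) {
              return false;  // undocumented method: block the check-in
          }
          if (!line.empty()) previous = line;
      }
      return true;
  }

  int main() {
      const std::string placeholder =
          "/**\n * Comment\n * @param foo Parameter\n */\n"
          "void Frobnicate(int foo);\n";
      // Prints "accepted": the gate is satisfied, the reader learns nothing.
      std::cout << (PassesDocCheck(placeholder) ? "accepted" : "rejected") << "\n";
  }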


That is too much doc for me; I wouldn't want to write it all either. Sounds like different people want to write a different amount of docs on their methods... and should maybe be on different teams ;)

Remember, docs always lie


That sort of thing has an obvious problem. Take a fairly simple class:

  class Shape
  {
  public:
    void SetAbsolutePosition(uint x, uint y, LengthUnit units = LengthUnit::PIXELS);
    void SetRelativePosition(int x, int y, LengthUnit units = LengthUnit::PIXELS);
    void SetXAbsolutePosition(uint x, LengthUnit units = LengthUnit::PIXELS);
    void SetYAbsolutePosition(uint y, LengthUnit units = LengthUnit::PIXELS);
    void SetXRelativePosition(int x, LengthUnit units = LengthUnit::PIXELS);
    void SetYRelativePosition(int y, LengthUnit units = LengthUnit::PIXELS);
    void SetSize(uint height, uint width, LengthUnit units = LengthUnit::PIXELS);
    void SetHeight(uint height, LengthUnit units = LengthUnit::PIXELS);
    void SetWidth(uint width, LengthUnit units = LengthUnit::PIXELS);
    void SetColor(Color color);
  };
What the methods do is bleeding obvious and the whole thing fits on a single screen where you can look at it. Now let's put all those comments in. What are they typically going to look like when inserting them is mandatory?

  class Shape
  {
  public:
    /**
     * SetAbsolutePosition sets the Shape absolute position
     *
     * @param x The absolute X coordinate
     * @param y The absolute Y coordinate
     * @param units The units of the coordinates
     * @returns Void
     */
    void SetAbsolutePosition(uint x, uint y, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetRelativePosition sets the Shape relative position
     *
     * @param x The relative X coordinate
     * @param y The relative Y coordinate
     * @param units The units of the coordinates
     * @returns Void
     */
    void SetRelativePosition(int x, int y, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetAbsoluteXPosition sets the Shape absolute X position
     *
     * @param x The X coordinate
     * @param units The units of the coordinate
     * @returns Void
     */
    void SetXAbsolutePosition(uint x, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetAbsoluteYPosition sets the Shape absolute Y position
     *
     * @param y The Y coordinate
     * @param units The units of the coordinate
     * @returns Void
     */
    void SetYAbsolutePosition(uint y, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetXRelativePosition sets the Shape relative X position
     *
     * @param x The relative X coordinate
     * @param units The units of the coordinate
     * @returns Void
     */
    void SetXRelativePosition(int x, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetYRelativePosition sets the Shape relative Y position
     *
     * @param x The relative Y coordinate
     * @param units The units of the coordinate
     * @returns Void
     */
    void SetYRelativePosition(int y, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetSize sets the width and height of the Shape
     *
     * @param height The length of the height
     * @param width The length of the width
     * @param units The units of the lengths
     * @returns Void
     */
    void SetSize(uint height, uint width, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetHeight sets the height of the Shape
     *
     * @param height The length of the height
     * @param units The units of the length
     * @returns Void
     */
    void SetHeight(uint height, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetWidth sets the width of the Shape
     *
     * @param width The length of the width
     * @param units The units of the length
     * @returns Void
     */
    void SetWidth(uint width, LengthUnit units = LengthUnit::PIXELS);
    /**
     * SetColor sets the color of the Shape
     *
     * @param color The color to be set
     * @returns Void
     */
    void SetColor(Color color);
  };
So now the class is eight times as long and you have to scroll eight times as much to find anything, in exchange for which you get a bunch of comments that tell you things you already know. Meanwhile, writing those comments involved a lot of copy and paste followed by changing one or two variable names and descriptions, which violates DRY and is a recipe for erroneous comments.

The better alternative is to comment only those things where the comment provides useful information, so that when you see a comment it strikes you as something you should pay attention to rather than some boilerplate to be ignored because 85% of them are mandatory but useless.


While I understand the argument for not commenting getters and setters, and it is quite valid, no, it is not obvious what each of those methods does. For instance, what is the relative position relative to? Is the colour for the outline of the shape or the fill? Why do you have a width and height for a shape - does that mean it is a rectangle?

But of course I meant my earlier comment particularly for quite complex methods that had very ambiguous behaviour unless defined in a comment that could act as a contract. The contract can then be enforced by writing tests against it, and you know your code works when the tests pass.

It's very common to write the code first, write the tests against the code, and then write a comment against the code if you are lucky, but that doesn't actually prove anything, as the tests will be testing the implementation details, not the contract.
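
To make that concrete, here's a sketch of a test written against the documented contract rather than the implementation (GetX/GetY are hypothetical accessors; the Shape class above doesn't show them):

  #include <cassert>

  void TestSetRelativePositionIsRelativeToCurrentPosition() {
      Shape s;
      s.SetAbsolutePosition(10, 10);
      // Contract from the comment: offsets are applied to the current position.
      s.SetRelativePosition(5, -3);
      assert(s.GetX() == 15 && s.GetY() == 7);
  }

If the contract said something different -- say, relative to the parent container -- the same test would be written differently, and passing it would actually mean something.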


> But of course I meant my earlier comment particularly for quite complex methods that had very ambiguous behaviour unless defined in a comment that could act as a contract.

Which is kind of the point. You do need some kind of documentation for nontrivial methods. But requiring boilerplate documentation for everything encourages the opposite of that because it becomes "fill in the fields" rather than "say something useful." I mean let's say you're right and SetRelativePosition isn't clear about what it's relative to. Which of these is better?

    /**
     * SetRelativePosition sets the Shape relative position
     *
     * @param x The relative X coordinate
     * @param y The relative Y coordinate
     * @param units The units of the coordinates
     * @returns Void
     */
    void SetRelativePosition(int x, int y, LengthUnit units = LengthUnit::PIXELS);
-or-

    /* SetRelativePosition: sets position relative to current position */
    void SetRelativePosition(int x, int y, LengthUnit units = LengthUnit::PIXELS);


Unfortunately we all know that, but according to the Lake Wobegon management style, not only are all our devs above average and all our code error-free, but all our information is useful. All of it. Says so right there on the motivational poster.

It doesn't apply to your example, but sometimes in maintenance mode the best part about boilerplate comments is being able to search for a synonym that appears in the prose comments but not in the code. In your example you could have some kind of distance-calculating function that somehow neglects to mention the Pythagorean theorem in the code itself, but very optimistically it would appear in the boilerplate, so I could grep for it.
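
A contrived example of the effect (the function is invented): "Pythagorean" never appears in the code, only in the comment, so the comment is what makes it greppable:

  #include <cmath>

  /**
   * Distance returns the Euclidean distance between two points --
   * the Pythagorean theorem applied to the coordinate deltas.
   */
  double Distance(double x1, double y1, double x2, double y2) {
      return std::hypot(x2 - x1, y2 - y1);  // the theorem's name appears nowhere here
  }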


I'm not sure about Javadoc, but with Doxygen you do not need to put inline documentation immediately next to the code you are documenting. You could put all the documentation at the top of the file and the function declarations at the bottom. This makes it a little harder to maintain the docs while you maintain the code, but it does resolve one of your pain points.
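
For example, with Doxygen's \fn command (a sketch only, reusing the Shape/Color example from above): the documentation lives at the top of the file, and the declarations stay compact further down.

  /** \fn void Shape::SetColor(Color color)
   *  \brief Sets the color of the Shape.
   *  \param color The color to be set.
   */

  class Shape
  {
  public:
    void SetColor(Color color);  // documented via \fn above, not here
  };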


The problem you're describing - detail overload - is a problem that should be solved by the IDE, not by removing comments.

The comments are most likely used to generate documentation, which can be very handy later.


"People suck and are lazy" is a known issue. It's been common knowledge for thousands of years now. So a methodology that doesn't work when practiced by sucky, lazy people is a methodology that doesn't work, full stop.


A methodology for lazy people? Sure, that exists. It's called going out of business.


I like this article, and I get it. In my personal experiences, I have found that scrum is successful when it is used to reinforce a single coherent vision and to build a tight integrated team.

It does not work when it’s used as a magic machine of happy fun time productivity…. Sprints and stories don’t translate into more productivity. Focus and teamwork do.


The real world and its constraints have this terrible tendency to intrude on the idyllic visions imagined around software development methodologies. Budget cuts, people quitting, timelines changing every week, internal politics, sabotage, personality issues - basically everything but the "methodology" itself.

The advice often comes (including in one of the XP books) that the answer to "bad projects/people/environments that you can't change" is to get a new job; this may work at the individual level (sometimes), but it does nothing to actually fix the broken projects, environments, or people themselves.

There is indeed no silver bullet. All these decades later, and many people still refuse to accept that.


This made me think about the book "Shop Class as Soulcraft". Cars used to be made by hand by skilled artisans. So did software.

As with any process of production, the move to change it into a documented, repeatable process complete with middle management has taken place.


On the other hand, cars are still designed by hand by skilled artisans; they are then assembled on an assembly line on a factory floor by a documented, repeatable process.

A better analogy might be: how do you go from a blank whiteboard to Ford Fusion Hybrid #1?


Legendary epic fails would seem to demolish the concept of the "repeatable process".

Imagine how silly novel authors, or fine arts painters, or technical paper authors, or poets, would look if they spent lots of time debating the "one" "true" way to produce.

A good interview question isn't how you would install the cylinder head bolts on the millionth Model T engine quicker; it's more like how you would paint the Sistine Chapel quicker. Are you changing the world with code, or just shoveling out stereotypical CRUD app number 32515?


I love that book. It explains a lot of the value of being a software developer - craftsmanship.

Then again, anything in software that can be made a "documented repeatable process" in a really deep way can simply be replaced with a well-designed piece of software. So software is all craftsmanship, because everything that doesn't actually need to be craft, we automate away pretty quickly.


Apparently, Japanese electronics firms succeeded at this in the 80s and 90s with embedded control of digital appliances.


In my experience, a methodology should help a team get work done without having to think about the various steps/stages of the work.

Large corporations like to get everything down to a standard process (typically). A software development methodology lets them do exactly that; it promises a consistent process and (hopefully) quality.

In a way this works well - small firms innovate and come up with new methodologies, which may at some point reach a tipping point where everybody wants a piece of the latest craze. Not everyone is in the business of questioning what works well or why - "if the competition is going Agile or setting up a DevOps team, then it must be good for us as well".


1. Most important thing: do the users want the change? If not, they sabotage it and you fail. Users will never be happy.

2. Do you have enough experienced people, and do they work well together? If not, you're mostly screwed. The right hand doesn't know what the left is doing, and it usually takes too long to train new people.

3. Are estimates being treated as estimates, or are date, team, and scope being dictated from above? Project triangle, anyone? Stretch goals for managers mean long nights and weekends for the team, and bonuses for other companies' recruiters.

4. Has management been oversold on buzzwords and marketing hype?

5. Is there some kind of incentive to come in under budget? So much for quality.

All of the above can screw up a project quickly.


I think the key insight of the article is how different each person is in temperament and how that comes out in programming teams.

What is interesting is why this relates to software more than, say, ditch digging or gardening.

I would say that is because software involves dealing with the most complex situation imaginable as effectively as possible. In other situations, a person only has to mobilize some of their faculties; in software, a person has to mobilize all of their faculties - or at least much more of a certain type. And this shows how different people's abilities are at the limit. Not just different in extent but different in kind.


The author seems to have answered his own question, if you pull out some key points and examine them in a logical way:

1. Software development was tightly controlled in the 70's through a hierarchy of management (project managers, business analysts, senior programmers).

2. Today the hierarchy is gone, reduced to “team leads” and “scrum masters” with no real authority or control.

3. Today programmers left to themselves often adopt or create methodologies stricter and more filled with ritual than software development in the 70s.

4. Today he often gets involved in one- or two-person projects that have so much process and "best practice" that nothing of real value gets produced.

5. What makes software development work is “conceptual integrity” or common vision or that feeling working on a team where everyone clicks and things just get done.

6. He doesn't understand why he had that feeling more in the 70’s than now.

Well, if you follow the logic, what is missing? The hierarchy of management -- the very thing he seemed to blame for why software development methodology didn't work in the first place.

I hate to admit it, but maybe the hierarchy of management added more value than we gave them credit for?

He sort of suggested finding that “feeling” in a project/team was unpredictable, random. But perhaps the hierarchy of managers actually had valuable skills that led to higher success rates of projects achieving that “feeling”? Perhaps they had a knack for picking the right people and for managing people in a way so they worked together more efficiently? Of course they weren't perfect and there were some bad apples, just like any profession.

I wonder what the demise of the hierarchy of managers was caused by? Did the hierarchy of managers lose these skills over time and become useless? Did programmers just not recognize their added value and throw them under the bus for 30 years until they didn't have a leg to stand on? Or was it purely budgetary: cut costs and get rid of the resources with the least perceived value?

DISCLAIMER: I am not a manager. I used to be very hard on managers; then I became one for a while, realized how difficult their jobs can be, and now I am one of the easiest people to manage.


I think software development methodology 'erodes' over time.

For example, original waterfall was kind of similar to agile methodology: there were re-estimations, milestones, prototyping... But these days I never hear anyone saying that our assumptions were wrong and we need to re-design and re-estimate.

Agile (scrum, to be specific) in many companies is practically a form of micromanagement. A number of companies are using 'agile' without automated tests and the other necessary tools.


Methodologies alone don't guarantee success.

I find more problems when less focus is placed on understanding the data manually first, before placing it into a process.

Methodologies don't make it any less important to find, connect, and understand the data first, but skipping that step seems to happen way too much.

The methodology I see missing is exactly that: we developers obsessively focus on optimizing tooling and code, instead of obsessively finding and understanding the data first.


I agree mostly with the article. The most important thing about any team is its people and their individual and team dynamic. Second is the motivation to make the project a success.

Third, or possibly lower, is the methodology used. The methodology's main purpose, in my opinion, is to align everyone's working style to milestones; it doesn't reveal anything about whether the project will succeed, only how we will approach it.


I am becoming more and more convinced that the single most important part of the development process is the creative one. Seems every successful project I have seen has been one with a very well directed creative development team.

My hypothesis is that when you get engineering and management teams to try and manage out the creative aspect of most projects, things just go poorly.


Nicely said.

One thing I found that helps a lot is to work on a product that includes EEs and MEs, e.g. not pure software. It really drove home two points:

- process can work if people actually follow it

- the craft part of engineering takes a long time to learn but software is such a young field that practitioners are distracted by shiny objects instead of focusing on learning their craft


+1 for the old-timer perspective and the reference to Fred Brooks' conceptual integrity. And to think, the MMM said most projects suffered from too-weak management _back then_.

-1 for being too short of an article; I wanted to hear better specific examples, and was sad when I saw the comment section starting :-)


Thanks for the kind words! I have a pile of notes that I decided to work into several articles, so stay tuned.


I have to say, I'm not a huge fan of articles or discussions with titles that beg the question.


It all depends on the domain.

In defense applications, software development methodologies work excellently. For example, MIL-STD-498, which I've used extensively in the previous decade, has worked wonders.


Summary: No article containing the word "methodology" is worth reading. Instead, spend time talking and listening and understanding exactly which problem you need to solve.


I think one part of the problem is that success is extremely non-normal with a very fat tail. To separate the effect of methodology from luck, you are going to need a lot of data.
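
A toy simulation makes the point (all numbers invented): give methodology B a genuine 10% edge on a fat-tailed outcome distribution, and with ten projects per team the better methodology still loses the comparison a large fraction of the time.

  #include <iostream>
  #include <random>

  int main() {
      std::mt19937 rng(42);
      // Lognormal with a heavy right tail: most projects are modest,
      // a few are enormous successes.
      std::lognormal_distribution<double> outcome(0.0, 1.5);

      const int trials = 1000, projects_per_team = 10;
      int b_wins = 0;
      for (int t = 0; t < trials; ++t) {
          double a = 0, b = 0;
          for (int p = 0; p < projects_per_team; ++p) {
              a += outcome(rng);
              b += 1.1 * outcome(rng);  // B really is 10% better
          }
          if (b > a) ++b_wins;
      }
      // Expect B to "win" only somewhat more than half the time at n=10.
      std::cout << "B looked better in " << b_wins << "/" << trials << " trials\n";
  }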


I think one large problem is that no matter which new-age dev methodology you're into, if you're working to an incomplete spec it won't matter.


It's not the methodology; it's the people.

Been around long enough to observe 100% correlation, extensively.

And it's a small minority who are competent.


It seems to me that iterative methodologies (like basic "agile") are an implicit acknowledgement that development methodologies don't work. Periodic re-adjustments based on local circumstances are necessary because the terrain on which our current software systems rest is too unpredictable.


Because they only really affect the dev team. The clients are virtually exempt, which destroys any process.


Workification: tricking people into doing useless work by making it seem productive and unfun.


Because, in the end, no one actually knows how to write software.


That's clearly untrue, because software does get written and often enough it even works well.

Knowing how to make something doesn't mean that you know how to make it instantly without any costs, however - or on whatever other arbitrary schedule management sets out of ignorance.


Forming a good team is more important than choosing a methodology.


Of course they don't work. You expected something else?


People.


None of it works because Waterfall and most forms of Agile both assume closed allocation, which is a bad assumption that wrecks everything. Closed allocation dooms you to prevailing mediocrity and process can only make that worse.

Most management and process exists to turn 1.0x developers into 1.2x (in theory) but turns the 10x into 3x or 2x or sometimes -2x. When you start enforcing process, closing up definitions of work, and take power away from engineers, you lose so much more off the top than you get from bringing up the bottom.

The problem is that very few executives actually want to create an inspiring place to work where people do their best. ("Fuck you, I've got mine.") They want incremental "improvements" (of questionable long-term value) on what already exists. This hand-wringing about process sounds a lot like early communism: it trudges along happily, inventing new structures, in complete ignorance of the human motivations around it.



