I believe that in order to get really good at any of these roles, you must do the other roles, deeply and often. For example:
The more time you spend Debugging shit, the less likely you are to Architect something that produces shit.
The more time you spend fixing a million little things caused by poor early decisions, the better Starter you'll become.
The more fun you have conceiving and Starting projects, the more you'll realize how important Finishing is, so that you can get to new stuff.
And the more time you spend doing each of these roles, the better you'll get at doing all of them at the same time, and understanding what conditions are needed to do that. (Hint: Lots of quicker smaller complementary projects that you can wrap your whole head around.)
[This whole discussion reminds me of the time I was a restaurant manager and got tired of the servers and cooks bitching at each other. I had them switch roles for one shift. Once they understood how what they did affected the other, everyone got a little better and the bitching stopped.]
The more time I spend Debugging shit, the less likely I am to Architect or even Start anything, because I am painfully aware that everything produces shit.
EDIT: Wow. A downvote. That's what I get for showing the pain I experience to the world. :-)
The way over that hump is essentially to realize your implicit baseline of "not shit" is unrealistically high. An architect should strive to not produce shit, but also has to be able to emotionally deal with the unattainability of that goal. No sarcasm, I've seen people who can't deal with that. An architect must also have just enough ego to believe they stand a chance of producing something less shitty than would otherwise be produced without them, while at the same time not being so egotistical that they lose touch with the users of their architecture (a fatal error) or, even worse, become actively arrogant. (You are still producing shit, after all, and don't you ever forget it.) It's a delicate balance.
You just described my most fitting persona, one which wasn't mentioned in the OP: the Refactorer (or less glamorously, the software janitor). Reducing complexity is among my favorite programming activities. Nothing like calling it a day after disentangling some chunky piece of over-engineered object-oriented spaghetti, leaving the codebase a few hundred LOC lighter and all test lights flashing green.
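A toy sketch of the kind of disentangling that comment describes, using a made-up example rather than anything from the thread: the "before" routes one small calculation through an abstract strategy class and a factory, the "after" is the plain function that was hiding inside it.

```python
# Before: an abstract strategy hierarchy for what is really one calculation.
from abc import ABC, abstractmethod

class DiscountStrategy(ABC):
    @abstractmethod
    def apply(self, price: float) -> float: ...

class PercentageDiscount(DiscountStrategy):
    def __init__(self, percent: float):
        self.percent = percent

    def apply(self, price: float) -> float:
        return price * (1 - self.percent / 100)

class DiscountStrategyFactory:
    def create(self, percent: float) -> DiscountStrategy:
        return PercentageDiscount(percent)

# After: the same behavior as one plain function, shorter and easier to test.
def discounted_price(price: float, percent: float) -> float:
    return price * (1 - percent / 100)

# Both paths agree, so the tests stay green while the hierarchy goes away.
assert DiscountStrategyFactory().create(10).apply(200.0) == discounted_price(200.0, 10)
```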
I think what he's saying is, after all that, you're still probably producing shit, that no matter how awesome you are, whoever ends up tending to your pet Frankenstein after you've moved on is probably still going to spend half their time cursing your existence, if they weren't part of the project from the beginning.
And that's only if you're good enough in the first place so that "finishing" is actually worth the effort, rather than starting from scratch using your code as the blueprint for the new and improved 2.0 spec. This requires not only that you be good, but that you get lucky: that you've accurately predicted the stack that your ops team will be pushing to standardize on in 2 years, that there's enough expertise in the language(s) you used to justify its continued use, that management hasn't decided to switch everything to Windows + .NET because "support", that every one of the OSS libs that you've pulled in all happen to be licensed under the three particular OSS licenses that legal decided to approve after that recent switch to a "whitelist" policy, that the company services you were forced to integrate with when you started haven't been shitbinned, that the new lead developer on the project doesn't have aesthetic problems with the way your code is organized, and so on. All of which usually mean "rewrite!", and few of which you have much control over. At the end of the day, it doesn't matter whether it's your fault or not, the people evaluating your work will still consider it "shit" because it's not usable anymore, and nobody loves the project enough to bother figuring out how to patch it up.
IMO, though, the real reason we all tend to produce shit (and then throw it out later) is actually missing from this otherwise excellent four-part breakdown: documentation. And I'm talking real, detailed, explicitly human-centric documentation that's more than just class/function level comments or a user manual. Lately I've been dealing with several attempted tech consolidation tasks as part of a several-tier acquisition (big fish swallows little fish, bigger fish swallows entire lake), and it seems that no matter what size the company is, nobody in software ever fucking documents anything well enough to reuse it, even when they're developing services that are explicitly intended to be consumed by other groups. The resulting duplication of effort is ridiculous.
Good internal documentation needs to start with the starter, end with the finisher, and be taken seriously by everyone in between. It should be considered almost as important to a project as the code itself. It needs to explain how to use the system, how to integrate with it, how to debug common problems, where the pain points are, how the system is architected, why certain decisions (especially tradeoffs!) were made, and more. Done right, good documentation can make up for some pretty bad code, and make sure that a project is actually maintained rather than thrown away.
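To make that list concrete, here is a hedged sketch of what such an internal doc's skeleton might look like, written as a Python module docstring; every heading and detail below is illustrative, not taken from any real project in the thread.

```python
"""Example internal documentation outline for a service, as a module docstring.

How to use the system:
    The happy path, with a minimal working example a new team can copy.

How to integrate with it:
    Authentication, entry points, versioning, and who to ask when stuck.

How to debug common problems:
    The handful of failure modes support actually sees, and where to look first.

Where the pain points are:
    The parts everyone is afraid to touch, stated plainly.

How the system is architected, and why:
    The major components, plus the tradeoffs behind decisions that look
    strange from the outside, so they do not get "fixed" by accident.
"""
```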
Spot on—wish I could upvote each sentence of your comment.
Everyone knows code readability counts, because it's “read much more often than it is written”. Same on a larger scale: the project is built once, and then maintained for years and decades (especially in an “enterprise” environment). So you should optimize for maintenance; that is, write docs.
It really puts me out that no one else on the team or in management appears to take documentation seriously. Whether this is ‘job security’ or general negligence, I don't know. I think either your primary concern is project success, or it is your job and money. If it's the latter, then of course you wouldn't write docs: just bang out something that works for the client/employer, you'll get paid, and it'll be harder to fire you.
Or maybe it's really a management problem and I shouldn't care.
I think it's a short-term vs. long-term problem: everyone's focused on hitting their milestones and the problems that will be caused by the bad documentation won't manifest themselves until later. It's kind of a "technical debt" situation.
+1 for bringing up documentation. Good documentation without code is valuable. Good code without documentation ... probably worthless (unless it's clean enough to read, which is rare for production code once you've killed all the weird edge-case bugs).
I would agree that people learn from being involved in all phases of a software project at least once. Especially starters.
Starting a project is an exciting, heady, creative time. If the last 10% takes 50% of the effort, the first 90% is where you create and build at a fast pace, enjoy the adulation of your clients who are amazed with how quickly things are going, investigate and implement interesting new technologies and approaches, and revel in the fun of creating something new.
The question is, did you leave the code base in good shape, or did you go on a bender, drinking the champagne but leaving your successors to experience the hangover? My guess is that most people on this board have had to deal with a truly bad code base, in production, inherited from a programmer who flitted off to the next project cause he "enjoys the challenges of creating new things".
Until you've been the debugger who fixes the million little problems, the "architect" (wish we had a different word, cause you will be coding a lot) who refactors the code into something that can actually be maintained, and the finisher who experiences the irritated grousing of clients (who notice that things have slowed down so much since the "starter" left the project... we just asked for it and he did it), you don't know the kind of damage a rogue "starter" can do.
Completely agree. I once worked at a bespoke software shop where they had 'proper' developers upstairs, and support developers downstairs. (Quotes used to indicate how that made the support staff downstairs feel).
Most projects taken on by the company followed the same path: the upstairs developers would code up the application per the functional spec, the client would do a bit of UAT, accept it, and publish it live, at which point the project was assigned to the support staff.
What generally happened next was that the application would begin to fail on non-functional fronts, typically appalling performance and scalability problems.
Since there was minimal communication between floors (after all, what could a 'proper' developer learn from a lowly support developer?), the pattern repeated again and again.
I'm sure that either rotating the positions, or having individuals responsible for staying with each application throughout its life, or some other form of mixing, would have drastically improved both morale and quality of apps produced.
The danger when I got good at debugging was that I came away with the belief that I needed to make mistakes impossible to make. In that regard, I did exactly what you say is unlikely: I left a world of debugging and wrote some highly over-engineered crap. Since pulling things back to make mistakes obvious, not impossible, I've had much, much more enjoyment.
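One way to read "obvious, not impossible" as code, in a hypothetical Python sketch of my own (not something from the thread): the first version tries to make an invalid value unrepresentable behind a wrapper class that every caller must construct, the second keeps the data plain and fails loudly right where the mistake would matter.

```python
# "Impossible": a wrapper class whose only job is to forbid a bad timeout,
# which every caller now has to construct and unwrap.
class Timeout:
    def __init__(self, seconds: float):
        if seconds <= 0:
            raise ValueError("timeout must be positive")
        self.seconds = seconds

def fetch_with_wrapper(url: str, timeout: Timeout) -> None:
    ...  # perform the request using timeout.seconds

# "Obvious": plain data plus a loud check at the point of use, so a mistake
# shows up immediately in the traceback instead of behind indirection.
def fetch(url: str, timeout_seconds: float) -> None:
    assert timeout_seconds > 0, f"bad timeout: {timeout_seconds!r}"
    ...  # perform the request with the given timeout
```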
I can also say that the thing you really need to do is use what you are programming, or at least something like it. I don't know that this is cross-cutting, in that someone who uses a product may not be good at building it, but they are probably good at selling/describing it.
What you've described is fundamentally a problem of communication.
You don't need to do multiple roles in order to understand them. You do need to listen to the feedback from each different area and then act on it so that your decisions don't create unnecessary work for others.
The restaurant scenario is one I've heard before and it's the literal version of putting yourself in the other person's shoes. Not workable in every situation (but kudos for having the balls to do it).
No need for that. I had already done every job and knew exactly what would happen.
But your question reminds me of another time when I did do that. I was the IT Director and got tired of the lead programmer bitching about my decisions. So we traded jobs for one day. I had a great day just coding. He made 5 decisions. On each one, I pointed out about 5 things he didn't take into consideration, making his decision less than optimal. He was glad when that day ended. (And I was glad when that job ended; I've been coding ever since.)