You're probably being unfair there. The programmers who work on games are often very smart, very capable people who genuinely want to make a good product. A lot of so-called professional programmers could learn a thing or two from games developers about everything from writing high-performance code to co-ordinating large development teams to the value of building in-house tools that make development work more efficient.
But the games programmers are screwed by the simple fact that the market will accept games that are buggy/require insane hardware/etc. and pay for them, will usually not attempt to return them if they simply don't work but will rip them apart in on-line commentary if the game sucks, and will rip them off at the first opportunity if the game is cracked.
That means the business guys who are, quite rightly, calling the shots, are interested in basically the first few days after launch and nothing else. That in turn means there is no need to bother with things like code quality and long-term maintenance. Better to spend the money on a cinematic trailer and DRM that takes one more day to crack.
This will continue as long as people continue to accept buggy games, dubious policies like shipping incomplete games and then filling in the blanks with paid-for DLC, and so on. And right now, while there are clearly plenty of people like me who would spend a lot on good games that worked properly and didn't come with pseudo-malware attached, it seems there are plenty more people who just don't care.
You want them to write games for people like me. (Thanks! :-)) Unfortunately, most of the big games shops are going to follow the money.
I'd have to contend that my own games (Crash + Jak) were far, far less buggy than the typical web or desktop application. We had a VERY low tolerance for bugs. I NEVER EVER shipped with a known crash bug (there were some I didn't know about, of course). As entertainment, we always felt that the consumer had zero tolerance for bugs. Something like MS Word is riddled with serious bugs that don't get fixed major release after major release. We had a huge staff of in-house QA and even more external.
Well, FWIW, I'm glad someone is still making games with some eye on quality.
Personally I basically gave up AAA games around four years ago, after three big disappointments.
Crysis should have been great but had all kinds of problems even on high-end hardware.
Supreme Commander should have been great, except that a major selling point was its world-scale maps, but if you actually played a full game on such a map, memory usage went over 2GB and the game crashed on 32-bit XP.
Oblivion should have been great, with a lot of power in the game engine and some interesting ideas, but they forgot to bring the fun part.
Those were probably the three most eagerly anticipated PC titles of their generation, and while I did complete Crysis and have spent plenty of time on Forged Alliance, the enjoyment was severely damaged by the frustration with all the problems.
Since then, it seems like all I hear about is ever more intrusive DRM screwing things up and ever more profiteering via DLC and exclusive content deals. I've given up on contemporary AAA titles entirely until this sort of silliness goes away, and I content myself with things like puzzle games and titles from GoG that are actually fun to play.
The sad thing is, I'm betting the programming teams behind Crysis and SupCom could have improved the trouble spots and given everyone a much more enjoyable gaming experience with a bit more time, but I bet the suits pushed them to ship when it was "good enough". As for the DRM and DLC in more recent titles, that's just management madness through and through.
I agree with most of your post, but I don't think the problem is bugginess.
The real problem is the short-term nature of games. As you say, they're (mostly) designed to be one-shots, with the majority of players playing through the game for a couple of weeks each, within a period of a few months of each other. That is why bugginess is tolerated, not the other way around. It's not worth the time to fix non-critical edge-case problems: by the time a bug can be fixed and tested, the fix will only benefit, say, 10% of total players.
In multiplayer games (and other games where the player is expected to engage with the game for months at a time) then you find that there are frequent updates and bug fixes.
Such is the reason the only game I play is a Quake clone (Urban Terror). It is mostly bug-free, runs on cheap hardware, and most importantly, it is very fun (due to the great gameplay). It also has a great community (with a lot of hacker/nerd types).