There is something special about software engineering that deserves a bit of thought - the discipline is all of 70-80* years old. There are people on HN who are older than software engineering.
Engineering is very much about the accumulation of experience - the reason all the planning is up front isn't because the Secret Society of Engineers said so, but because they are practicing something that has between hundreds (electrical, maybe ~200 years) and thousands (mining, dates back to the stone age) of years experience.
In 100 years, software engineers will have an enormous advantage over almost all non-engineers at designing and running software projects. There'll be a few amateur prodigies that nobody trusts, and putting an engineer in charge of software projects will mysteriously trigger a lack of things going wrong.
Software isn't unpredictable. Nearly nothing is unpredictable. The process of figuring out which bits are predictable isn't finished because the hardware keeps changing and forcing the tech stack to change. When the S-curve of hardware progress flattens, software engineering will gain in importance.
* Changed from 20-30 because that was wrong. Changes none of the arguments.
The first edition of "The Mythical Man Month" is 45 years old, and it was itself a retrospective on a development project that was already a decade old at the time, i.e. 55 years ago now.
The strongest indication that programming is not engineering is how often we forget and reinvent the past. Revolutionary tools and techniques are also invented far too often in CS.
In mechanical engineering we've had screws for a few centuries, and off the top of my head the most recent innovation in screw fastener tech was the creation of the "Torx" screw standard in the 60s. The Torx innovation was driven by smarter torque-controlled automatic screwdrivers on assembly lines, enabled by electric drive tech and early digital control.
While the basic principles are the same and over 120 years old, the combustion engine has been re-invented and re-engineered a gazillion times...
Maybe some of you guys here are young, but there were cars (large SUVs, or sports cars) that would get 8-9 mpg, yet still be slow as molasses compared to today's cars.
So yes, while the basic principles of the combustion engine are the same, it has been re-engineered (from scratch) many times, and each time it would come out slightly better.
Additionally, the process of doing the physical engineering has changed drastically -- computational tools, design space exploration, optimization methods. (recognizing that the grandparent comment stuck specifically to screw-tech, but it's interesting to think about physical-engineering-innovation in broad terms)
Then there are innovations in manufacturing enabling new designs. 3D printing is coming along, so not everybody has to pull from a catalog of standard shapes, thicknesses, etc., for structural design.
You are right that software engineering is a relatively young field, but people have tried to apply engineering principles to software since at least the Fifties. The term itself was coined sometime in the early 60s and was used by the ACM in 1966.
How do you define software engineering such that it's 20-30 years old? Why isn't it 50 years old? I ask because I've heard the argument that software engineering is a young discipline but it feels like you're really stretching it with those 20-30 years.
>When the S-curve of hardware progress flattens, software engineering will gain in importance.
But a lot of software engineering issues are entirely divorced from the underlying hardware. So I'm not sure how the shape of hardware has any significant bearing to our current situation.
>Engineering is very much about the accumulation of experience
I would argue that we actually know a ton about how to build software; it's just not really being taught, valued, or applied. That's an important distinction because it strengthens the analogy of electrical engineer vs electrician. So it's about more than just an accumulation of experience.
> But a lot of software engineering issues are entirely divorced from the underlying hardware. So I'm not sure how the shape of hardware has any significant bearing to our current situation.
To play devil's advocate for the other commenter's position: even just over the time I've been in software since the early/mid 2000s, I've seen things shift a number of times as new hardware creates new demands on our software or obsoletes old development struggles. PC RAM capacities rising 100x from 20+ years ago, more operations coming over a slow and flaky WAN from untrusted devices, fewer people tolerating apps that only work on the tower PC stuck on their desk, more apps expected to sync state across devices because they're cheap enough that we own a bunch of them, use cases we learned to handle with plugged-in tower PCs adapting to power-draw-sensitive mobile devices, mobile devices running at 1.5GHz instead of 16MHz, CPUs becoming fast enough that performance is often a footnote concern in whether to implement crypto, dedicated AI chips changing what's possible on-device...
All of these seem like very reasonable justifications to rethink the way we build software, to change what's standard practice or to invent new tools that solve old problems in a modern context. To some degree the clock on when our industry starts accumulating stable knowledge won't start ticking until these kinds of changes stop happening.
>All of these seem like very reasonable justifications to rethink the way we build software, to change what's standard practice or to invent new tools that solve old problems in a modern context. To some degree the clock on when our industry starts accumulating stable knowledge won't start ticking until these kinds of changes stop happening.
But we already have stable knowledge that's simply not considered valuable. Your list of changes over the decades is real, but I don't understand why the attitude is "unless things stay stable over the next 30 years, there's no point in trying".
All those SOLID, agile, TDD, DDD, micro-services memes that are being implemented so superficially and cargo-cultishly are actually grounded in a real understanding that has a context, constraints, and trade-offs. The issue is that superficial and cargo-cult application. "Software engineers" that decide they're going to do micro-services without understanding what it solves, how it solves it and what they're trading off. And a business that doesn't care or know enough to ask for better. Maybe they don't need better! Maybe it's ok to slap together and the product will be equally mediocre whether the devs use cargo-cult pattern X or not. Maybe the quality doesn't matter that much to the business. In which case the devs being hired also don't need to actually know that much software engineering. In which case maybe the idea of electrical engineer vs electrician makes sense here too.