
> I disagree; math would be a closer analogy. And indeed, arithmetic still works like it did a millennium ago. Closer to the present, I have binaries from the late 80s that still work today (and I use them semi-regularly).

Sure, those binaries might behave the same when executed (though the probability of that is never quite 100%), and as you pointed out, the rules of arithmetic aren’t expected to change any time soon. Unfortunately, software does not exist in its own micro-verse; it’s subject to the laws of physics acting on the machines it runs on. So while you might be able to write scripts that still work decades later, it’s much harder to ensure those scripts keep running for decades. RAM chips, CPUs, and everything in between are guaranteed to eventually fail if left running unsupervised in perpetuity, and entropy rises with complexity.

At Twitter’s scale, running a software service means globally distributed cloud infrastructure. They likely have hundreds of services deployed across many hardware instances around the globe. Twitter isn’t one script running once and producing a single result; it’s hundreds if not thousands of systems interacting with one another across many physical machines. Layers of redundancy help, but cascading failures are ultimately a mathematical certainty. Many would argue the best strategy for reducing downtime in such systems is to optimize for low recovery time when you inevitably do fail.
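The "optimize for recovery time" point can be made concrete with the classic steady-state availability formula, availability = MTBF / (MTBF + MTTR). A minimal sketch with illustrative numbers (not Twitter's):

```python
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: the fraction of time the system is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical system that fails about once a month (~720h MTBF).
# Since unavailability is roughly MTTR / MTBF, halving recovery time
# buys about as much uptime as doubling time-between-failures would.
slow_recovery = availability(720, 4.0)  # 4 hours to recover
fast_recovery = availability(720, 0.5)  # 30 minutes to recover
print(f"slow: {slow_recovery:.5f}, fast: {fast_recovery:.5f}")
```

This is why, once failures are treated as inevitable, shrinking MTTR is often a cheaper lever than chasing ever-longer MTBF.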

Software is also bound to the world in other ways. Just as most business processes, products, and tools more generally change over time, so too do the requirements placed on the software systems built to facilitate or automate them.

Ultimately the only way to escape the maintenance cost of software is to stop running it. The longer you leave a software system running, the more likely it is to eventually stop.
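That last claim is just compounding probability. A sketch with made-up numbers: even if an unattended system survives any given day with probability 0.999, the chance it is still running after t days decays exponentially:

```python
def p_still_running(days: int, p_daily: float = 0.999) -> float:
    """Probability of zero failures after `days` days of unattended operation,
    assuming independent daily survival probability `p_daily`."""
    return p_daily ** days

# Survival odds shrink from near-certain to near-zero as the horizon grows.
for days in (30, 365, 3650):
    print(days, round(p_still_running(days), 3))
```

With these assumed numbers, a month unattended is fine, a year is dicey, and a decade is a near-certain outage.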



