
Everything I've seen about revolutionizing software points to the likelihood of it succeeding being zero. There's a lifecycle to it, and what we preserve are data and protocols, but not the incidental complexity of the systems themselves. At every point of the continuum, computer technology is more complex than it has to be in an academic sense ("why have a computer at all if you can do equations in your head?"), but it solves enough problems that it stays alive.

Or to see it in a different light: once it's programmable, you've doomed it to die, the more so the more programmable it is. And this is borne out by how fast we burn through hardware. Our code survives best where it's driven towards a known destination format - e.g. an old TeX source document is more likely to be rebuildable into a rendered artifact than equivalent C code is into usable software.

The Lisp or Smalltalk attitude to this - which is to remove code/data boundaries altogether - mostly seems to add further uncertainty and leave less room for curated archival. Either the whole thing runs or it doesn't.




Good points - see my comment below ("With regards to standing on the shoulders of giants...").

I wonder if there is some set of rules you could enforce that would keep a programmable thing in a relatively clean / non-bloated state. Of course, once you introduce more rules, you reduce freedom...



