
We'll have to agree to disagree, then. Technologies such as the GPU provided such massive improvements that you either had to get on board or be left behind. It was the same with assembly-line vehicle production, and then robotic vehicle production. Some technological enhancements are so significant that disruption is inevitable, despite current "lock-ins".

And that's fine. We're never going to reach 100% efficiency in anything, ever (or 90% for that matter). We're always going to go with what works now and requires the least retooling - UNLESS it's such a radical efficiency change that we simply must go along. THOSE are the innovations people actually care about. The 10-20% efficiency improvements, not so much (and rightly so).



You restate that disruptive innovation happens when gains are large enough to overcome inertia, and that smaller conceptual shifts aren't worth pursuing. Your premise is pragmatic: if it mattered, the market would already have adopted it.

This still sidesteps the article's point that what we measure as "efficiency" is itself historically contingent. GPUs succeeded precisely because they exploited massive parallelism within an already compatible model, not because the ecosystem suddenly became open to new paradigms. Your example actually supports the article's argument about selective reinforcement.

There's nothing to agree to disagree about. You're arguing a point the article does not make.


I get the feeling that we're talking past each other now...



