Hacker News

I would speculate that even if hardware progress were to stop today, we would still see advancement in the speed of computer systems for quite some time. All just from optimizing what we are already doing, or (as others have pointed out) not doing unnecessary things.

There are so many things I'd like to fix if I had the time...




You're clearly an optimist. I find that the tendency is for software to get slower -- and generally worse -- over time.

Five years ago I was using Chrome and it was fast, lightweight, and stable. Now it's slow, uses 500 MB of RAM, and the renderer crashes on a regular basis.


A couple of things might influence this observation:

1) Your definition of what is fast changes. Downloading an MP3 in 5 minutes felt really fast in 1998, but would feel really slow today.

2) What we expect Chrome to do keeps increasing. Since Chrome was so fast, people started creating webpages that had to do more (because they could!). It is a form of induced demand (https://en.wikipedia.org/wiki/Induced_demand).


#2 is the big one. If you could surf 2010's web with Chrome 45, I bet it would be a better experience than 2010's web on the Chrome of the day.


I think there may be a counterintuitive situation where incremental improvement in hardware is worse than no improvement.

Consider how much extra performance game developers are able to wring out of a console after a few years' experience. If the hardware is absolutely stable, and there's no expectation that it can get faster, then people do make a big effort on software.


The biggest example of this is the demoscene, where some absolutely astounding things have been done with old hardware; e.g. on a Commodore 64, a 1 MHz 8-bit CPU with 64 KB of RAM.


In relation to this, user interfaces seem to be designed to a target response time. For games maybe it's 100 ms; for a game console's dashboard, maybe 500 ms; for an ATM, more like 4000 ms.

I suspect the target response time, not the speed of the hardware, is what determines how slow things are. As people have pointed out, hardware has gotten faster while many user interfaces have gotten slower, so it seems more likely that either the target response times have been rising (people tolerate higher response times in exchange for more features or whatever), or people with older, slower computers observe much higher response times than the original target (an iPhone 3GS vs. an iPhone 5 running the same app, for instance).

Stable hardware solves the second issue at least, but I'm not sure about the first. I think over time people will notice that everything else responds instantly to touch, and computers should too, so they will start rejecting apps with a high target response time.

As an example of that, I chose my last monitor by looking at http://www.displaylag.com/ to get the minimum latency there. If other people start caring more about the latency of the things they use, hardware and software developers will prioritize that.
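The target-response-time idea above can be sketched as a simple budget check. This is a hypothetical illustration, not anything from the thread; the 100 ms figure just borrows the games budget mentioned above:

```python
import time

TARGET_MS = 100.0  # hypothetical budget, borrowing the games figure above

def timed_response(handler) -> float:
    """Run one input-to-response cycle and return its latency in milliseconds."""
    start = time.perf_counter()
    handler()  # the work done between input and visible response
    return (time.perf_counter() - start) * 1000.0

latency = timed_response(lambda: sum(range(100_000)))
print(f"{latency:.2f} ms ({'within' if latency <= TARGET_MS else 'over'} budget)")
```

The point of the comment is that developers tend to add work until the measured latency approaches whatever `TARGET_MS` they consider acceptable, so the budget, not the hardware, ends up setting the perceived speed.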


It used to be a browser just went to the destination and showed you what was there. This was fast. Now every time you visit a site, a giant war goes on behind the scenes between what the site wants to force you to see and what you actually want to see. This may have something to do with it.


If hardware speeds didn't change anymore, more effort would go into speed. Right now it's too easy to add stuff and not feel much of an impact as long as you follow every hardware upgrade (ruining it for everybody who doesn't); if people had to squeeze more performance out of their code, they would. (See the iteration of games for old platforms, where people did the craziest things over the years to get results that were thought impossible.)


Well, I have finally come to realize that this is indeed the reality.

"Innovations in hardware are to give programmers freedom to build shitty software and get away with it."


Well, yeah, it is certainly possible for software projects to go the other way. It depends on what the focus is.

If people knew that their only avenue for speed improvements was optimization (and to a certain extent, that is true already), they might focus more on that instead of adding features.


I'm still hoping for the whole stack to flatten significantly with PL guys managing to allow for multistage/multilayer JIT optimization over evaluation towers.



One of my favorite quotes (wish I knew where it originated): "I can't make it run any faster, but I can make it do less."
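In that spirit, "making it do less" often just means not repeating work already done. A minimal sketch of my own (not from the thread), using memoization: each call does the same work as before, but repeated subproblems are skipped entirely.

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # cache results so each distinct n is computed once
def fib(n: int) -> int:
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# The naive recursion does exponentially many calls; with the cache,
# fib(60) completes instantly because it "does less".
print(fib(60))
```

The function isn't any faster per call; it simply avoids making most of the calls.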


Brings to mind the optimisation of grep, covered previously on HN.





