
My co-founder tried out some of the techniques in the paper on our system: "I was skeptical, probably because I've never done much performance tuning like this. But I changed (triples) to create a 2-d array of fixnums instead of the default list of lists from the select. I added the declarations. It's about 100x faster. 100!" Very nice :)
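
For anyone curious what that kind of change looks like, here's a hypothetical sketch (the names, dimensions, and helper functions are made up, not the actual (triples) code). The point is that a specialized 2-d fixnum array plus type declarations lets SBCL open-code the array accesses instead of chasing cons cells through a list of lists:

    ;; Hypothetical sketch -- not the actual (triples) code.
    (defun make-triple-table (rows)
      "Copy ROWS, a list of 3-element lists of fixnums, into a 2-d fixnum array."
      (let ((table (make-array (list (length rows) 3) :element-type 'fixnum)))
        (loop for row in rows
              for i of-type fixnum from 0
              do (setf (aref table i 0) (first row)
                       (aref table i 1) (second row)
                       (aref table i 2) (third row)))
        table))

    (defun triple-ref (table i j)
      ;; The type declarations let SBCL compile AREF down to a direct
      ;; indexed load rather than a generic array access.
      (declare (type (simple-array fixnum (* 3)) table)
               (type fixnum i j)
               (optimize (speed 3)))
      (aref table i j))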



When I mentioned that it may be possible for a web startup in beta to improve the speed of their code by a factor of 1024 over no particular timescale, I was downmodded by five points ( http://news.ycombinator.com/item?id=187692 ).

I can only advise that you shouldn't quit while you're ahead. Now that you've eliminated the most obvious performance bottlenecks, you may find that less obvious bottlenecks have emerged. Removing these could give you an additional factor of 10 improvement.


We're using the SBCL profiler on a regular basis to speed up the system whenever we find an area that feels a bit sluggish. We've made very large speed gains with only minor coding modifications.
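
For anyone who hasn't used it, SBCL's deterministic profiler takes only a couple of forms (a minimal sketch; MY-APP stands in for your package name and the workload function is hypothetical):

    ;; Wrap every function in the MY-APP package with timing/count hooks,
    ;; exercise the slow code path, then print a per-function report.
    (sb-profile:profile "MY-APP")
    (render-slow-page)            ; hypothetical workload
    (sb-profile:report)
    (sb-profile:unprofile "MY-APP")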


I'm very impressed with your quantitative approach. I'm also very curious about the kinds of modifications you're making. What proportion are local changes affecting common cases, what proportion are local changes affecting corner cases, and what proportion are architectural? Is it 70:20:10?


That's a good question. Our workflow is to look at the profiler results and work on the functions that are taking the most time. Sometimes we'll find a function being called far too many times for the page being viewed, so we'll look into the program logic to fix that. Our code is highly integrated: most pages use functions or methods common to many others, so a fix (or a bug) affects a large part of the system. The example 100x triples improvement sped up the entire system, since triples are at the core of our security and relations. This is a long-winded way of saying I'm not sure what the proportions would be; a guess would be 60:10:30.
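
For the "called far too many times" case, one common fix (a hedged sketch, not necessarily what we did) is to cache the expensive result for the duration of a request:

    ;; Hypothetical sketch: memoize an expensive lookup so a page that
    ;; asks the same question hundreds of times only computes it once.
    (defvar *request-cache* (make-hash-table :test #'equal))

    (defun cached-triples-for (subject)
      (multiple-value-bind (value present-p)
          (gethash subject *request-cache*)
        (if present-p
            value
            (setf (gethash subject *request-cache*)
                  (expensive-triples-lookup subject)))))  ; hypothetical

    ;; Clear *request-cache* at the start of each request so stale
    ;; results never leak between page views.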


Probably not. Remember Amdahl's law: http://en.wikipedia.org/wiki/Amdahl%27s_law
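
(For reference: if a fraction p of total runtime is sped up by a factor s, Amdahl's law gives an overall speedup of 1 / ((1 - p) + p/s). A one-liner to play with:)

    ;; Overall speedup when a fraction P of runtime is sped up by factor S.
    (defun amdahl-speedup (p s)
      (/ 1 (+ (- 1 p) (/ p s))))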


I'm aware of Amdahl's law. I also know of a case where adding an index to a production database gave a factor-of-1000 improvement. If your design started as a proof of concept, you made a rash assumption, you had a tight deadline, or you're just plain incompetent, then there's almost no lower bound to initial system efficiency.


My comment was aimed at your down-modded comment. You can make 10% performance improvements in dozens of modules throughout your system, but if they only account for, say, 5% of the total execution time, it won't make much difference.
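
Worked out with the sketch above: (amdahl-speedup 0.05 1.1) => ~1.0046, i.e. 10% improvements confined to 5% of total runtime buy you less than half a percent overall.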


Premature optimization may be the root of all evil, but failure to optimize isn't too hot either.

Glad to hear your results were so dramatic.


We only optimize when things start feeling a bit slow - never prematurely.


How did this comment end up in the RSS feed?

Interestingly, the "comments" link for it points to http://news.ycombinator.com/item?id=196342 -- the original post without any comments shown!


It was submitted as a standalone post. There are no comments on the new submission.



