He goes on a bit too long in this post, but I hope that the folks piling on to Blaine (who I don't know, but doubt is as incompetent as he's been made out to be) will read it. Scaling is not a simple problem, and many aspects of it are unique to the specific application. Given that Twitter runs on a not-very-fast framework built on a not-very-fast language, and rapidly became the largest site running that framework and language, one should really step back and ask, "is one IT guy going to be able to fix this?" The answer is probably going to be "no".

Premature optimization is the root of all evil because it leads to micro-optimizations rather than architectural optimizations. Since Twitter seems to be running an awful lot of boxes to ship around tiny (really tiny) messages, I can't help but think they need some architectural optimizations, and that's something it takes the whole team to accomplish. One guy, particularly one guy without the power to make people fix their code, is going to be helpless in the face of that--he'll keep throwing hardware and edge caching at the problem (edge in the sense of the edge of the outgoing network rather than the traditional sense of close to users, and without much awareness of the data or application). Sometimes that will work, sometimes it won't.

I don't know Blaine or the Twitter situation, but I do know that scaling a very large site is never simple (my previous business was focused on web caching and scalability, and I've dealt with several quite large scaling situations--for sites burning way more bandwidth than Twitter). People outside the situation probably shouldn't decide they know what went wrong and who was to blame.




The "premature optimization" advice is usually taken to mean "worry about performance later". But it can be very hard to make a system performant after the fact, when the important design decisions have long been made and a lot of code has been built on top of them.

I think what the advice really means is don't optimize unnecessarily. Figuring out what's necessary vs. what's frivolous optimization is actually a hard judgment call, so people (often good programmers who want to follow best practices) fall back on not optimizing at all, until they run smack into a brick wall. By then it can be too late.

I too know nothing about the Twitter situation, but it wouldn't be unusual if Blaine had nothing to do with the design decisions that are the root cause of their performance problems, or if he had been asked to "make it scale" without actually having any power to change the system in the necessary ways.


I didn't mean to rile the "optimization is more awesome than donuts" brigade. Really. I was just trying to say (unsuccessfully) that they're probably approaching the problem wrong, if one guy is expected to solve performance problems without giving him the power to make deep architectural changes. Let's pretend I never mentioned optimization, premature or otherwise.

You go right ahead and optimize early, often, and from both ends. I don't want to get in the way of a man and his passion for optimized code.


Oh, I wasn't disagreeing with you at all. I liked everything you said and thought I was just adding to it. In fact, I originally said something like "All this is by way of agreeing with everything you said" and then took it out because it seemed redundant!


I was merely pointing out the humor in the fact that whenever someone says "premature optimization is the root of all evil", a large number of people spring to the defense of premature optimization. I wasn't picking on your post in particular--just amused that an off-the-cuff comment which didn't really express my intentions very well was the one sentence that drew a very vocal response from several people, and yours was the first (the post that sounded the call for the Premature Optimization Defense League).

It's a failing on my part for even bringing up premature optimization at all, particularly if I wanted to discuss anything other than premature optimization (or evil). I've been programming for over 20 years...I should have known what would happen.


I'd suggest "never optimize code without profiling" might work as a better maxim than "avoid premature optimization". Profilers are great for pointing out where you're likely to get the greatest return on investment when trying to optimize.
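
To make that concrete, here's a rough Python sketch of the habit I mean (the function and the numbers are made up purely for illustration): run the code under the built-in profiler and let the stats tell you where the time actually goes before touching anything.

    import cProfile
    import pstats

    def build_report(n):
        # Deliberately naive string concatenation -- the sort of hotspot
        # a profiler surfaces immediately.
        out = ""
        for i in range(n):
            out += str(i) + ","
        return out

    if __name__ == "__main__":
        # Profile the call and print the five entries with the highest
        # cumulative time; optimize those first, not whatever "looks slow".
        cProfile.run("build_report(200000)", "profile.out")
        pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)

Nothing clever, but it points the effort at the actual cost instead of at a guess.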


Amen! I optimize incrementally; sometimes when you're writing something you can tell it will be CPU-intensive or I/O-intensive. You don't know how much, but you optimize it enough so that it is at least up to par with the rest of the code. It's definitely no good to leave all performance consideration to the end of the project, any more than it would be to leave debugging to the end, and for some of the same reasons. If you have an algorithmic problem that kills performance, it might be too late to change the code to fix it. Plus, just as one bug can hide another, many performance problems won't show up while another, slower problem is overshadowing them. You have to keep a basic level of speediness in the code as it's under development in order to spot sudden changes in overall performance.
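
In that spirit, here's a rough sketch (Python, with names and thresholds invented for the sake of example) of the kind of lightweight check that keeps a baseline level of speediness visible while the code is still changing:

    import time

    def timed(label, fn, *args, baseline_s=None, slack=2.0, **kwargs):
        # Run fn, measure how long it took, and complain loudly if it came in
        # more than `slack` times over the baseline recorded on an earlier run.
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        if baseline_s is not None and elapsed > baseline_s * slack:
            print(f"WARNING {label}: {elapsed:.3f}s vs. {baseline_s:.3f}s baseline")
        return result

    # Hypothetical usage during development, with a baseline from a previous run:
    #   messages = timed("parse_feed", parse_feed, raw_data, baseline_s=0.05)

It's crude, but it's enough to notice when a change suddenly doubles the cost of something.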



