
I think it's an interesting demonstration of the rather sad fact that, despite hardware today being many orders of magnitude faster than what was around two or three decades ago, software has only grown more inefficient. A huge amount of processing power, and human time whenever people are kept waiting, is wasted as a result. This is a text editor, albeit a featureful one, whose resource consumption is disproportionate to its capabilities. It's what happens when countless abstractions are piled on top of one another with little regard for the consequences.

Even Emacs is smaller, and it's just as customisable and featureful, if not more so.

Here is a nice rebuttal to that "premature optimisation" myth: http://ubiquity.acm.org/article.cfm?id=1513451

From personal experience, by the time a system gets too big and slow, the inefficiency is usually distributed throughout it, so it becomes difficult to single out any one component as the cause, and there is no easy way to optimise such a system short of redesigning and rewriting it. You could say the problem is that "the whole is bigger than the sum of its parts."
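
As a rough illustration of that kind of "flat" profile (a hypothetical toy pipeline, not taken from any real editor): each stage below adds a comparable bit of overhead, so the profiler output shows no single hotspot worth attacking.

    # Hypothetical toy pipeline: every stage does a similar amount of small,
    # avoidable work, so the cost is smeared evenly rather than concentrated.
    import cProfile

    def parse(data):      return [int(x) for x in data]
    def validate(items):  return [x for x in items if x >= 0]
    def transform(items): return [x * x for x in items]
    def serialize(items): return ",".join(str(x) for x in items)

    def pipeline(data):
        return serialize(transform(validate(parse(data))))

    payload = [str(i) for i in range(5000)]
    # Sorting by tottime shows parse/validate/transform/serialize each taking
    # a comparable slice -- there is no obvious single function to optimise.
    cProfile.run("for _ in range(200): pipeline(payload)", sort="tottime")

In a profile like that, "optimise the hot spot" has nothing to grab onto; the only real fix is structural.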



