This sometimes doesn't work out in practice because the scaling runs into a limitation your big-O model didn't account for. Typical examples: the size of the machine registers, physical RAM, addressable storage, or transmission speeds.

If your O(1) algorithm takes an hour for any input and the competition is O(n), it may seem like there must be cases where you're better. Then you realise n is the size in bytes of some data in RAM, and your competitors can do 4GB per second. You won't be competitive until we're talking about roughly 15TB of data, and then you remember you only have 64GB of RAM.
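
A quick back-of-the-envelope sketch makes the crossover concrete (a minimal sketch assuming the numbers above: a fixed one-hour cost for the O(1) algorithm and a 4GB/s streaming rate for the O(n) one):

    # Back-of-the-envelope crossover: a constant-time algorithm with a
    # large fixed cost vs. a linear scan with high throughput.
    # Hypothetical numbers, taken from the comment above.

    O1_SECONDS = 3600.0          # O(1) algorithm: always takes an hour
    SCAN_BYTES_PER_SEC = 4e9     # O(n) competitor: streams 4GB/s

    def linear_seconds(n_bytes: float) -> float:
        """Time for the O(n) scan over n_bytes of input."""
        return n_bytes / SCAN_BYTES_PER_SEC

    # The O(1) algorithm only wins once the scan takes over an hour:
    # n > 3600 s * 4e9 B/s = 1.44e13 bytes, i.e. about 14.4 TB.
    crossover = O1_SECONDS * SCAN_BYTES_PER_SEC
    print(f"crossover at {crossover / 1e12:.1f} TB")        # -> 14.4 TB
    print(f"1TB scan takes {linear_seconds(1e12):.0f} s")   # -> 250 s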

Big-O complexities are not useless, but on their own they're a poor summary, about as useful as knowing the mean price of items at a supermarket. I guess this supermarket is cheaper? Or it offers more small items? Or it has no high-end items? Or something?


