> We used to talk about algorithms in terms of big-O notation.

This is something CS majors often have to unlearn when they enter the world of practical programming. Big-O just isn't that useful for reasoning about real algorithms, because in practice n is always bounded and constant factors end up dominating.
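A minimal C sketch of that point (the array size and lookup count are made-up numbers, not from the comment): at a small, bounded n, an O(n) linear scan over a sorted array can run about as fast as, or faster than, an O(log n) binary search, because the scan is branch-predictable and walks memory sequentially. Compile with something like gcc -O2 and compare the two timings; where the crossover sits depends on the hardware.

  /* Sketch: linear scan vs. binary search over a small sorted array. */
  #include <stdio.h>
  #include <time.h>

  #define N 64                 /* hypothetical small, bounded n */
  #define LOOKUPS 10000000

  static int linear(const int *a, int n, int key) {
      for (int i = 0; i < n; i++)
          if (a[i] == key) return i;
      return -1;
  }

  static int binary(const int *a, int n, int key) {
      int lo = 0, hi = n - 1;
      while (lo <= hi) {
          int mid = lo + (hi - lo) / 2;
          if (a[mid] == key) return mid;
          if (a[mid] < key) lo = mid + 1; else hi = mid - 1;
      }
      return -1;
  }

  int main(void) {
      int a[N];
      for (int i = 0; i < N; i++) a[i] = 2 * i;   /* sorted data */

      volatile int sink = 0;                      /* keep loops from being optimized away */
      clock_t t0 = clock();
      for (int i = 0; i < LOOKUPS; i++) sink += linear(a, N, (i * 7) % (2 * N));
      clock_t t1 = clock();
      for (int i = 0; i < LOOKUPS; i++) sink += binary(a, N, (i * 7) % (2 * N));
      clock_t t2 = clock();

      printf("linear: %.2fs  binary: %.2fs\n",
             (double)(t1 - t0) / CLOCKS_PER_SEC,
             (double)(t2 - t1) / CLOCKS_PER_SEC);
      return (int)sink;
  }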




Big-O is also not so great in the real world, where you should often be more concerned about cache misses and alignment than about asymptotic complexity.
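To make that concrete, here's a hedged C sketch (the matrix size is an arbitrary choice) that sums the same matrix in row-major and then column-major order. Both traversals do identical O(n^2) work, but the strided one misses cache on most accesses and is typically several times slower.

  /* Sketch: same work, very different cache behavior. */
  #include <stdio.h>
  #include <time.h>

  #define N 4096                /* hypothetical size: ~128 MiB of doubles */

  int main(void) {
      static double m[N][N];    /* zero-initialized in static storage */
      double sum = 0.0;

      clock_t t0 = clock();
      for (int i = 0; i < N; i++)
          for (int j = 0; j < N; j++)
              sum += m[i][j];   /* row-major: sequential, cache-friendly */
      clock_t t1 = clock();
      for (int j = 0; j < N; j++)
          for (int i = 0; i < N; i++)
              sum += m[i][j];   /* column-major: strided, cache-hostile */
      clock_t t2 = clock();

      printf("row-major: %.2fs  column-major: %.2fs  (sum=%f)\n",
             (double)(t1 - t0) / CLOCKS_PER_SEC,
             (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
      return 0;
  }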


Yes, and realistically you're just as likely to benefit from investigating lock and I/O contention. I wish my work were mostly optimizing single-threaded CPU bottlenecks; that would make things a lot easier.
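As an illustration of the lock-contention half of that (a hypothetical workload, not anyone's real code), here's a small pthreads sketch comparing one mutex-protected shared counter against purely thread-local counters merged at the end. Compile with -pthread; the wall-clock timings show the contended version serializing on the lock.

  /* Sketch: contended shared counter vs. thread-local counters. */
  #include <pthread.h>
  #include <stdio.h>
  #include <time.h>

  #define THREADS 8
  #define ITERS   2000000

  static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
  static long shared_count = 0;

  static double now(void) {                 /* wall-clock seconds */
      struct timespec ts;
      clock_gettime(CLOCK_MONOTONIC, &ts);
      return ts.tv_sec + ts.tv_nsec / 1e9;
  }

  static void *contended(void *arg) {
      (void)arg;
      for (int i = 0; i < ITERS; i++) {
          pthread_mutex_lock(&lock);        /* every increment fights for the lock */
          shared_count++;
          pthread_mutex_unlock(&lock);
      }
      return NULL;
  }

  static void *uncontended(void *arg) {
      volatile long n = 0;                  /* volatile: keep the loop from folding */
      for (int i = 0; i < ITERS; i++)
          n++;                              /* no shared state touched in the loop */
      *(long *)arg = n;
      return NULL;
  }

  int main(void) {
      pthread_t t[THREADS];
      long local[THREADS] = {0};

      double t0 = now();
      for (int i = 0; i < THREADS; i++) pthread_create(&t[i], NULL, contended, NULL);
      for (int i = 0; i < THREADS; i++) pthread_join(t[i], NULL);
      double t1 = now();
      for (int i = 0; i < THREADS; i++) pthread_create(&t[i], NULL, uncontended, &local[i]);
      for (int i = 0; i < THREADS; i++) pthread_join(t[i], NULL);
      double t2 = now();

      long total = 0;
      for (int i = 0; i < THREADS; i++) total += local[i];
      printf("contended: %ld in %.2fs   thread-local: %ld in %.2fs\n",
             shared_count, t1 - t0, total, t2 - t1);
      return 0;
  }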

If you're dealing with disk access, sequential access is often so much faster than random I/O, even on SSDs (especially for writes), that Big-O can be very misleading to look at.
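A rough C sketch of that effect (the file name, block size, and total size are assumptions, and it uses buffered writes plus fsync, so for a stricter measurement you'd want O_DIRECT with aligned buffers): it writes the same 256 MiB once sequentially and once at random 4 KiB offsets, and the sequential pass usually wins by a wide margin.

  /* Sketch: sequential vs. random-offset writes to the same file. */
  #include <fcntl.h>
  #include <stdio.h>
  #include <stdlib.h>
  #include <time.h>
  #include <unistd.h>

  #define BLOCK  4096
  #define BLOCKS 65536                      /* 256 MiB total */

  static double now(void) {                 /* wall-clock seconds */
      struct timespec ts;
      clock_gettime(CLOCK_MONOTONIC, &ts);
      return ts.tv_sec + ts.tv_nsec / 1e9;
  }

  int main(void) {
      char buf[BLOCK] = {0};
      int fd = open("testfile", O_CREAT | O_RDWR, 0644);
      if (fd < 0) { perror("open"); return 1; }

      double t0 = now();
      for (long i = 0; i < BLOCKS; i++)               /* sequential writes */
          pwrite(fd, buf, BLOCK, i * (off_t)BLOCK);
      fsync(fd);
      double t1 = now();

      srand(42);
      for (long i = 0; i < BLOCKS; i++) {             /* random-offset writes */
          long blk = rand() % BLOCKS;
          pwrite(fd, buf, BLOCK, blk * (off_t)BLOCK);
      }
      fsync(fd);
      double t2 = now();

      printf("sequential: %.2fs  random: %.2fs\n", t1 - t0, t2 - t1);
      close(fd);
      unlink("testfile");                             /* clean up the scratch file */
      return 0;
  }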



