> We used to talk about algorithms in terms of big-O notation.
This is something CS majors often have to un-learn as they enter the world of practical programming. Big-O is just not that useful for reasoning about real-world performance, because in practice n is always bounded.
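To make the "bounded n" point concrete, here is a minimal sketch (my own illustration, not from the thread): for a small, fixed n, an O(n) linear scan can compete with, or beat, an O(log n) binary search, because the asymptotically "worse" algorithm has a smaller constant factor. The function names and the choice of n = 16 are mine; actual timings vary by machine.

```python
import bisect
import timeit

def linear_contains(xs, x):
    # O(n): straight scan, trivial per-element cost.
    for v in xs:
        if v == x:
            return True
    return False

def binary_contains(xs, x):
    # O(log n): bisect on a sorted list, but with more per-call overhead.
    i = bisect.bisect_left(xs, x)
    return i < len(xs) and xs[i] == x

xs = list(range(16))  # small, bounded n -- the common practical case
t_lin = timeit.timeit(lambda: linear_contains(xs, 13), number=100_000)
t_bin = timeit.timeit(lambda: binary_contains(xs, 13), number=100_000)
print(f"linear: {t_lin:.3f}s  binary: {t_bin:.3f}s")
```

On many machines the linear scan wins at this size; only as n grows does the asymptotic advantage of binary search take over, which is exactly why big-O alone can mislead when n is capped.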
Yes, and realistically, you're just as likely to benefit from investigating lock and I/O contention. I wish my work were mostly optimizing single-threaded CPU bottlenecks; that would make things a lot easier.
If you're dealing with disk access, sequential access is often so much faster than random I/O, even on SSDs (and especially for writes), that Big-O alone can be very misleading.
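A quick way to see this for yourself is a sketch like the following (my own example, not from the thread): both loops issue the same number of identical-sized writes, so they are indistinguishable in big-O terms, yet the access pattern alone can change the wall-clock time. The 4 KiB block size and 8 MiB total are arbitrary choices, and the OS page cache can mask the gap on small files.

```python
import os
import random
import tempfile
import time

def timed_write(path, offsets, block):
    # Write one block at each offset, then fsync so the writes
    # actually reach the device before we stop the clock.
    start = time.perf_counter()
    with open(path, "wb") as f:
        for off in offsets:
            f.seek(off)
            f.write(block)
        f.flush()
        os.fsync(f.fileno())
    return time.perf_counter() - start

BLOCK = 4096                 # 4 KiB per write
N = 2048                     # 8 MiB total
block = b"\0" * BLOCK

seq_offsets = [i * BLOCK for i in range(N)]
rand_offsets = seq_offsets[:]
random.shuffle(rand_offsets)  # same writes, scattered order

with tempfile.TemporaryDirectory() as d:
    seq = timed_write(os.path.join(d, "seq.bin"), seq_offsets, block)
    rnd = timed_write(os.path.join(d, "rand.bin"), rand_offsets, block)
    print(f"sequential: {seq:.3f}s  random: {rnd:.3f}s")
```

Both runs are O(n) writes of n blocks; any difference you measure comes entirely from the constant factors (seek behavior, write amplification, caching) that the notation throws away.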