One of the answers there specifically disparages big-O analysis of algorithms.
This is actually sometimes important. A self-taught programmer, who really was a whiz at coding, had some prototype code for data analysis. It worked great on the sample data, but it ran in O(n^3) time (average case, not just worst case), and we had a hell of a time convincing him that it wouldn't scale to the quantities of data we had.
Of course constant factors make a difference, but it is frustrating to waste time trying to convince someone that n log n is going to beat n^3 on a large data set.
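To make the point concrete, here is a minimal sketch (the sizes are hypothetical, chosen just to illustrate the growth rates) comparing n^3 against n log n while ignoring constant factors entirely:

```python
import math

# Illustrative only: compare the growth of n^3 vs n*log2(n),
# ignoring constant factors, to show how fast the gap opens up.
for n in (10, 1_000, 1_000_000):
    cubic = n ** 3
    nlogn = n * math.log2(n)
    print(f"n={n:>9,}  n^3={cubic:.2e}  n log n={nlogn:.2e}  "
          f"ratio={cubic / nlogn:.1e}")
```

At n = 1,000,000 the ratio is around 10^10, so even a constant factor of a million in favor of the cubic algorithm would not save it at that scale.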