I was expecting the end of Moore's Law to give us a renaissance in algorithmic thinking, but The Cloud has shown me I was wrong: first we're going to have to fully explore Amdahl's law.
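Amdahl's law puts a hard ceiling on what throwing more machines at a problem can buy you. A minimal sketch of the formula (the function name and parameter values are just illustrative):

```python
def amdahl(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the work gets an
    s-way speedup and the remaining (1 - p) stays serial:
    speedup = 1 / ((1 - p) + p / s)."""
    return 1.0 / ((1.0 - p) + p / s)

# Even with effectively infinite parallelism, a 5% serial
# fraction caps the overall speedup near 20x.
print(amdahl(0.95, 1e9))
```

The punchline is that the serial fraction, not the core count, dominates once s gets large.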
Eventually every problem goes to log n time, best case. The log n factor shows up over and over, from constraints on on-chip cache to communication delays to adding numbers of arbitrary size. We make a lot of problems look like they're O(1) until the moment the data doesn't fit into one machine word, one cache line, one cache, into memory, onto one machine, onto one switch, into one rack, into one room, into one data center, into one zip code, onto one planet.
If we can't solve the problem for all customers, we dodge: we pick a smaller, more lucrative problem that only works for a subset of our customers, and then pretend we could solve any problem we wanted to, we just didn't want to solve that one.
Herb Sutter gave a wonderful talk on the constant-factor plague. As much as I like Clojure and similar convenience and algorithmic beauty, I've appreciated the devil-is-in-the-details much more since watching that video (forgot the title, sorry).
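The constant-factor point is easy to demonstrate even without leaving Python: two ways of summing the same list, identical O(n) complexity, wildly different constants (per-element interpreter dispatch versus one tight C-level loop). A small sketch, with illustrative names:

```python
import timeit

data = list(range(100_000))

def sum_python(xs):
    """O(n) with a large constant: one bytecode dispatch per element."""
    total = 0
    for x in xs:
        total += x
    return total

def sum_builtin(xs):
    """O(n) with a small constant: the loop runs inside the C runtime."""
    return sum(xs)

assert sum_python(data) == sum_builtin(data)
t_slow = timeit.timeit(lambda: sum_python(data), number=20)
t_fast = timeit.timeit(lambda: sum_builtin(data), number=20)
print(f"python loop: {t_slow:.4f}s  builtin sum: {t_fast:.4f}s")
```

Same Big-O on paper; in practice the gap is typically several-fold, which is exactly the plague the talk is about. Cache-line and memory-layout effects (Sutter's main examples) widen it further in lower-level languages.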