Dude. Be fair. You left out the part where he says:
> In other cases, operations have been made faster by changing the algorithmic complexity of an operation. It’s often best when writing software to first write a simple implementation, one that’s easily maintained and easily proven correct. However, such implementations often don’t exhibit the best possible performance, and it’s not until a specific scenario comes along that drives a need to improve performance does that happen. For example, SortedSet<T>’s ctor was originally written in a relatively simple way that didn’t scale well due to (I assume accidentally) employing an O(N^2) algorithm for handling duplicates.
You'll never get anything done if you want to get it perfect the first time round. Or as they say, first make it work, then make it right, then make it fast.
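To make the quoted point concrete, here's a rough sketch (mine, not the actual SortedSet<T> source) of how duplicate handling can quietly go quadratic, and what the straightforward fix tends to look like:

```csharp
// Illustrative sketch only -- not the BCL implementation.
// Checking each incoming item against everything kept so far is O(N^2);
// sorting first and skipping adjacent duplicates is O(N log N).
using System.Collections.Generic;

static class DedupSketch
{
    // O(N^2): List<T>.Contains is a linear scan, run once per element.
    public static List<int> DedupQuadratic(IEnumerable<int> items)
    {
        var kept = new List<int>();
        foreach (var item in items)
            if (!kept.Contains(item))   // O(N) scan inside an O(N) loop
                kept.Add(item);
        return kept;
    }

    // O(N log N): sort, then one pass dropping adjacent duplicates.
    public static List<int> DedupSorted(IEnumerable<int> items)
    {
        var arr = new List<int>(items);
        arr.Sort();
        var kept = new List<int>();
        foreach (var item in arr)
            if (kept.Count == 0 || kept[^1] != item)
                kept.Add(item);
        return kept;
    }
}
```

Both versions produce the same result; the scaling difference only shows up once N gets large, which is exactly why it hides until "a specific scenario comes along."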