Is there an example of actually helpful and practical enabled-by-undefined-behavior optimization in C/C++? All I can remember are discussions of its pitfalls, like this.
> In the case of our example, the program actually compares such an “unobservable” bit pattern with a constant, so the compiler constant-folds the result to whatever it pleases. Because the value is allowed to be “unstable”, the compiler does not have to make a “consistent choice” for the two comparisons, which would make such optimizations much less applicable. So, one time we “look” at x the compiler can pretend it is at least 150, and then when we look at it again it is at most 120, even though x did not change.
One classic helpful example that gets brought up is signed integer overflow and loops. Because signed overflow is undefined in C and C++ (unsigned overflow wraps and is well-defined), the compiler can assume the loop counter never wraps, so it doesn't need to account for overflow on each iteration.
If x <= 7, the answer will be 420. If x > 7, x will eventually overflow, and since x is signed and signed integer overflow is UB in C, the compiler is free to conclude that blazeit() is never called with x > 7, and therefore that blazeit() always returns 420. Besides, if the compiler instead chooses to implement signed overflow with wraparound, much like unsigned overflow, then x will eventually come back around to 7 anyway, so the answer must still be 420.
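The function under discussion isn't quoted in the thread; a plausible reconstruction consistent with the description (the exact form is a guess) is:

```c
/* Reconstruction of the discussed function (inferred, not quoted):
   for x <= 7 the tail recursion counts up to 7 and returns 420.
   For x > 7, x would have to pass INT_MAX to ever equal 7 again,
   and that signed overflow is UB -- so the compiler may assume it
   never happens and fold the whole function to a constant 420. */
int blazeit(int x) {
    if (x == 7)
        return 420;
    return blazeit(x + 1);  /* tail call; overflows if x starts > 7 */
}
```

Calling it with x > 7 is the UB case; for x <= 7 it terminates normally, e.g. `blazeit(0)` returns 420.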
I expect the compiler would optimize the tail recursion into a loop, so the likely worst case is not that you blow the stack but that you spin for a while.
The loop (and function) terminates when x == 7, and the compiler can show that x will be 7 at some point, i.e. it knows it's compiling for a system where integers wrap on overflow.