I guess I should rephrase: hardware debouncing saves all the interrupts except the one real edge that survives the ~40ms settling delay. For every button press without hardware debouncing, you can get tens to hundreds of 1-to-0 and 0-to-1 transitions on the microcontroller pin. This is easily verified on an oscope, even with "good" $50+ Honeywell limit switches. Every single one of those transitions triggers an interrupt and robs CPU cycles from other things the microcontroller is doing. The interrupt code also gets more complex, because now it has to do flag checks and timer comparisons every time it fires instead of just doing the action the button is supposed to trigger. None of this is to say one way is the "right" or "wrong" way to do it, but putting the debouncing complexity into hardware specifically designed to handle it, and keeping the firmware focused on the problem I am actually trying to solve, is my personal preferred way of doing it.