I'm a big fan of debouncing in hardware with the MAX3218 chip. It debounces by waiting 40ms for the signal to "settle" before passing it on, which saves your microprocessor's interrupts for other things. It also works with 12V or 24V inputs and happily outputs 3.3V or 5V logic to the microprocessor. It is pricey, though, at $6-10 each.
That chip is more expensive than a dedicated microcontroller that polls all of its GPIOs, performs software debouncing continuously, and sends an interrupt on any change.
Its price is its biggest drawback, but it also replaces any electronics needed to run the switches at 12V or 24V, which gets you above the noise floor if you are operating next to something noisy like a VFD. From the 6818 data sheet: "Robust switch inputs handle ±25V levels and are ±15kV ESD-protected" [1]
My thought is: this introduces latency that is not required (and 40ms could be a lot, IMO, depending on the use). It's not required because you don't need to delay the first high/low edge; you only need to block the subsequent ones during the bounce period, so there is no reason to add latency to the initial push.
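Roughly what I have in mind, as a sketch only (millis(), handle_button_press(), and the interrupt wiring are placeholders for whatever the MCU's HAL actually provides): act on the first edge immediately, then treat any edge arriving inside the bounce window as bounce.

```c
#include <stdint.h>

#define BOUNCE_WINDOW_MS 40u

/* Placeholders: a free-running millisecond tick and the real button action. */
uint32_t millis(void);
void handle_button_press(void);

static volatile uint32_t last_edge_ms;

/* Edge-triggered pin interrupt: the first edge acts with no added latency;
 * edges landing inside the bounce window are ignored as bounce. */
void button_isr(void)
{
    uint32_t now = millis();

    if ((uint32_t)(now - last_edge_ms) >= BOUNCE_WINDOW_MS)
        handle_button_press();

    last_edge_ms = now;  /* every edge, real or bounce, restarts the window */
}
```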
Also (again, this depends on the use), there is a good chance you're handling button pushes with interrupts regardless of debouncing.
I guess I should rephrase: it saves all the interrupts except the one triggered after the 40ms delay. For every button press without hardware debouncing, you can get tens to hundreds of 1-to-0 and 0-to-1 transitions on the microcontroller pin. This is easily verified on an oscilloscope, even with "good" $50+ Honeywell limit switches. Every single one of those transitions triggers an interrupt and robs CPU cycles from other things the microprocessor is doing. The code in the interrupt gets more complex because now it has to do flag checks and use timers (bit bashing) every time it is triggered, instead of just doing the action the button is supposed to trigger. None of this is to say one way is the "right" or "wrong" way to do it, but putting the debouncing complexity into hardware specifically designed to handle it, and focusing the firmware on the problem I am actually trying to solve, is my personal preferred way of doing it.
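For contrast, a sketch of what the ISR collapses to once the debouncing lives in hardware (handle_button_press() is again a placeholder):

```c
/* With external debouncing, one press produces one clean edge,
 * so the ISR only has to do the work the button exists for. */
void button_isr(void)
{
    handle_button_press();
}
```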
That seems like real overkill - it's a full-blown RS-232 receiver _and_ transmitter, including two DC-DC converters (with inductor and capacitor) that you don't even use... Also, the "R_IN absolute max voltage" is ±25V, so I really would not use this in a 24V system.
If you want a slow and reliable input for industrial automation, it seems much safer to make one yourself - an input resistor, a hefty diode/zener, a voltage divider, maybe a Schmitt trigger/debouncer made from an op-amp if you want to get really fancy.
That's a neat chip, especially the MAX6816/MAX6817 versions in the SOT23 package!
but yeah, very expensive for what it does. If my MCU were really short on interrupts, I'd go with an I2C bus expander with 5V-tolerant inputs and an INT output - sure, it needs explicit protection for 24V operation, but it also only needs 3 pins and 1 interrupt.
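A rough sketch of that arrangement, assuming an MCP23008-style register map; i2c_read_reg() and handle_switch_change() are placeholder names, not a specific vendor API:

```c
#include <stdbool.h>
#include <stdint.h>

/* Placeholder HAL/driver hooks. */
uint8_t i2c_read_reg(uint8_t dev_addr, uint8_t reg);
void handle_switch_change(int pin, int level);

#define EXPANDER_ADDR  0x20   /* common MCP23008-style 7-bit address           */
#define REG_INTF       0x07   /* which pin(s) caused the interrupt             */
#define REG_INTCAP     0x08   /* pin states at interrupt time; read clears INT */

static volatile bool expander_pending;

/* The one MCU interrupt: the expander's INT line. Keep the ISR short and
 * do the (blocking) I2C transaction in the main loop instead. */
void expander_int_isr(void)
{
    expander_pending = true;
}

void poll_expander(void)
{
    if (!expander_pending)
        return;
    expander_pending = false;

    uint8_t changed = i2c_read_reg(EXPANDER_ADDR, REG_INTF);
    uint8_t states  = i2c_read_reg(EXPANDER_ADDR, REG_INTCAP);

    for (int pin = 0; pin < 8; pin++)
        if (changed & (1u << pin))
            handle_switch_change(pin, (states >> pin) & 1u);
}
```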