Lua, for example, is configured by default to use only doubles. It can be configured to use single-precision floats, or even integers - but only one numeric type.
Standard JavaScript uses doubles everywhere. They get cast to 32-bit signed integers for bitwise operations, and the results are cast back. It's a little odd, but consistent at least.
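You can see that round-trip in plain JavaScript - values outside the int32 range wrap, and fractional parts are truncated:

```javascript
// Bitwise operators coerce their operands to 32-bit signed integers,
// then convert the result back to a double.
const big = 2 ** 31;            // 2147483648, representable as a double
console.log(big | 0);           // -2147483648: wrapped into int32 range
console.log(1.9 | 0);           // 1: fractional part truncated
console.log((2 ** 32 + 5) | 0); // 5: bits above the low 32 are dropped
```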
Ah. I don't know the inner workings of JavaScript, so I'm not surprised I was wrong. At a glance, that problem screamed floating-point limitation to me.