Yes, an interpreter has a much better shot at detecting runtime undefined behaviour than a compiler.
The whole point of undefined behaviour in C and C++ is to let the compiler cheat. A Java or Haskell compiler, for example, has to account for the fact that (i < i + 1) can be false for native ints (when i + 1 overflows), and would have to prove that overflow cannot happen before folding the comparison to true. Undefined behaviour in the standard frees C and C++ compilers from that obligation: they can simply assume that signed integer overflow never happens.
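A minimal sketch of that comparison (the function name is made up, and the exact behaviour depends on the compiler and flags): an optimizing C compiler may fold `i < i + 1` to true because overflow "can't happen", while an interpreter, or a build with UBSan's signed-overflow check, evaluates the expression for the actual value of i and can report the overflow at runtime.

```c
#include <limits.h>
#include <stdio.h>

/* Signed overflow is undefined behaviour, so an optimizing compiler is
 * allowed to fold this comparison to "always true". A build with
 * -fsanitize=signed-integer-overflow (or an interpreter tracking values)
 * instead evaluates i + 1 and can flag the overflow when it occurs. */
int always_less(int i) {
    return i < i + 1;   /* UB when i == INT_MAX */
}

int main(void) {
    printf("%d\n", always_less(42));       /* 1, as expected */
    printf("%d\n", always_less(INT_MAX));  /* UB: optimized builds often still
                                              print 1; a checking run reports
                                              the signed overflow instead */
    return 0;
}
```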
These shortcuts (plus a lot of smarts) made it feasible to write a fast optimizing compiler even with the 1970s state of the art in static analysis.