One example: Ruby (the original interpreter, written in C).
OK, I should have said that, and have now modified my original statement to read, "Interpreters don't necessarily do GC".
In general, an interpreter can add additional or different semantics.
Yes, and that falls under "something fancy"! The naive way of interpreting object-language application by meta-language application transfers the latter's calling discipline to the former.
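To make the "calling discipline transfers" point concrete, here is a minimal sketch (hypothetical mini-language, all names invented for illustration) of a naive interpreter where object-language application is implemented directly by meta-language (Python) application. Because every object-level call consumes a Python stack frame, Python's recursion limit and lack of tail-call elimination carry over to the interpreted language, even for calls that are syntactically tail calls in the object language:

```python
def evaluate(expr, env):
    """Naive tree-walking interpreter for a tiny expression language."""
    op = expr[0]
    if op == "lit":
        return expr[1]
    if op == "var":
        return env[expr[1]]
    if op == "sub":
        return evaluate(expr[1], env) - evaluate(expr[2], env)
    if op == "if":
        return evaluate(expr[2] if evaluate(expr[1], env) else expr[3], env)
    if op == "call":
        param, body = env[expr[1]]
        arg = evaluate(expr[2], env)
        # Object-language application via meta-language application:
        # this Python call frame is only popped when the object-level
        # call returns, so even object-level tail calls grow the stack.
        return evaluate(body, {**env, param: arg})

# countdown(n) = if n then countdown(n - 1) else 0  -- a tail-recursive loop
countdown = ("n",
             ("if", ("var", "n"),
              ("call", "countdown", ("sub", ("var", "n"), ("lit", 1))),
              ("lit", 0)))
env = {"countdown": countdown}

evaluate(("call", "countdown", ("lit", 5)), env)       # fine
# evaluate(("call", "countdown", ("lit", 100000)), env) # RecursionError
```

A "fancy" interpreter would avoid this by, e.g., running a trampoline or managing its own explicit stack instead of piggybacking on the host's.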
Of course, but no one claimed that interpreters necessarily do GC either, so this statement is not very useful. You keep arguing against claims that aren't there. The only claim ever made in response to OP was that precise GC requires runtime information, and that this can sometimes be easier when that information exists anyway (as in JITs and real-world interpreters), but it can also be implemented when the compiler produces that information AOT. I hope that's clear enough :-)
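For what "precise GC requires runtime information" means concretely, here is a minimal sketch (hypothetical, all names invented) of a mark-and-sweep pass over a toy heap. The per-object layout info recording which slots are references is exactly the runtime information in question; with it, the collector follows only real pointers and can reclaim an object whose address merely happens to appear as an integer field (which a conservative scan would have to keep alive):

```python
heap = {}  # address -> object

class Obj:
    def __init__(self, fields, ref_slots):
        self.fields = fields        # raw slot values (ints or addresses)
        self.ref_slots = ref_slots  # layout info: which slots hold references

def mark(roots):
    """Precise mark phase: follow only slots known to be references."""
    live, work = set(), list(roots)
    while work:
        addr = work.pop()
        if addr in live:
            continue
        live.add(addr)
        obj = heap[addr]
        for i in obj.ref_slots:
            work.append(obj.fields[i])
    return live

def sweep(live):
    """Free every heap object not reached during marking."""
    for addr in list(heap):
        if addr not in live:
            del heap[addr]

heap[1] = Obj([2, 3], ref_slots=[0])  # slot 0 refs object 2; slot 1 is the int 3
heap[2] = Obj([], ref_slots=[])
heap[3] = Obj([], ref_slots=[])       # unreachable: the "3" above is not a ref
sweep(mark({1}))                      # object 3 is reclaimed, 1 and 2 survive
```

In a JIT or interpreter this layout info tends to exist anyway (object headers, tagged values); an AOT compiler has to emit it deliberately, e.g. as stack maps and type descriptors.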