Do you happen to have any source/book on why you can't use anything but a conservative gc on C-like languages? I would really like to know why that's the case.
Basically, C semantics are to blame. Because of the way C was designed, and the liberties it grants its users, C code looks like Assembly from a tracing GC's point of view.
Meaning that without any kind of metadata, the GC has to treat every value on the stack or in the global memory segments as a possible pointer, but it can never be sure: the value might just be an integer that happens to look like a valid pointer into GC-allocated data.
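Here is a minimal sketch of that ambiguity (plain C, with `malloc` standing in for a GC allocation):

```c
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void) {
    /* A real pointer into heap memory... */
    int *p = malloc(sizeof *p);

    /* ...and a plain integer that happens to hold the same bit
       pattern. Both live on the stack as raw machine words. */
    uintptr_t n = (uintptr_t)p;

    /* A GC scanning this stack frame sees two identical words;
       no metadata says that `p` is a pointer and `n` is "just a
       number". A conservative collector must treat both as
       potential references and keep the allocation alive. */
    printf("p = %p, n = %#lx\n", (void *)p, (unsigned long)n);

    free(p);
    return 0;
}
```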
So any algorithm that needs certainty about exact data types, such as a moving/compacting collector that must relocate objects and precisely rewrite every pointer to them, is already off the table for C.
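To make that concrete, here is a hypothetical C fragment (implementation-defined but legal on typical platforms) that would defeat a precise or moving collector; the XOR disguise stands in for real-world tricks like pointer tagging or XOR-linked lists:

```c
#include <stdint.h>
#include <stdlib.h>

int main(void) {
    int *p = malloc(sizeof *p);

    /* Disguise the pointer as an integer and drop the only
       visible reference to the allocation. */
    uintptr_t disguised = (uintptr_t)p ^ 0xDEADu;
    p = NULL;

    /* A precise collector running here sees no pointer to the
       allocation: it might free it, or move it without being
       able to update `disguised`. */

    /* Undisguise and use the pointer again: fine as far as the
       compiler is concerned, fatal to a precise GC's assumptions. */
    int *q = (int *)(disguised ^ 0xDEADu);
    *q = 42;

    free(q);
    return 0;
}
```

Even conservative collectors like the Boehm GC document that disguised pointers of this kind are unsupported; they only promise to find references that still look like pointers somewhere in scanned memory.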