What absolutely blew me away was realizing that operators were able to guide the Lunokhods by watching a low-res, black-and-white video feed on a small CRT, with significant motion-feedback latency on top of that.
The whole surface is flooded with sunlight, so it's hard to perceive much depth. The lunar surface at that angle looks very flat to my untrained eye.
Yet the rover covered a lot of distance with only a few (still serious) navigational mishaps, like when it almost slid down a slope into an unexpected crater.
The feedback latency is only ~2.5 seconds, and humans can adapt to that. I remember using dial-up to access a server via SSH; it occasionally had similar latency simply because of low bandwidth from phone-line issues. The longer I used it, the better it felt.
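For context, that ~2.5 s figure is roughly what the Earth-Moon light round trip alone gives you. A quick back-of-the-envelope check (my own arithmetic, using the average Earth-Moon distance, not a figure from the original post):

```python
# Back-of-the-envelope: round-trip signal delay between Earth and the Moon.
MOON_DISTANCE_KM = 384_400      # average Earth-Moon distance
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in vacuum

round_trip_s = 2 * MOON_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
print(f"Round-trip signal delay: {round_trip_s:.2f} s")  # ~2.56 s
```

Any video encoding or equipment delay on top of that only adds to what the operators had to compensate for.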