What is a likely way for that to work with hardware launched in 1996? If the plane was moving 50 mph toward or away from the satellite, that's about 7e-8 of c. If I remember correctly, the received frequency would be shifted by a factor of the square root of (1 - 7e-8)/(1 + 7e-8), or its inverse, depending on direction.
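To get a feel for the magnitudes involved, here's a quick sketch of that formula in Python. The 1.6 GHz carrier is an assumption on my part (roughly where satcom L-band uplinks sit); the point is just the order of magnitude of the shift.

```python
import math

C = 299_792_458.0   # speed of light, m/s
MPH_TO_MS = 0.44704

def doppler_factor(v_ms):
    """Relativistic Doppler factor for a source receding at v_ms (m/s)."""
    beta = v_ms / C
    return math.sqrt((1 - beta) / (1 + beta))

v = 50 * MPH_TO_MS        # plane receding at 50 mph
f0 = 1.6e9                # assumed L-band carrier, ~1.6 GHz (illustrative)
shift_hz = f0 * (doppler_factor(v) - 1)
print(f"beta  = {v / C:.3e}")       # ~7.5e-8
print(f"shift = {shift_hz:.1f} Hz") # ~ -120 Hz
```

So 50 mph of radial velocity works out to a shift on the order of 100 Hz on an L-band carrier, which is what would have to be measured, logged, and separated from transmitter drift.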
Would the satellite always be relaying that level of precision to the ground? If so, why would they have prioritized that over what would presumably have been a dramatic increase in capability had it been discarded instead? Could they have designed it to store that much precision in rotating logs, available on request? And furthermore, how is a relativistic Doppler shift of that magnitude distinguishable from the frequency drift naturally occurring in the plane's transmitter?
I suppose my point is that while the idea of a train whistle changing pitch as it passes is easy to understand, how they would measure this in practice sounds like magic to someone unfamiliar with the capabilities and equipment that geostationary satellites generally carry.