First, the time dilation experienced by a satellite is <i>tiny</i> on a human scale. It takes sophisticated algorithms and lots of processing power to measure time differences between a spot on earth and various satellites. That's how GPS works. Early GPS receivers were rack-mounted boxes before integration shrank them to a small number of chips.

Second, a satellite doesn't really need to know what time it is just to relay a message. It receives a message via one channel, and sends it back out on another.

There is a need for some relatively close agreement on frequency between the sender and receiver. Consider ancient RS-232 between two computers here on earth. At the common configuration of 8 data bits, no parity bit, and 1 stop bit, a 3% clock mismatch results in about a ¼ bit offset at the last data bit: the receiver syncs on the start bit edge and samples each bit at its center, so the last data bit is sampled 8½ bit times after that edge, and 8.5 × 3% ≈ 0.26 bit. The system will still work with that, assuming reasonably clean signals. Other protocols may require tighter agreement of frequency between sender and receiver, but they all have some tolerance for mismatch. The frequency skew due to relativistic effects on satellites orbiting Earth is tiny compared to that.

In practice, the much larger source of frequency mismatch between a satellite and a ground station is Doppler shift, due to the satellite having a motion component towards or away from the ground station. That's many times more significant than relativistic effects.
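To put the RS-232 numbers in concrete form, here's a minimal sketch, assuming the usual UART behavior of resynchronizing on the start-bit edge and sampling each bit at its nominal center. The 8N1 framing and the 3% figure come from the text above; the code itself is just illustrative:

```python
# Sketch: cumulative sampling error in an 8N1 UART frame due to clock mismatch.
# Assumes the receiver resyncs on the start-bit edge and samples each bit at
# its nominal center (typical UART behavior); numbers are illustrative.

def sampling_offset_bits(clock_mismatch, bit_index):
    """Offset (in bit periods) of the sample point for the given bit.

    bit_index counts from the start bit = 0; with 8N1 framing the last
    data bit is index 8 and the stop bit is index 9.
    """
    nominal_sample_time = bit_index + 0.5   # bit periods after the start edge
    return nominal_sample_time * clock_mismatch

if __name__ == "__main__":
    mismatch = 0.03  # 3% difference between sender and receiver clocks
    for name, idx in (("last data bit", 8), ("stop bit", 9)):
        off = sampling_offset_bits(mismatch, idx)
        print(f"{name}: sample point off by {off:.3f} bit periods")
    # last data bit: ~0.255 bit periods, i.e. about the ¼ bit quoted above
```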
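And as a rough sanity check on the Doppler-versus-relativity comparison, here's a sketch with assumed example numbers: a GPS-like circular orbit and an illustrative 1 km/s line-of-sight velocity. None of these figures are from the answer itself, and the formulas are first-order approximations:

```python
# Rough comparison of frequency skew sources for a satellite link.
# Orbit and velocity values below are assumed example numbers.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m
C = 2.998e8          # speed of light, m/s

def doppler_fraction(radial_velocity_m_s):
    """First-order Doppler shift as a fraction of the carrier frequency."""
    return radial_velocity_m_s / C

def relativistic_fraction(orbit_radius_m):
    """Net fractional clock-rate offset of a circular-orbit satellite vs. the
    ground: gravitational blueshift minus velocity time dilation (first order)."""
    v_orbit = (G * M_EARTH / orbit_radius_m) ** 0.5
    gravitational = G * M_EARTH / C**2 * (1 / R_EARTH - 1 / orbit_radius_m)
    velocity = -(v_orbit**2) / (2 * C**2)
    return gravitational + velocity

if __name__ == "__main__":
    # Assumed example: GPS-like orbit (~26,570 km radius) and a pass with
    # ~1 km/s of line-of-sight velocity toward the ground station.
    print(f"Doppler:      {doppler_fraction(1000):.2e}")          # ~3e-6
    print(f"Relativistic: {relativistic_fraction(26.57e6):.2e}")  # ~4e-10
```

Both results are dimensionless frequency offsets; with these assumptions the Doppler term comes out around 10⁻⁶ while the relativistic term is around 4×10⁻¹⁰, several thousand times smaller.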