this post was submitted on 06 Oct 2024
36 points (92.9% liked)
Watches
2042 readers
2 users here now
For watch enthusiasts to discuss everything related to watches and horology.
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
To determine how long the signal took to arrive, you would need to know the exact distance it traveled and its propagation speed.
To calculate the shortest distance the signal could travel, you would need to know the exact locations of the transmitter and the receiver. That, plus some trigonometry, gives you a minimum distance. That seems to be a good enough correction for things like GPS to work.
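For a ballpark of that minimum distance, here's a quick sketch using the standard great-circle (haversine) formula. The coordinates are my own rough assumptions: WWVB near Fort Collins, CO for the transmitter, and downtown Los Angeles for the receiver.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in km."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Rough coordinates: WWVB near Fort Collins, CO -> Los Angeles, CA
dist_km = haversine_km(40.68, -105.05, 34.05, -118.24)
min_delay_s = dist_km * 1000 / C
print(f"~{dist_km:.0f} km, minimum delay ~{min_delay_s * 1000:.1f} ms")
```

That comes out to roughly 1,400 km and a few milliseconds, which lines up with the napkin numbers later in this thread.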
If you really want to know the exact distance the signal traveled, it gets a lot trickier. Radio waves do not follow the curvature of the earth. To receive a signal beyond line of sight, it needs to bounce off the ionosphere and back down to the surface, possibly multiple times. That doesn't always work, and when it does, the reflection happens at altitudes that vary with multiple factors. Without access to a lot more information, you will never know exactly how far the signal traveled.
Then you need to know how fast the signal was moving over the course of its journey. Radio waves only move at the speed of light in a perfect vacuum. The chemical composition, and even the temperature, of the atmosphere affect how fast they move. And, of course, those factors will not remain constant from the transmitter to the receiver. So you would need to know the exact route the signal took, as above, and then know the details of the atmosphere at each point along that route. That would require access to even more extremely detailed information. And a lot of computing power to make use of it.
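To show how small the atmospheric slowdown actually is, here's a toy calculation. The refractive index of 1.0003 is a typical surface-level value I'm assuming for the whole path, which is exactly the kind of simplification the paragraph above warns about.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_s(path_m, refractive_index=1.000300):
    # the signal travels at c / n, so the delay stretches by a factor of n
    return path_m * refractive_index / C

path = 1.4e6  # ~1400 km in metres
vacuum = delay_s(path, 1.0)
air = delay_s(path)
print(f"vacuum {vacuum*1e3:.4f} ms, air {air*1e3:.4f} ms, "
      f"difference {(air - vacuum)*1e6:.2f} microseconds")
```

The correction is on the order of a microsecond, so for a wristwatch it's completely negligible next to the path-length uncertainty.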
I don't think there is currently any way you could get data about the signal bounce path. I am even more doubtful about getting detailed information about the composition and temperature of the atmosphere along the entire path.
There are probably other considerations that a physicist would bring into this, but I'm just a layman.
What is a sufficiently accurate estimate? That depends on what you need to do with it. There is no universal answer. The uncorrected time signal itself is good enough for nearly all purposes.
Having said all that, I'm actually with you on this. It would make me happy to have the delay-corrected time, even though the difference could not possibly matter to me. I don't need it. But it would be cool.
Yeah, from my napkin calculation of the distance from Colorado to Los Angeles, I get a time delay of about 0.0045 seconds using the speed of light. But let's say, for the sake of interference and curvature-of-the-earth stuff, we double that number; you'd get 0.009 seconds. I still wouldn't really notice that delay. So I wonder why I'm getting something like 0.25 seconds of delay, which is significantly larger. It makes me want to get several atomic clock devices to see if it's an issue with the watch or my environment. I doubt the watch is the issue though, since I trust Casio's QC, so it must be my setup.
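For what it's worth, you can sanity-check how absurd 0.25 s would be as pure propagation delay by converting it back into distance:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

observed_s = 0.25
equivalent_km = observed_s * C / 1000
print(f"0.25 s of pure propagation delay would mean ~{equivalent_km:,.0f} km of path")
# ~75,000 km, nearly twice around the Earth, so most of that 0.25 s
# must come from something other than the radio path (receiver
# processing is one guess, but that's speculation on my part)
```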
That is a much larger delay than I would expect too. I can see why you're curious.
I posted this elsewhere in this thread, but you don't actually need to know the distance if you do some clever stuff, like NTP does. Dr Julian explains it pretty well in this video.
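For anyone curious, the clever part of NTP boils down to four timestamps and two standard formulas; it assumes the delay is roughly symmetric in each direction, so the distance cancels out. Sketch below; the example numbers are made up.

```python
def ntp_offset_and_delay(t0, t1, t2, t3):
    """Standard NTP on-wire arithmetic:
    t0 = client send, t1 = server receive,
    t2 = server send,  t3 = client receive.
    Assumes the one-way delay is the same in each direction."""
    offset = ((t1 - t0) + (t2 - t3)) / 2   # how far off the client clock is
    delay = (t3 - t0) - (t2 - t1)          # total round-trip time on the wire
    return offset, delay

# toy example: client clock is 0.100 s behind, each one-way trip takes 0.020 s
offset, delay = ntp_offset_and_delay(t0=1.000, t1=1.120, t2=1.125, t3=1.045)
print(offset, delay)  # -> 0.100 s offset, 0.040 s round-trip delay
```

Notice that neither formula needs the distance or the propagation speed; the round trip measures them for you.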