June 22nd 05, 01:11 AM
Michael

> It's many miles. Nanosecond accuracy is required for the current ~10
> meter accuracy. Linear extrapolation would lead to ~10 kilometers.


The speed of light is about 30 cm per nanosecond (a foot, give or take
a bit), so 10 meters of accuracy calls for a timing resolution of about
30 nsec. Not really a big deal these days (I design systems that must
resolve to better than 0.5 nsec). But the extrapolation to 10 km (which
is indeed how far light travels in 38 microseconds) doesn't work.
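Just to show the arithmetic - a quick back-of-the-envelope sketch in
Python, using nothing beyond the figures above:

    # speed of light and the rough numbers from above
    c = 299_792_458.0            # m/s

    # timing resolution needed to resolve 10 meters of range
    print(10.0 / c * 1e9)        # ~33 ns

    # distance light travels in 38 microseconds
    print(c * 38e-6 / 1e3)       # ~11.4 km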

GPS receivers don't rely on an internal, independent clock. They
synchronize to the satellites - and even that is a sloppy way of
putting it. The real issue is the difference between the travel times
of signals from different satellites, not the absolute travel time.
Thus what matters here (to a first approximation, anyway) is that the
satellites are synchronized to each other, not to any earthbound clock.

To a second approximation, it is important that the almanac be right.
In other words, the satellite needs to be where it is expected to be at
the time it transmits. But now the position error produced by a timing
error of some microseconds is much smaller - the key parameter is not
how far light travels in those microseconds (about 10 km, as you
noted), but how far the satellite itself travels in those microseconds
(more properly measured in centimeters than in kilometers).
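For rough numbers (a sketch only; the ~3.9 km/s orbital speed is my
round figure for a GPS satellite, not something from this thread):

    # what 38 microseconds means for light vs. for the satellite itself
    dt = 38e-6                   # seconds
    c = 299_792_458.0            # speed of light, m/s
    v_sat = 3.9e3                # rough GPS orbital speed, m/s (assumed)

    print(c * dt)                # ~11,400 m - light
    print(v_sat * dt * 100.0)    # ~15 cm    - the satellite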

Of course the error, if not corrected, is cumulative. After a few
weeks it would be quite significant.
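One way to put numbers on that, if you read "cumulative" as the 38
microseconds piling up day after day and acting through the satellite's
own motion (my interpretation, with the same assumed ~3.9 km/s orbital
speed):

    # accumulated satellite-position error if never corrected
    drift_per_day = 38e-6        # seconds of clock error per day
    v_sat = 3.9e3                # rough GPS orbital speed, m/s (assumed)
    days = 21                    # "a few weeks"

    print(drift_per_day * days * v_sat)   # ~3 m of along-track error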

Michael