[time-nuts] Explanation of how a GPS RX generates time/freq precision?
mpb45 at clanbaker.org
Sun Nov 22 21:51:19 EST 2015
I have a couple of friends who have asked how GPS-based
time & frequency units arrive at their level of precision.
One of them sent me some questions and comments (see below)
which show that he doesn't understand how GPS-disciplined
oscillators can deliver the precision they do.
My question is-- can anyone on the Time-Nuts list point me
to a source I can give them that explains how the process
works? I seem to recall papers describing the process, but
my searches have not turned up anything suitable that goes
into the details.
Any feedback or suggestions are much appreciated!!
Here is what one of the guys sent me:
A 2 GHz Intel processor can't measure 10^-12, let alone a
P4. RS-232 delay times are measured in milliseconds, not
10^-9 seconds. Yes, it could time-stamp PPS at 10^-9 if it had a
time source that accurate, but that has nothing to do with when
the pulse gets to the receiver or with the original time source
of the system.
Given that my phone has a 2+ GHz processor/clock and a GPS
receiver, what is the advantage of a 6 GHz (or 10 MHz) oscillator?
I could well believe such a system is accurate to 100 nSec (10^-7),
or that a Stratum 0 GPS system gets down close to 10^-8
(40 nSec). I don't see any way of getting down to 10^-12 with
that technology, or of transmitting a time more accurate than 10^-7.
If the original GPS time source is 10^-7 or a little better, how
can the result ever be more accurate than that?
What are we missing?
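For scale, the arithmetic behind these questions can be sketched in a
few lines. The 9600 baud rate, 10 ns PPS jitter, and averaging
intervals below are illustrative assumptions, not figures from any
particular receiver: the point is that a GPSDO does not resolve 10^-12
in one second, it averages the PPS-vs-oscillator phase error over long
intervals, so the fractional frequency uncertainty falls roughly as
jitter/tau.

```python
# Back-of-the-envelope numbers (illustrative assumptions, not
# measurements from any particular receiver).

# 1) RS-232 time-of-day messages really are millisecond-scale: one
#    10-bit character frame (start + 8 data + stop) at 9600 baud
#    takes about 1 ms. The serial message only labels the second.
char_time_s = 10 / 9600
print(f"one character at 9600 baud: {char_time_s * 1e3:.2f} ms")

# 2) The PPS edge, not the serial message, carries the precision.
#    A GPSDO compares its local 10 MHz oscillator against each PPS
#    edge and averages. If each one-second phase comparison carries
#    ~10 ns of jitter, the *frequency* uncertainty after averaging
#    over an interval tau is roughly jitter/tau:
pps_jitter_s = 10e-9  # assumed per-second timing noise
for tau_s in (1, 100, 10_000):
    frac_freq = pps_jitter_s / tau_s
    print(f"tau = {tau_s:>6} s -> fractional frequency ~ {frac_freq:.0e}")
```

On these assumed numbers, averaging for 10,000 seconds brings the
fractional frequency uncertainty to about 1e-12, even though no single
one-second measurement is anywhere near that good.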