[time-nuts] Characterising frequency standards

Tom Van Baak tvb at LeapSecond.com
Wed Apr 8 21:37:41 UTC 2009


> But how does that explain the output of Tom's adev1 program which
> still seems to give a good measurement at tau = 1s?

The first argument to the adev1 program is the sampling interval t0.
The program doesn't know how far apart the samples in the input file
were taken, so it is your job to specify this. The default is 1 second.

If you have data taken one second apart then t0 = 1.
If you have data taken two seconds apart then t0 = 2.
If you have data taken 60 seconds apart then t0 = 60, etc.
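If it helps, here is a rough Python sketch (not the adev1 source, just
an illustration of the standard overlapping Allan deviation estimator
from phase data) showing the only place t0 enters the math: the
observation time is tau = m * t0, where m is the averaging factor.

    def adev(x, t0, m):
        # x:  list of phase readings in seconds, taken t0 seconds apart
        # m:  averaging factor; the result is ADEV at tau = m * t0
        tau = m * t0
        n = len(x)
        s = 0.0
        for i in range(n - 2*m):
            d = x[i + 2*m] - 2.0*x[i + m] + x[i]   # second difference of phase
            s += d*d
        return (s / (2.0 * (n - 2*m) * tau*tau)) ** 0.5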

If, as in your case, you take raw one-second data and remove
every other sample (a perfectly valid thing to do), then t0 = 2.
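Using the sketch above (with x standing for your list of raw 1-second
phase readings), both of these estimate ADEV at the same tau = 2 s,
one from the full record and one from the decimated record:

    adev(x,      1.0, 2)   # full 1 s record, averaging factor 2
    adev(x[::2], 2.0, 1)   # every other sample removed, so t0 = 2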

Make sense now? It's still "continuous data" in the sense that all
measurements are a fixed interval apart. But in any ADEV
calculation you have to specify the raw data interval.

/tvb



