[time-nuts] Characterising frequency standards
sar10538 at gmail.com
Thu Apr 9 11:51:47 UTC 2009
2009/4/9 Bruce Griffiths <bruce.griffiths at xtra.co.nz>:
>> Doesn't that imply that the data point should correspond to the whole
>> sampling period and not just half of it?
> The total measurement time is only decreased by 1 sec at most if you
> delete every second line.
> The resampled data now has a sampling interval of 2 sec for the entire
> measurement time.
> The original data samples are phase differences measured on the second
> every second.
> The resampled data are phase differences measured every 2 seconds on the
> corresponding second transition.
OK, I'm ready to be shot down on this, but from what I can see right
now the 2 sec measurement period should be maintained to satisfy the
measurement of drift, which would otherwise be misinterpreted if I
processed 400000 sec of data as only 200000 sec. The noise on the data
can be broken down into two major groups: drift, and what I would
really call noise, i.e. phase noise, flicker, random walk, etc. I
guess I have been ignoring the whole drift component with my missing
data used for the ADEV plots. The point to me, though, is that even
with the reduced data an ADEV plot should be able to characterise
'noise' for the actual sampling duration of the data, i.e. 1 sec. What
would obviously be incorrect is the effect of drift, which should
logically show up as twice as great. Maybe my idea of using the 1 sec
sampling period would work out better with HDEV.
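A quick numerical sketch seems to bear out the "twice as great" intuition for drift. For pure linear frequency drift D, the phase is x(t) = D*t^2/2 and the ADEV contribution is D*tau/sqrt(2); the `adev` helper, the drift value and the record length below are my own illustrative assumptions:

```python
import numpy as np

def adev(x, tau0, m):
    # Overlapping Allan deviation from phase data x (seconds),
    # sample interval tau0 (s), averaging factor m.
    x = np.asarray(x, dtype=float)
    tau = m * tau0
    d2 = x[2 * m:] - 2.0 * x[m:-m] + x[:-2 * m]
    return np.sqrt(np.mean(d2 ** 2) / (2.0 * tau ** 2))

D = 1e-12                          # assumed drift, fractional freq. per s
t = np.arange(400001.0)            # 1 s samples
x = 0.5 * D * t ** 2               # phase due to drift alone

a1 = adev(x, tau0=1.0, m=1)        # = D/sqrt(2): true value at tau = 1 s

x2 = x[::2]                        # decimated record, really 2 s data
a2 = adev(x2, tau0=2.0, m=1)       # = 2*D/sqrt(2): correct labelling
a2_wrong = adev(x2, tau0=1.0, m=1) # mislabelled as 1 s data
```

With the decimated record mislabelled as 1 sec data, the drift term comes out exactly twice the correct tau = 2 sec value (`a2_wrong == 2 * a2`), whereas the white/flicker noise floors would not scale that way, which is why the 2 sec label matters for drift in particular.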
Steve Rooke - ZL3TUV & G8KVD & JAKDTTNW
Omnium finis imminet