[time-nuts] Limitations of Allan Variance applied to frequency divided signal?

Magnus Danielson magnus at rubidium.dyndns.org
Sat May 14 07:47:10 UTC 2011


On 05/13/2011 05:28 PM, Tijd Dingen wrote:
> In trying to put together a way to calculate Allan variance based on a series of timestamps of every Nth cycle, I ran into the following...
>
> Suppose you have an input signal, but it's a bit on the high side. So you use a prescaler to divide it down to a manageable frequency range. And now you want to use that signal to be able to say something useful about the original high frequency signal.
>
> Now taking a look at the part about "Non-overlapped variable tau estimators" in the wikipedia article here:
>
> http://en.wikipedia.org/wiki/Allan_variance#Non-overlapped_variable_.CF.84_estimators

Nice to see people actually read and use what I wrote.

> It seems to me that "divide by 4 and then measure all cycles back-to-back" is essentially the same as "measure all cycles of the undivided high frequency signal back-to-back" and decimate. Or "skipping past n − 1 samples" as the wiki article puts it. And that is disregarding /extra/ jitter due to the divider, purely for the sake of simplicity.

If you use a prescaler of, say, 1/64, then it takes 64 cycles of the 
original signal to produce one cycle at the counter core. These cycles 
are then time-stamped, i.e. a time value and an event-counter value are 
taken for each one. To transform the event-counter value into the 
properly scaled event value, you multiply the event counter by 64, since 
each counted event corresponds to 64 cycles of the input. The 
time-stamps do not have to be modified.

Notice that the pre-scaler is only used for higher frequencies.
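To make the book-keeping concrete, here is a small Python sketch of what 
I mean; the names and the nominal frequency are just illustrative, not 
any particular counter's firmware:

# Rescale event counts taken behind a 1/64 prescaler so they refer to
# cycles of the original, undivided input.  Only the counts are scaled;
# the time-stamps are left untouched.
PRESCALE = 64        # prescaler division ratio (example)
F0 = 100e6           # assumed nominal input frequency in Hz (example)

def rescale_events(event_counts, prescale=PRESCALE):
    """Each counted event corresponds to 'prescale' input cycles."""
    return [n * prescale for n in event_counts]

def time_error(timestamps, scaled_events, f0=F0):
    """Time error x_i = t_i - n_i/f0 of each time-stamp against an
    ideal clock at the nominal input frequency f0."""
    return [t - n / f0 for t, n in zip(timestamps, scaled_events)]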

> Plus, I strongly suspect that all these commercial counters that can
> handle 6 Ghz and such are not timestamping every single cycle
> back-to-back either. Especially the models that have a few versions
> in the series. One cheaper one that can handle 300 MHz for example,
> and a more expensive one that can handle 6 GHz. That reads like:
> "All models share the same basic data processing core and the same
> time interpolators. For the more expensive model we just slapped on
> a high bandwidth input + a prescaler."

You never time-stamp individual cycles anyway, so a pre-scaler doesn't 
make much of a difference. It does limit the granularity of the tau 
values you can use, but usually not in a significant way, since Allan 
variance is rarely used for taus shorter than 100 ms and well... the 
pre-scaled period is usually below 100 ns, so it isn't a big difference.
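For reference, a rough Python sketch of how such time-stamps could be 
turned into an Allan deviation at a quantized tau, using the 
second-difference estimator on the decimated time-error series (f0 and 
the variable names are just assumptions for the example):

import math

def adev_from_timestamps(timestamps, events, f0, m):
    """Allan deviation at tau = m time-stamp intervals.

    timestamps -- time-stamps t_i in seconds, one per pre-scaled edge
    events     -- event counts already rescaled to cycles of the input
    f0         -- assumed nominal input frequency in Hz
    m          -- decimation factor, so tau is quantized to multiples of
                  the pre-scaled sample spacing (the granularity limit)
    """
    # time error of each sample against an ideal clock at f0
    x = [t - n / f0 for t, n in zip(timestamps, events)]
    # "skip past m - 1 samples"
    x = x[::m]
    ts = timestamps[::m]
    if len(x) < 3:
        raise ValueError("need at least 3 decimated samples")
    tau = (ts[-1] - ts[0]) / (len(ts) - 1)   # actual mean spacing
    d2 = [x[i + 2] - 2 * x[i + 1] + x[i] for i in range(len(x) - 2)]
    avar = sum(v * v for v in d2) / (2.0 * len(d2) * tau * tau)
    return math.sqrt(avar)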

> Anyways, any drawbacks to calculating Allan Variance of a divided signal that I am overlooking here?

Nothing significant. It adds to the noise floor, but in practice the 
time-stamping and processing don't have big problems because of it.
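If you want to put a number on it: for white-PM-like jitter (time-stamp 
quantization plus divider jitter, assumed uncorrelated with the signal), 
the usual rule of thumb is that an rms time jitter sigma_x only 
contributes a floor of roughly sqrt(3)*sigma_x/tau, falling as 1/tau. A 
one-liner to play with (the numbers are made up):

import math

def adev_floor(sigma_x, tau):
    """Approximate ADEV floor from white-PM-like time-stamp/divider
    jitter of rms value sigma_x (seconds) at averaging time tau (s).
    The exact pre-factor depends on estimator and bandwidth."""
    return math.sqrt(3.0) * sigma_x / tau

# e.g. 100 ps of combined jitter at tau = 0.1 s:
print(adev_floor(100e-12, 0.1))   # about 1.7e-9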

Cheers,
Magnus


