[time-nuts] Question about frequency counter testing

Magnus Danielson magnus at rubidium.dyndns.org
Sat May 12 17:20:51 EDT 2018


Hi,

On 05/12/2018 08:38 PM, Oleg Skydan wrote:
> Hi!
> 
> From: "Magnus Danielson" <magnus at rubidium.dyndns.org>
>> ADEV assumes brick-wall filtering up to the Nyquist frequency as a
>> result of the sample rate. When you filter the data, as you do with a
>> Linear Regression / Least Squares estimation, the actual bandwidth will
>> be much less, so the ADEV measure will be biased for lower taus; for
>> higher taus less of the ADEV kernel is affected by the filter, and thus
>> the bias reduces.
> 
> Thanks for the clarification. Bob already pointed me to the problem,
> and after some reading the *DEV theme seems clearer.

The mistake is easy to make. Back in the day, it was a given that you
should always state the system bandwidth alongside an ADEV plot, a
practice that has since been lost. Many people do not know what
bandwidth they have, or the effect it has on the plot. I've even heard
distinguished and knowledgeable people in the field admit to doing it
incorrectly.
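
To make the bandwidth point concrete, here is a minimal sketch (mine,
not from the thread) of a plain overlapping ADEV computed from phase
samples in Python/numpy. The estimator assumes the data carries the
full Nyquist bandwidth; if the phase has already been low-pass filtered
(e.g. by least-squares averaging), the small-tau points come out biased
low relative to that assumption.

  import numpy as np

  def overlapping_adev(phase, tau0, m):
      """Overlapping Allan deviation from phase samples x[n] (seconds),
      spaced tau0 seconds apart, for averaging factor m (tau = m*tau0).
      Assumes brick-wall bandwidth up to Nyquist; extra low-pass
      filtering of x biases the small-tau results low."""
      x = np.asarray(phase, dtype=float)
      tau = m * tau0
      # second difference of phase over span m, all overlapping triplets
      d2 = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
      return np.sqrt(np.mean(d2**2) / (2.0 * tau**2))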

>>> Do the ADEV plots I got look reasonable for the "mid range" OCXOs
>>> used (see the second plot for the long run test)?
>>
>> You probably want to find the source of the wavy response seen in the
>> orange and red traces.
> 
> I have already found the problem. It is a HW problem related to poor
> isolation between the reference OCXO signal and the counter input
> signal clock line (it is also possible there are some grounding or
> power supply decoupling problems - the HW is built in "ugly
> construction" style). When the input clock frequency is very close
> (0.3..0.4 Hz difference) to the OCXO subharmonic this problem becomes
> visible (it is not the FW problem discussed before, because the counter
> reference is no longer a harmonic of the OCXO).

Makes sense. Cross-talk has been the performance limit of several
counters, and care should be taken to reduce it.

> It looks like some commercial counters suffer from that
> problem too. After I connected the OCXO and input feed lines with short
> pieces of coax this effect decreased greatly, but did not disappear.

Cross-talk exists for sure, but there is also a similar effect which is
not due to cross-talk but to how the counter is able to interpolate
certain frequencies.

> The "large N" plots were measured with the input signal 1.4Hz (0.3ppm)
> higher then 1/2 subharmonicĀ  of the OCXO frequency, with such frequency
> difference that problem completely disappears. I will check for this
> problem again when I will move the HW to the normal PCB.

Yes.

>> In fact, you can do an Omega-style counter you can use for PDEV; you
>> just need the right approach to be able to decimate the data. Oh,
>> there's a draft paper on that:
>>
>> https://arxiv.org/abs/1604.01004
> 
> Thanks for the document. It will take some time to study, and maybe I
> will add features to the counter to calculate correct PDEV.

It suggests a very practical method for FPGA-based counters, so that you
can make use of the high sample rate that you have and reduce it in HW
before handing off to SW. As you decimate the data, you do not want to
lose the Least Squares property, and this is a practical method of
achieving that.
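
As a rough illustration of what "Omega-style" means (my own sketch, not
the staged HW method of the paper): the Omega estimate is simply the
slope of a least-squares straight-line fit to the phase samples of each
block, which is also the quantity PDEV is built on.

  import numpy as np

  def omega_frequency_estimates(phase, tau0, n_per_block):
      """Least-squares (Omega-style) frequency estimate per block.
      phase: samples x[n] in seconds, spaced tau0 seconds apart.
      Returns one fractional-frequency estimate (the LS slope of the
      phase) per full block of n_per_block samples."""
      x = np.asarray(phase, dtype=float)
      n_blocks = len(x) // n_per_block
      t = np.arange(n_per_block) * tau0
      t_c = t - t.mean()                 # centred time axis
      denom = np.sum(t_c**2)
      y = np.empty(n_blocks)
      for b in range(n_blocks):
          blk = x[b*n_per_block:(b + 1)*n_per_block]
          y[b] = np.sum(t_c * (blk - blk.mean())) / denom
      return y

The point of the paper is that this estimate can be formed in stages
(decimated) in HW without losing the least-squares property; the naive
single-stage version above is only meant to show what is being computed.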

>>> If ADEV is needed, the averaging
>>> interval can be reduced and several measurements (more than eight) can
>>> be combined into one point (creating the new weighting function which
>>> resembles the usual Pi one, as shown in the [1] p.54), it should be
>>> possible to calculate usual ADEV using such data. As far as I
>>> understand, the filter which is formed by the resulting weighting
>>> function will have wider bandwidth, so the impact on ADEV will be
>>> smaller and it can be computed correctly. Am I missing something?
>>
>> Well, you can reduce the averaging interval to 1 and then compute the
>> ADEV, but it does not behave as the MDEV any longer.
> 
> With no averaging it will be a simple reciprocal counter with a time
> resolution of only 2.5 ns. The idea was to use trapezoidal weighting, so
> the counter becomes somewhere "between" the Pi and Delta counters. When
> the upper base of the weighting-function trapezium has zero length
> (triangular weighting) it is the usual Delta counter; if it is infinitely
> long the result should converge to the usual Pi counter. Prof. Rubiola
> claims that if the ratio of the upper to the lower base is more than 8/9,
> ADEV plots made from such data should be sufficiently close to the usual
> ADEV. Of course the gain from the averaging will be at least 3 times less
> than from the usual Delta averaging.

You do not want to mix pre-filtering and ADEV that way. We can do things
better.
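
For reference, here is a minimal sketch (mine, following Oleg's
description above) of the trapezoidal weighting applied to a block of
fractional-frequency samples: with a zero-length upper base it reduces
to the triangular (Delta-like) window, and as the flat top grows toward
the full block length it approaches the flat (Pi-like) window.

  import numpy as np

  def trapezoid_weights(n_total, n_flat):
      """Trapezoidal weights over n_total samples with a flat top of
      n_flat samples: n_flat == 0 gives a triangular (Delta-like)
      window, n_flat == n_total gives the flat (Pi-like) window."""
      n_ramp = (n_total - n_flat) // 2
      ramp = np.arange(1, n_ramp + 1) / (n_ramp + 1.0)  # rising edge
      w = np.concatenate([ramp, np.ones(n_total - 2*n_ramp), ramp[::-1]])
      return w / w.sum()                                # unit DC gain

  # applied to one block of fractional-frequency samples y:
  #   y_est = np.dot(trapezoid_weights(len(y), n_flat), y)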

> Maybe I need to find or make a "not so good" signal source, measure its
> ADEV using the above method and compare with the traditional one. It
> should be an interesting experiment.

It is always good to experiment and learn from not-so-stable stuff,
stuff with significant drift, and very stable stuff.

>> What you can do is that you can calculate MDEV or PDEV, and then apply
>> the suitable bias function to convert the level to that of ADEV.
> 
> That can be done if the statistics are calculated inside the counter,
> but it will not make the exported data suitable for post-processing with
> TimeLab or other software that is not aware of what is going on inside
> the counter.

Exactly. You need to continue the processing in the counter for the
post-processing to produce good, unbiased values. There are many ways to
mess it up.

>> Yes, they give relatively close values of deviation, where PDEV goes
>> somewhat lower, indicating that there is a slight advantage of the LR/LS
>> frequency estimation measure over that of the Delta counter, as given by
>> its MDEV.
> 
> Here is another question - how do I correctly calculate the averaging
> length of a Delta counter? I have 5e6 timestamps in one second, so the Pi
> and Omega counters process 5e6 samples in total and one measurement also
> has 5e6 samples, but the Delta one processes 10e6 in total, with each of
> the averaged measurements having 5e6 samples. The Delta counter actually
> uses two times more data. What should be equal when comparing different
> counter types - the number of samples in one measurement (gating time) or
> the total number of samples processed?

How do you get such different event rates?

If you have 5 MHz, the rising edges give you 5E6 events per second, and
whichever type of processing you do, Pi (none), Delta or Omega, is just a
different type of post-processing on the same raw phase data-set.
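
To put that in code (my sketch, assuming a vector of raw phase samples
x[n] in seconds, tau0 apart): Pi, Delta and Omega are just three ways of
post-processing the same phase data into a frequency estimate, and the
event rate is the same 5E6 per second in all three cases. The Delta
estimate below simply reuses the phase samples in two adjacent averaging
windows, which is why it appears to "consume" twice the data for one gate.

  import numpy as np

  def pi_delta_omega(x, tau0, m):
      """Three frequency estimates over one gate of m*tau0 seconds from
      the same raw phase samples x[n] (seconds, tau0 apart).
      Needs len(x) >= 2*m so the Delta estimate has two full windows."""
      x = np.asarray(x, dtype=float)
      tau = m * tau0

      # Pi: end point minus start point, nothing in between is used
      y_pi = (x[m] - x[0]) / tau

      # Delta: difference of two adjacent m-sample phase averages
      y_delta = (np.mean(x[m:2*m]) - np.mean(x[:m])) / tau

      # Omega: least-squares slope through the m+1 samples of the gate
      t = np.arange(m + 1) * tau0
      t_c = t - t.mean()
      y_omega = np.sum(t_c * (x[:m+1] - x[:m+1].mean())) / np.sum(t_c**2)

      return y_pi, y_delta, y_omega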

Cheers,
Magnus

