[time-nuts] Stanford Research SR620 Measurement Bias

Magnus Danielson magnus at rubidium.dyndns.org
Sun Sep 27 21:55:55 UTC 2009


Stan,

Stan wrote:
> I just acquired an SR620 that looks to be in excellent condition. I've
> noticed something in its operation and I'm not sure if it represents a
> problem, a calibration issue, or if it's just a "feature."
> 
> Yesterday, after letting the counter warm up for an hour, I calibrated the
> 10 MHz timebase and did a couple of Auto-Cals (just to be sure), but even
> after all of that, there's still a bias of somewhere between 40 and 70
> counts in the frequency readout, independent of the gate time. For example,
> if I measure the 5 MHz output of my 5065A using a sample size of 1,
> here's what I see (these are representative readings; the values are bouncing
> up and down by 10 or 20 counts and I'm "visually averaging" them).
> Increasing the sample size has no effect other than to reduce the
> variability of the measurements, but the bias is still the same magnitude:
> 
>  
> 
> 1 sec Gate:     5,000,000.00054 Hz
> 
> 0.1 sec Gate:   5,000,000.0060 Hz
> 
> 0.01 sec Gate:  5,000,000.055 Hz
> 
> 1 Period Gate:  5,004,7__ Hz

Notice how the bias shifts up one decade as the gate time shifts down a 
decade. Consider that this is a reciprocal counter, which measures the 
number of input events and the amount of elapsed time (in its time-base):

freq = Events / Time

As we step Time down a decade, the error increases by a decade, but an 
offset in the number of Events would produce a much larger error (a 
single-count offset gives 1/tau, i.e. 1 Hz, 10 Hz and 100 Hz 
respectively), so we must look at the time side of the division.
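
As a quick sanity check, here is a rough Python sketch using the readings 
quoted above, comparing the observed bias with what even a one-count 
offset in Events would do:

# Observed bias vs. what a one-count offset in Events would give (1/tau).
f_in = 5e6  # Hz, the 5065A output

readings = {          # gate time (s) : measured frequency (Hz)
    1.0:  5_000_000.00054,
    0.1:  5_000_000.0060,
    0.01: 5_000_000.055,
}

for tau, f_meas in readings.items():
    print(f"tau = {tau:4} s: observed bias {f_meas - f_in:+.5f} Hz, "
          f"one-count Events offset would give {1.0 / tau:+.1f} Hz")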

To facilitate this, I make up estimates for the Event counter and Gate 
Counter and run the calculation backwards:

freq = Events / Time = Events / (GateCount / 90 MHz - Start + Stop)

Events = freq * Time

Time = Events / freq

tau                    Events  Gate Count   -Start + Stop
1 s                 5,000,000  90,000,000    -107.999 ps
100 ms                500,000   9,000,000    -119.999 ps
10 ms                  50,000     900,000    -109.999 ps
200 ns (1 period)           1          18    -199.999 ps

This is fairly consistent. Notice that the 1 cycle path is different 
from the internally gated path.
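
For reference, here is a rough Python sketch of the backward calculation 
behind that table (assuming the 90 MHz coarse clock and exact event 
counts; the one-period case is left out since its last digits are 
unknown):

# Back out the time-side residual (-Start + Stop) from each reading,
# assuming Time = GateCount / 90 MHz - Start + Stop and an exact Events count.
F_CLK = 90e6   # Hz, SR620 coarse counter clock

cases = [   # (Events, GateCount, measured frequency in Hz)
    (5_000_000, 90_000_000, 5_000_000.00054),
    (  500_000,  9_000_000, 5_000_000.0060),
    (   50_000,    900_000, 5_000_000.055),
]

for events, gate_count, f_meas in cases:
    time_actual = events / f_meas        # Time the counter must have measured
    time_coarse = gate_count / F_CLK     # coarse-clock part of the gate
    residual_ps = (time_actual - time_coarse) * 1e12
    print(f"Events = {events:9,d}: -Start + Stop = {residual_ps:+9.3f} ps")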

However, the cause of this time error needs some more thought.

If it were an inconsistency between the start and stop interpolators, 
the error would shift around as the measured signal beats against the 
internal 10 MHz oscillator (which is multiplied up to the 90 MHz coarse 
counter clock).

However, a difference between the start and stop trigger voltages would 
produce a stable offset. Since the start and stop input selection is the 
same in the SR620 when doing frequency measurements, this flaw is 
canceled out.

The SR620 uses a small dedicated circuit to process the frequency 
measurements and produce the start and stop signals sent to the 
measurement channel muxes. After this mux, the signals go to the event 
and gate counters and also to the interpolator logic. Any systematic 
time offset due to propagation-delay differences between the start and 
stop paths of that circuit will introduce a time bias into frequency 
measurements.
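
As a rough check (again just a sketch, assuming a fixed shortfall of 
about 110 ps on the time side, taken from the residuals above), such a 
bias maps onto the readout as delta_f ~ f_in * t_short / tau:

f_in    = 5e6        # Hz, input frequency
t_short = 110e-12    # s, assumed fixed shortfall in the measured gate time

for tau in (1.0, 0.1, 0.01):
    print(f"tau = {tau:4} s  ->  expected bias ~ {f_in * t_short / tau:+.5f} Hz")

which lands close to the 0.00054 Hz, 0.0060 Hz and 0.055 Hz you reported.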

Looking at the overview of calibration bytes, byte 50 looks like the 
byte of interest. However, byte 50 is claimed to be among those trimmed 
by AutoCal, which makes the statement on page 78 that frequency does not 
need calibration a bit peculiar; it should say that it does not need 
manual calibration. It may be that you could tweak this value and see 
how it changes your readings.

> I've had several HP 5370A/B counters, but never one of these. As I
> recall (and it's been a while) there's an adjustment in the 5370A/B for the
> interpolators that can remove this sort of bias. Is there a similar
> adjustment in the SR620? I've been through the manual and don't see anything
> obvious.

It is hidden in the details. With the brief analysis above, I think you 
should be able to come to the same conclusion.

> Thanks for any advice!

Hope I got you onto the right track.

Cheers,
Magnus
