[time-nuts] Stanford Research SR620 Measurement Bias

Stan swperk at earthlink.net
Sun Sep 27 16:15:02 UTC 2009

I just acquired an SR620 that looks to be in excellent condition. I've
noticed something in its operation and I'm not sure if it represents a
problem, a calibration issue, or if it's just a "feature."


Yesterday, after letting the counter warm up for an hour, I calibrated the
10 MHz timebase and did a couple of Auto-Cals (just to be sure), but even
after all of that, there's still a bias of somewhere between 40 and 70
counts in the frequency readout, independent of the gate time. For example,
if I measure the 5 MHz output of my 5065A using a sample size of 1, here's
what I see (these are representative readings; the values are bouncing up and
down by 10 or 20 counts and I'm "visually averaging" them).
Increasing the sample size has no effect other than to reduce the
variability of the measurements, but the bias is still the same magnitude:


1 sec Gate:     5,000,000.00054 Hz

0.1 sec Gate:   5,000,000.0060 Hz

0.01 sec Gate:  5,000,000.055 Hz

1 Period Gate:  5,004,7__ Hz
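
Not from the SR620 manual, just my own arithmetic on the readings above: if the bias were caused by a fixed timing offset dt somewhere in the start/stop path, the frequency error would be df = f * dt / T_gate, so df should grow as the gate shrinks. A quick sketch to check whether the three gate settings imply a consistent dt:

```python
# Sanity check on the readings above: a constant timing offset dt in the
# measurement produces a frequency error df = f0 * dt / T_gate, i.e. an
# error that scales as 1/gate time. Solve each reading for dt and compare.
f0 = 5e6  # nominal input frequency from the 5065A, Hz

readings = {        # gate time (s) -> displayed frequency (Hz)
    1.0:  5_000_000.00054,
    0.1:  5_000_000.0060,
    0.01: 5_000_000.055,
}

for gate, f in readings.items():
    df = f - f0                 # frequency bias, Hz
    dt = df / f0 * gate         # implied fixed timing offset, seconds
    print(f"gate {gate:>5} s: df = {df:+.5f} Hz -> dt ~ {dt * 1e12:.0f} ps")
```

All three gates work out to roughly the same 110-120 ps offset, which is what you'd expect from an interpolator zero error rather than a timebase frequency error (a timebase error would give the same fractional bias at every gate time).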


I've had several HP 5370A/B counters, but never one of these. As I recall
(and it's been a while), there's an adjustment in the 5370A/B for the
interpolators that can remove this sort of bias. Is there a similar
adjustment in the SR620? I've been through the manual and don't see anything.


Thanks for any advice!


