[time-nuts] HP 53132A time interval measurement quantized?
stewart.cobb at gmail.com
Tue May 28 02:44:27 EDT 2013
My HP 53132A counter works fine as a frequency counter. When I try to use
it to measure time intervals (say, between two PPS pulses), the
measurements are quantized in steps of 100ns. For example, if the two PPS
are nearly aligned, the measurements will be either 0.0980us or -0.0020us.
If the PPS are drifting with respect to each other, there will be a long
string of (say) 0.100 000 0980 sec, and then another long string of 0.100
000 1980 sec, with no intermediate values. (I can send a TimeLab .TIM file
showing this behavior to anyone interested.) This quantization appears on
measurements using both front-panel inputs (1 and 2) and on measurements
using only input 1 and the "COMMON: ON" setting enabled.
According to the manual, the time-interval measurement resolution spec is
150ps. The quantization effect on my counter is nearly three orders of
magnitude coarser than that. That made me wonder whether there was a problem
with the counter.
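The arithmetic behind that comparison can be checked in a couple of lines (the
10 MHz timebase and 150 ps spec are from the post; the rest is just
illustration):

```python
# Illustrative arithmetic only: compare the observed quantization step
# to the counter's specified time-interval resolution.
timebase_hz = 10e6            # internal reference frequency
step_s = 1.0 / timebase_hz    # observed quantization step: 100 ns
spec_s = 150e-12              # specified TI resolution: 150 ps

ratio = step_s / spec_s
print(f"step = {step_s * 1e9:.0f} ns, ratio to spec = {ratio:.0f}x")
# step = 100 ns, ratio to spec = 667x (about 10^2.8, i.e. almost 3 decades)
```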
The 100ns quantization is the reciprocal of the 10 MHz timebase frequency,
which seems like a clue. One might suspect that the internal interpolators
were not working correctly. However, the counter passes all its self-tests
including the specific interpolator self-test, and it seems to interpolate
correctly in frequency mode. I ran the internal time-interval calibration
procedure "CAL: TI QUICK" with no errors, using a fast CMOS logic output
for the required square-wave input. (Before my "QUICK" cal, the
measurements were reported as 0.0965us and -0.0035us, so the cal procedure
did actually change something, but it didn't fix the quantization
problem.) I can't run the "CAL: TI FULL" procedure because it requires a
specific HP calibration fixture. The time-interval measurement procedure in
the operator's manual doesn't seem to contain any
tricks. Making TI measurements with and without an external timebase
connected produces the same quantized results.
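For what it's worth, the symptom matches what a reciprocal counter would report
if start/stop interpolation contributed nothing: every reading snaps to a whole
number of 100 ns clock periods. This toy model (my own sketch, not the
53132A's actual firmware or architecture) reproduces that staircase:

```python
import math
import random

# Toy model of a counter that timestamps start/stop events only at the
# next 10 MHz clock edge, with no interpolation of the fractional period.
CLK = 100e-9  # one timebase period, 100 ns

def ti_no_interp(true_interval, clock_phase):
    # Each event is quantized to the following clock edge.
    start_edge = math.ceil(clock_phase / CLK) * CLK
    stop_edge = math.ceil((clock_phase + true_interval) / CLK) * CLK
    return stop_edge - start_edge

random.seed(0)
# A true 150 ns interval, measured at random phases of the timebase:
readings = {ti_no_interp(150e-9, random.uniform(0, CLK)) for _ in range(1000)}
print(sorted(round(r * 1e9) for r in readings))  # [100, 200] -- only 100 ns steps
```

With interpolation working, the readings would instead cluster tightly around
150 ns; seeing only the two grid values is the signature of the interpolator
result never reaching the displayed answer.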
I'm trying to figure out whether this quantization effect is expected
behavior, operator error, or a problem with the counter. Any thoughts?