[time-nuts] How a phase microstepper works

Tom Van Baak tvb at leapsecond.com
Fri Mar 11 19:51:23 EST 2005

Suppose you had a 10 MHz signal from a frequency
standard that was used to drive a clock. The clock
probably digitally divides the 10 MHz down to 1 PPS
in order to display the time.

Now imagine a simple box of TTL parts that once in
a while, under user control, inserts one, or deletes
one of those 10 MHz clock pulses. The net effect
this action has on your clock is that it will advance
or retard the time by 100 ns. Make sense?
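
To make the arithmetic concrete, here is a toy Python sketch (my
own illustration, not anything in the actual hardware) of why one
inserted or deleted pulse moves the divided-down 1 PPS tick by 100 ns:

```python
# Toy model of a divide-by-10,000,000 clock: the 1 PPS tick fires
# after counting 10^7 pulses of the 10 MHz input (100 ns each).
# Inserting an extra pulse fills the counter 100 ns early; deleting
# one fills it 100 ns late. All names here are illustrative.

PERIOD_NS = 100  # one 10 MHz cycle

def pps_tick_ns(inserted_pulses=0):
    """Time (ns) at which the divider reaches its 1 PPS tick,
    given how many pulses were inserted (+) or deleted (-)."""
    pulses_from_source = 10_000_000 - inserted_pulses
    return pulses_from_source * PERIOD_NS

print(pps_tick_ns(0))                    # 1000000000, i.e. exactly 1 s
print(pps_tick_ns(0) - pps_tick_ns(+1))  # 100: clock advanced 100 ns
print(pps_tick_ns(-1) - pps_tick_ns(0))  # 100: clock retarded 100 ns
```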

So that's a phase stepper. In the event that 100 ns
is too coarse a granularity, there are several clever
ways to implement smaller steps. This is not how
it's done in the real world, but a textbook example
is that you could multiply the 10 MHz by 50 to get
500 MHz, insert or delete your one clock pulse,
and then divide by 50 again. Now you have a 2 ns
phase stepper instead of a 100 ns phase stepper.
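
In code, the step size is just the period of whatever clock you are
inserting or deleting pulses from (again, my illustration only):

```python
# Phase step from inserting/deleting one pulse = one period of the
# clock being manipulated. Multiplying 10 MHz by 50 first shrinks
# the step by the same factor. Illustrative names, not real hardware.

def step_ns(f_hz):
    """Phase step, in ns, from one inserted or deleted pulse at f_hz."""
    return 1e9 / f_hz

print(step_ns(10e6))       # 100.0 ns step at 10 MHz
print(step_ns(50 * 10e6))  # 2.0 ns step at 500 MHz
```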

The point is you can make a box that takes 10 MHz
in and puts out 10 MHz with an occasional pulse, at
your discretion, that's a little bit wider or narrower.

Now, what if the user made one of those time steps
every hour? What is the average frequency of the
10 MHz output then? Well, it's 10 MHz plus 2 ns
per hour, which comes to 10 MHz + 5.5e-13. You
have now changed the average frequency by about
5.5e-13. Similarly, if you made a 2 ns step exactly
every 2000 s, the frequency offset would be
precisely 1.0e-12.

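The averaging works out like this (a sketch; the function name is mine):

```python
# Average fractional frequency offset from repeated phase steps:
# offset = (step size) / (interval between steps), both in seconds.

def freq_offset(step_s, interval_s):
    """Fractional frequency change from one phase step per interval."""
    return step_s / interval_s

print(freq_offset(2e-9, 3600))  # 2 ns every hour -> about 5.6e-13
print(freq_offset(2e-9, 2000))  # 2 ns every 2000 s -> 1e-12
```
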
With this arrangement of continuous but slow,
periodic micro steps instead of one-time macro
steps, you no longer have just a phase stepper
but a way to change frequency: a frequency offset
generator, a narrow-bandwidth frequency synthesizer,
a digital replacement for a C-field adjustment. So
that's a phase microstepper.

The old Austron 2055 uses a hybrid digital/analog
PLL and VCO technique and its step size is on
the order of 100 ps. Given the filter effects of the
PLL/VCO, given the typical amount of short-term
noise in a cesium standard or GPS receiver, and
given the long averaging times used when comparing
atomic standards, a 100 ps phase step is not directly
visible, yet a 100 ps step once every 100 s becomes
a smooth, measurable 1e-12 average frequency offset.

One more example: if you need a cesium frequency
offset of, say, 1.234E-12 then 100 ps steps every
81 seconds will do. The math is easy; you get the idea.
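
And going the other way, solving for the step interval given a
desired offset (again just a sketch with names of my choosing):

```python
# Interval between phase steps needed for a desired fractional
# frequency offset: interval = (step size) / (desired offset).

def step_interval_s(step_s, offset):
    """Seconds between phase steps to average out to `offset`."""
    return step_s / offset

print(step_interval_s(100e-12, 1.234e-12))  # ~81 s, 100 ps steps
print(step_interval_s(100e-12, 1e-12))      # ~100 s, as with the 2055
```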

If there's interest I'll post more info on the internals
of the 2055. There's also the old TST 6490. These
both date back to the 80's, I think. The only phase
microstepper in current production that I know of is
made by Symmetricom for use with their H-maser.
These are $$ and are what NIST and USNO use.

I think current model cesium standards from Agilent
and Symmetricom include hardware to make digital
phase and frequency changes.

For use with a 5061A I would think it is possible to
home-brew a microstepper.

Or, as I mentioned earlier, just handle small frequency
offsets in software -- if you know your cesium is off by
1.23e-12 this month, just subtract that quantity from
any measurements you make with your cesium.

