[time-nuts] PM-to-AM noise conversion (was RE: New Question on HP3048A Phase Noise Test Set)

John Miles jmiles at pop.net
Thu Jan 17 13:54:08 EST 2008


> I confirm the spur level on the spectrum analyzer.  However the
> 3048A always
> says the spur is 6 dB lower than it actually is (6 dB plus/minus 0.5 dB).
>
> Can anyone tell me why this is so?
>
> I wondered if it had something to do with phase noise and amplitude noise
> not being the same thing.

That sounds likely.  It's interesting to think about where that 6-dB fudge
factor might be coming from.

The principle behind the most common measurement made with test sets like
the 35601A, 11848A and 11729B/C is to use a double-balanced mixer to
generate a simple baseband (0 Hz IF) output, given RF and LO inputs in
quadrature from the reference and DUT respectively.  The mixer acts as a
phase-to-amplitude converter with a particular gain constant.  Its baseband
output is an AM signal that can be amplified as much as necessary and viewed
on a low-frequency spectrum analyzer with the appropriate level correction.
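
Here's a toy numerical model of that conversion, just to make the idea
concrete.  The gain constant K_phi is a made-up value, not anything from the
HP documentation:

import numpy as np

# With RF and LO in quadrature, the baseband output is approximately
# v(t) = K_phi * dphi(t), where K_phi is the mixer's phase-detector
# gain constant in V/rad (value assumed here for illustration).
K_phi = 0.5                                  # V/rad (assumed)
fs = 1e6                                     # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)               # 10 ms of data

dphi = 1e-4 * np.sin(2 * np.pi * 1e3 * t)    # small DUT phase wobble, rad
v_if = K_phi * dphi                          # baseband AM signal to the LNA

print(np.max(np.abs(v_if)))                  # ~5e-05 V peak = K_phi * 1e-4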

If you are calibrating such a measurement by hand -- e.g., say you're using
an 11729B/C without any system software -- the HP-suggested way is to
temporarily offset the reference by 10 kHz, drop its amplitude by 40 dB, and
look at the amplitude of the resulting beat note on
the spectrum analyzer.  That amplitude is considered the "carrier" level for
the measurement, with all measured data points being expressed relative to
it.
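
To make the bookkeeping concrete, here's the arithmetic with made-up levels
(the actual readings depend entirely on your setup):

# Calibration: reference detuned 10 kHz and padded down 40 dB.
p_beat_dbm = -40.0         # beat-note level read on the analyzer (assumed)

# Measurement: one data point from the real run.
p_point_dbm = -93.0        # spur or noise level on the analyzer (assumed)

rel_db = p_point_dbm - p_beat_dbm
print(rel_db)              # -53 dB relative to the calibration "carrier"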

Then, at each data point, you subtract the system's PM-to-AM conversion
constant, which is basically the post-mixer gain plus the aforementioned
6-dB fudge factor.  If you have a 40-dB LNA between the mixer's IF port and
the analyzer's input port, for example, the correction applied is
-(40 dB + 6 dB), according to HP's app notes.  This procedure works as long
as the mixer's conversion gain stays constant; i.e., as long as its RF port
isn't driven into saturation during the real measurement.
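
In code, the per-point correction amounts to something like this -- the
function and argument names are mine, with the 40-dB gain from the example
above as the default:

def corrected_level_db(rel_to_beat_db, post_mixer_gain_db=40.0, fudge_db=6.0):
    # Subtract the post-mixer gain plus the 6-dB factor from a level
    # that is already referenced to the calibration beat note.
    return rel_to_beat_db - (post_mixer_gain_db + fudge_db)

print(corrected_level_db(-53.0))   # -> -99.0 dBc for the example point above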

That '6 dB' part is probably what is bothering you.  According to the notes
written by the guy whose office door the 3048A authors would have knocked on
for advice (see www.ke5fx.com/Scherer_Art_of_PN_measurement.pdf page 12),
you need to subtract 6 dB from the noise trace for two reasons.  First,
3 dB of it comes from a mysterious "Accounts for RMS value of beat signal
(3 dB)" clause in Scherer's app note.  Second, the AM noise at baseband
contains
energy from both conversion sidebands.  You're trying to measure SSB noise,
so there's the other 3 dB.

I will confess I don't completely follow his reasoning.  First, subtracting
the first 3 dB from the noise trace implies that there's something different
about the "RMS-ness" of SA-measured spur levels versus SA-measured noise
levels.  I don't see the mathematical justification for that.  If nothing
else, the difference between the crest factors of discrete spurs and
Gaussian noise is much greater than a 3-dB correction would account for.
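
That's easy to check numerically -- the crest factor of a sine is a fixed
3.01 dB, while the observed peak of Gaussian noise just keeps growing with
record length:

import numpy as np

def crest_db(x):
    # Peak-to-RMS ratio in dB.
    return 20 * np.log10(np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2)))

t = np.linspace(0, 1, 100_000, endpoint=False)
spur = np.sin(2 * np.pi * 1e3 * t)               # discrete spur
noise = np.random.default_rng(0).standard_normal(t.size)

print(crest_db(spur))    # ~3.01 dB
print(crest_db(noise))   # ~13 dB for 1e5 samples, and climbing with N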

Second, the two mixing products that are folded into the baseband spectrum
by the mixer are identical, since the IF spectrum is symmetrical.  Given 50R
impedance at all three mixer ports, wouldn't the DSB output voltage be 2x (6
dB) the equivalent SSB value, not 1.414x (3 dB) as Scherer indicates?  -3 dB
is what you use to correct for addition of dissimilar sources, -6 dB for
in-phase ones.
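
The voltage-addition argument is also easy to demonstrate numerically: two
identical in-phase signals sum to +6 dB, while two uncorrelated sources sum
to about +3 dB:

import numpy as np

def rms_db(x):
    # RMS level in dB.
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

t = np.linspace(0, 1e-2, 100_000, endpoint=False)
a = np.sin(2 * np.pi * 10e3 * t)             # one folded sideband
print(rms_db(a + a) - rms_db(a))             # -> 6.02 dB (in-phase, coherent)

rng = np.random.default_rng(1)
n1 = rng.standard_normal(t.size)
n2 = rng.standard_normal(t.size)             # a dissimilar, uncorrelated source
print(rms_db(n1 + n2) - rms_db(n1))          # -> ~3 dB (incoherent)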

So my thinking is that while Scherer has the right figure for PM-to-AM noise
conversion (-6 dB), he may have reached it for the wrong reasons.  I do not
believe that any RMS correction needs to be applied between the calibration
beat-note amplitude and the sample-detected AM values that will eventually be
interpreted as either phase noise values *or* discrete spurs.
And I believe that the proper correction value for baseband downconversion
is 6 dB rather than 3 dB, regardless of whether the spectrum is full of
noise, spurs, or both.


