[time-nuts] PM-to-AM noise conversion (was RE: New Question on HP3048A Phase Noise Test Set)
mfeher at eozinc.com
Thu Jan 17 14:19:35 EST 2008
Well, I am not certain I follow everything you said, John. I'm not disputing
it, but I don't totally understand it either. A spectrum analyzer cannot
discern between AM noise and PM noise; in fact they are typically additive on
an RSS basis, but they show up the same on screen. Older spectrum analyzers
do require a correction factor for noise versus discrete signals such as
spurs, because a spur is coherent and constant while noise is statistical,
and the detector in the SA, as you know, treats the two differently. I recall
that in the old days, say 20 years ago, we applied a total of about 2.5 dB of
correction to the noise-floor number: roughly 1.8 dB due to the SA's
detector, and the rest due to the noise bandwidth of the IF filter. So the
total was about 2.5 dB, to the best of my recollection. If you wanted to
measure C/N, you would take the C value as displayed but apply the correction
to the N value read. In modern analyzers this is done automatically, and you
can read C/N directly. We always used a mixer to downconvert to baseband for
our PN measurements. - Mike
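[A sketch of the old-style SA noise corrections described above. The 1.8 dB detector figure comes from the text; the IF filter bandwidths are made-up illustrative values, since the actual noise-bandwidth ratio depends on the particular analyzer.]

```python
import math

def corrected_noise_dbm(displayed_noise_dbm,
                        detector_corr_db=1.8,   # SA detector under-reads noise (from the text)
                        if_3db_bw_hz=1000.0,    # assumed 3 dB resolution bandwidth
                        noise_bw_hz=1120.0):    # assumed equivalent noise bandwidth of same filter
    """True noise level after the detector and noise-bandwidth corrections."""
    nbw_corr_db = 10.0 * math.log10(noise_bw_hz / if_3db_bw_hz)
    return displayed_noise_dbm + detector_corr_db + nbw_corr_db

def c_over_n_db(carrier_dbm, displayed_noise_dbm):
    """C/N: take C as displayed, but correct the N reading first."""
    return carrier_dbm - corrected_noise_dbm(displayed_noise_dbm)

# Carrier at -10 dBm, displayed noise floor at -80 dBm:
print(c_over_n_db(-10.0, -80.0))  # total correction ~2.3 dB, close to the ~2.5 dB recalled
```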
Mike B. Feher, N4FS
89 Arnold Blvd.
Howell, NJ, 07731
From: time-nuts-bounces at febo.com [mailto:time-nuts-bounces at febo.com] On
Behalf Of John Miles
Sent: Thursday, January 17, 2008 1:54 PM
To: Discussion of precise time and frequency measurement
Subject: [time-nuts] PM-to-AM noise conversion (was RE: New Question on
HP3048A Phase Noise Test Set)
> I confirm the spur level on the spectrum analyzer. However the
> 3048A always
> says the spur is 6 dB lower than it actually is (6 dB plus/minus 0.5 dB).
> Can anyone tell me why this is so.
> I wondered if it had something to do with phase noise and amplitude noise
> not being the same thing.
That sounds likely. It's interesting to think about where that 6-dB fudge
factor might be coming from.
The principle behind the most common measurement made with test sets like
the 35601A, 11848A and 11729B/C is to use a double-balanced mixer to
generate a simple baseband (0 Hz IF) output, given RF and LO inputs in
quadrature from the reference and DUT respectively. The mixer acts as a
phase-to-amplitude converter with a particular gain constant. Its baseband
output is an AM signal that can be amplified as much as necessary and viewed
on a low-frequency spectrum analyzer with the appropriate level correction.
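[A sketch of the phase-to-amplitude conversion just described: with the RF and LO inputs in quadrature, the mixer's baseband output is approximately K_phi times the phase difference. The gain constant K_phi below is an assumed value for illustration; real systems measure it from the beat-note slope.]

```python
import math

K_PHI = 0.35  # assumed phase-detector gain constant, volts per radian

def baseband_v(delta_phi_rad):
    """DC-coupled mixer output near quadrature; sinusoidal in the phase error."""
    return K_PHI * math.sin(delta_phi_rad)

# For small phase deviations the converter is essentially linear,
# so baseband voltage fluctuations track phase fluctuations directly:
ratio = baseband_v(0.01) / (K_PHI * 0.01)
print(round(ratio, 4))  # ~1.0
```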
If you are calibrating such a measurement by hand -- say you're using an
11729B/C without any system software -- the HP-suggested way is to
temporarily tweak the reference off-frequency by 10 kHz, drop its amplitude
by 40 dB, and look at the amplitude of the resulting beat note on the
spectrum analyzer. That amplitude is considered the "carrier" level for the
measurement, with all measured data points being expressed relative to it.
Then, at each data point, you subtract the system's PM-to-AM conversion
constant, which is basically the post-mixer gain plus the aforementioned
6-dB fudge factor. If you have a 40-dB LNA between the mixer's IF port and
the analyzer's input port, for example, that constant is -(40 dB+6 dB),
according to HP's app notes. This procedure works as long as the mixer's
conversion gain is constant; i.e., its RF port isn't driven into saturation
during the real measurement.
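[The calibration arithmetic above, as a sketch. The level values are made-up illustrations, not numbers from the text; only the 40 dB LNA gain and the 6 dB factor come from the example given.]

```python
CARRIER_REF_DBM = -40.0  # beat-note amplitude read during calibration; taken
                         # as the "carrier" level for the whole measurement
LNA_GAIN_DB = 40.0       # post-mixer LNA gain (the example figure in the text)
PM_AM_CORR_DB = 6.0      # HP's PM-to-AM conversion fudge factor

def data_point_dbc(measured_dbm):
    """One data point in dBc: relative to the carrier reference, minus the
    system's PM-to-AM conversion constant (post-mixer gain + 6 dB)."""
    return measured_dbm - CARRIER_REF_DBM - (LNA_GAIN_DB + PM_AM_CORR_DB)

print(data_point_dbc(-66.0))  # -72.0 dBc
```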
That '6 dB' part is probably what is bothering you. According to the notes
written by the guy whose office door the 3048A authors would have knocked on
for advice (see www.ke5fx.com/Scherer_Art_of_PN_measurement.pdf page 12),
you need to subtract 6 dB from the noise trace for two reasons. First, 3 dB
comes from a mysterious "Accounts for RMS value of beat signal (3 dB)"
clause in Scherer's app note. Second, the AM noise at baseband contains
energy from both conversion sidebands; since you're trying to measure SSB
noise, there's the other 3 dB.
I will confess I don't completely follow his reasoning. First, subtracting
the first 3 dB from the noise trace implies that there's something different
about the "RMS-ness" of SA-measured spur levels versus SA-measured noise
levels. I don't see the mathematical justification for that. If nothing
else, the difference between the crest factors of discrete spurs and
Gaussian noise is much greater than a 3-dB correction would account for.
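[A quick numeric check of the crest-factor remark above: a sinusoid's peak-to-RMS ratio is a fixed ~3 dB, while Gaussian noise observed over any reasonable interval exceeds that by a wide margin. The sample count and seed are arbitrary choices for the demonstration.]

```python
import math
import random

random.seed(1)

# Peak / RMS of a sinusoid: sqrt(2), i.e. ~3.01 dB
sine_crest_db = 20.0 * math.log10(math.sqrt(2.0))

# Empirical crest factor of unit-variance Gaussian noise over 100k samples
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
rms = math.sqrt(sum(x * x for x in samples) / len(samples))
noise_crest_db = 20.0 * math.log10(max(abs(x) for x in samples) / rms)

print(round(sine_crest_db, 2))   # 3.01
print(round(noise_crest_db, 2))  # typically > 12 dB for this many samples
```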
Second, the two mixing products that are folded into the baseband spectrum
by the mixer are identical, since the IF spectrum is symmetrical. Given 50R
impedance at all three mixer ports, wouldn't the DSB output voltage be 2x (6
dB) the equivalent SSB value, not 1.414x (3 dB) as Scherer indicates? -3 dB
is what you use to correct for the addition of dissimilar (uncorrelated)
sources, -6 dB for identical (coherent) ones.
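[The voltage-versus-power addition point above, as arithmetic: two identical (coherent) voltages sum to 2x the voltage, +6.02 dB, while two uncorrelated sources add in power, +3.01 dB.]

```python
import math

coherent_db = 20.0 * math.log10(2.0)    # identical voltages add: 2x V
incoherent_db = 10.0 * math.log10(2.0)  # uncorrelated sources add in power

print(round(coherent_db, 2))    # 6.02
print(round(incoherent_db, 2))  # 3.01
```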
So my thinking is that while Scherer has the right figure for PM-to-AM noise
conversion (-6 dB), he may have reached it for the wrong reasons. I do not
believe that any RMS correction needs to be applied for the calibration
beat-note amplitude versus that of the sample-detected AM values that will
eventually be interpreted as either phase noise values *or* discrete spurs.
And I believe that the proper correction value for baseband downconversion
is 6 dB rather than 3 dB, regardless of whether the spectrum is full of
noise, spurs, or both.