[time-nuts] Commercial software defined radio for clock metrology

Sherman, Jeffrey A. (Fed) jeff.sherman at nist.gov
Wed Aug 3 14:05:55 EDT 2016

On Jul 29, 2016, at 5:14 PM, Kevin Rosenberg wrote:

I had a question about your experience. You mentioned using an input
signal near the maximum of the USRP’s ADC to get the best SNR. I
reviewed the schematics and application notes and found a maximum Vpp
mentioned of 3.3V. I was wondering what voltage you were using to
drive the USRPs. When I go above 1.5-2 Vpp, I start seeing signal
distortion and little further increase in amplitude.

The absolute limit is not the 3.3 V supply but a voltage reference embedded in the ADC (an ADS62P44): 2 V peak-to-peak. For the studies in the paper, the RF power into the BasicRX board (which transformer-couples the signal into the differential ADC inputs) was kept below 0 dBm, and more often near -3 dBm. I first noticed distortion in the harmonic content of down-converted signals beyond about half-scale on the ADC. The "gain" setting in the firmware has an impact as well; I believe I set it to 0 (i.e., no gain) for all the published data. N.B. the ADC is not a 50-ohm device; the daughterboard does the impedance matching.
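For reference, the dBm figures above can be mapped to peak-to-peak voltage assuming a sine wave into a 50-ohm load (i.e., at the daughterboard input, before the transformer's turns ratio changes the voltage seen at the ADC pins). A quick sketch of that conversion; the function name is illustrative, not part of any USRP software:

```python
import math

def dbm_to_vpp(p_dbm, r_ohms=50.0):
    """Peak-to-peak voltage of a sine wave of power p_dbm into r_ohms."""
    p_watts = 10 ** (p_dbm / 10) / 1000   # dBm -> watts
    v_rms = math.sqrt(p_watts * r_ohms)   # P = Vrms^2 / R
    return 2 * math.sqrt(2) * v_rms       # Vpp = 2*sqrt(2)*Vrms for a sine

print(f"{dbm_to_vpp(0):.3f} Vpp at 0 dBm")    # ~0.632 Vpp
print(f"{dbm_to_vpp(-3):.3f} Vpp at -3 dBm")  # ~0.448 Vpp
print(f"{dbm_to_vpp(4):.3f} Vpp at +4 dBm")   # ~1.00 Vpp
```

By this accounting, half of the ADC's 2 Vpp range corresponds to roughly +4 dBm in a 50-ohm system, so the 0 dBm / -3 dBm operating points leave comfortable headroom below where distortion was first observed.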

Best wishes,
Jeff Sherman
National Institute of Standards & Technology
Time and Frequency Division (688)
325 Broadway / Boulder, CO 80305 / 303-497-3511
