[time-nuts] Oh dear

Hal Murray hmurray at megapathdsl.net
Mon May 7 20:01:47 UTC 2012

jfor at quikus.com said:
> Suppose you have a perfect, ideal clock that puts out 'convert' pulses at an
> exact rate and is used to strobe a high-precision A/D.

> Now suppose you add jitter to that perfect clock so that the rate stays the
> same but the time interval between successive pulses varies randomly between
> P(1-x) and P(1+x).

> How big would x have to be before anyone could detect any difference in the
> sound?

It's easy enough to work out the right ballpark.

Feed a theoretical sine wave into your A/D.  Set it to the max amplitude and 
max frequency that you expect the system to handle.  Look at the zero 
crossing (max slope).  How much time does it take for the signal to 
transition from an output of 0 to an output of 1 LSB?

If your clock is off by that much in time, the analog voltage that you sample 
will be off by 1 bit.
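The arithmetic above works out to a closed form: a full-scale sine A*sin(2*pi*f*t) has its maximum slope, 2*pi*f*A, at the zero crossing, and one LSB of an N-bit converter spans 2*A / 2**N, so the time to traverse one LSB is 1 / (pi * f * 2**N).  A quick sketch (the 20 kHz / 16-bit numbers are just an illustrative audio case, not from the original post):

```python
import math

def jitter_tolerance(f_max_hz, bits):
    """Time for a full-scale sine at f_max to move one LSB at its zero crossing.

    Max slope of A*sin(2*pi*f*t) is 2*pi*f*A; one LSB spans 2*A / 2**bits.
    Time to cross one LSB at max slope:
        dt = (2*A / 2**bits) / (2*pi*f*A) = 1 / (pi * f * 2**bits)
    The amplitude A cancels out.
    """
    return 1.0 / (math.pi * f_max_hz * 2 ** bits)

# Illustrative audio case: 16 bits, 20 kHz max frequency -> roughly 240 ps
dt_audio = jitter_tolerance(20e3, 16)
```

For 16-bit audio at 20 kHz this gives a clock-jitter budget on the order of a few hundred picoseconds, which is why it is a much bigger problem for converters running at radar-band frequencies.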

It's a big deal at radar frequencies, less so at audio.  You want to make 
sure that you don't use one of the oscillator packages that has a 
programmable PLL.  Their jitter specs are nasty.

These are my opinions, not necessarily my employer's.  I hate spam.
