[time-nuts] Are cable delays frequency dependent?
magnus at rubidium.dyndns.org
Sat Sep 12 21:36:04 UTC 2009
Bruce Griffiths wrote:
> Bruce Griffiths wrote:
>> Bruce Griffiths wrote:
>>> Hal Murray wrote:
>>>> If so, what's the mechanism?
>>>> I know that attenuation is frequency dependent due to skin effect but I can't
>>>> turn that into variable delays. Is there a magic term I should google for
>>>> and/or does anybody have a good URL?
>>>> Context is a memory from 20 years ago. I think it was a data sheet or app
>>>> note for clock recovery on a T1 line. Maybe it was just explaining the specs
>>>> for a line amplifier. The idea was that the recovered clock would shift
>>>> depending on the frequency of the signal. The frequency depended on the data
>>>> pattern so you could harass the clock recovery by picking nasty data patterns.
>>>> I think I almost understood it back then when I had the info in front of me.
>>>> I've tried to remember or reconstruct it a couple of times over the years,
>>>> but I've never been successful.
>>> Even in the RF region where cables act like distributed LC transmission
>>> lines with a relatively constant characteristic impedance, all
>>> dielectrics are lossy.
>>> A lossy dielectric has a frequency dependent dielectric constant.
>>> This is particularly evident in the vicinity of an absorption edge.
>>> Even remote from an absorption edge the dielectric constant varies
>>> with frequency.
>>> The dielectric constant's behaviour as a function of frequency can be
>>> approximated by a Cole-Cole relationship.
>>> This in turn can be approximated by a set of Sellmeier equations, one
>>> per absorption edge.
>>> Dielectric loss (and dispersion) increase with the water content of
>>> the dielectric.
>>> At lower frequencies the cable acts like a distributed RC transmission
>>> line with a strongly frequency dependent characteristic impedance and
>>> propagation delay.
>>> Google the telegrapher's equations for details.
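The quoted telegrapher's-equation behaviour can be sketched numerically. This is a minimal illustration, not from the original posts: the RLGC values per metre below are assumed, roughly coax-like numbers chosen only to show the RC-line regime at low frequency and the constant-impedance LC regime at high frequency.

```python
import cmath
import math

# Assumed, illustrative per-metre line constants (not measured data)
R = 0.5      # ohm/m  series resistance
L = 250e-9   # H/m    series inductance
G = 1e-9     # S/m    shunt conductance
C = 100e-12  # F/m    shunt capacitance

def line_params(f):
    """Return (Z0, phase delay per metre) at frequency f in Hz,
    from the telegrapher's equations."""
    jw = 2j * math.pi * f
    z = R + jw * L             # series impedance per metre
    y = G + jw * C             # shunt admittance per metre
    z0 = cmath.sqrt(z / y)     # characteristic impedance
    gamma = cmath.sqrt(z * y)  # propagation constant alpha + j*beta
    beta = gamma.imag
    delay = beta / (2 * math.pi * f)  # phase delay per metre, seconds
    return z0, delay

for f in (1e3, 1e5, 1e7):
    z0, d = line_params(f)
    print(f"{f:10.0f} Hz  |Z0| = {abs(z0):7.2f} ohm  delay = {d * 1e9:6.3f} ns/m")
```

At low frequency the magnitude of Z0 is large and strongly frequency dependent (the distributed RC regime); by 10 MHz it has settled near sqrt(L/C) = 50 ohm and the delay near sqrt(LC) = 5 ns/m.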
>> For measurements of the dispersion (variation of propagation delay with
>> frequency) of coaxial cable in the RF region see:
> The above paper indicates that where skin effect dominates the cable's
> propagation delay dispersion (probably the case for all coax in the RF
> region, except perhaps when superconductors are used), the dispersion
> can be estimated from the variation of cable loss with frequency.
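That loss-to-dispersion estimate can be sketched as follows. This is a hedged illustration, not the paper's method verbatim: it uses the classic result that for a minimum-phase cable whose attenuation rises as sqrt(f) (skin effect), the excess phase lag in radians equals the attenuation in nepers. The 3 dB per 100 m at 100 MHz figure is an assumed, RG-58-like value.

```python
import math

def excess_delay(f_hz, loss_db_per_100m_at_100mhz=3.0):
    """Estimate excess phase delay in s/m at f_hz for a skin-effect-
    limited cable, from its loss spec (assumed sqrt(f) loss law)."""
    # Scale the loss spec to frequency f_hz via the sqrt(f) law
    loss_db_per_m = loss_db_per_100m_at_100mhz / 100 * math.sqrt(f_hz / 100e6)
    # Convert dB to nepers; for skin effect, excess phase (rad) = alpha (Np)
    alpha_nepers = loss_db_per_m / 8.686
    w = 2 * math.pi * f_hz
    return alpha_nepers / w  # excess phase delay, seconds per metre

for f in (1e6, 10e6, 100e6):
    print(f"{f / 1e6:6.0f} MHz  excess delay = {excess_delay(f) * 1e12:7.3f} ps/m")
```

The excess delay falls as 1/sqrt(f): tens of ps/m at 1 MHz down to a few ps/m at 100 MHz for this assumed loss, so a cable is noticeably dispersive at the low end of the RF region.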
Skin effect is pretty apparent when doing TDR of sufficient resolution.
Its signature dies off with time: the opposing field is inductive and
depends on the rate of change of current, so as that change stops, so
does the reverse EMF in the conductor core. This can be seen when
viewing a TDR trace, or the response to a square wave with a
sufficiently fast transition and no ringing or impedance mismatch.
Group delay is the derivative of the phase response, and the phase
response varies when the amplitude response changes.
It is usually safe to assume the group delay is anything but flat.
(Engineer's approach: stay safe by assuming imperfections.)
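The group-delay point can be made concrete numerically. This is a toy sketch, not measured data: the phase response below is an assumed model combining an ideal 50 ns delay with a skin-effect-like excess phase proportional to sqrt(f), and the group delay is computed as the negative derivative of phase with respect to angular frequency.

```python
import numpy as np

f = np.linspace(1e6, 100e6, 1000)  # Hz
w = 2 * np.pi * f

# Toy phase response: ideal 50 ns delay plus an assumed skin-effect-like
# excess phase term proportional to sqrt(omega)
tau0 = 50e-9
phase = -(w * tau0 + 1e-4 * np.sqrt(w))  # radians

# Group delay: tau_g = -d(phase)/d(omega), here by numerical differentiation
tau_g = -np.gradient(phase, w)

print(f"group delay at {f[0] / 1e6:5.0f} MHz: {tau_g[0] * 1e9:.2f} ns")
print(f"group delay at {f[-1] / 1e6:5.0f} MHz: {tau_g[-1] * 1e9:.2f} ns")
```

Even this mild loss model gives a group delay that varies by many nanoseconds across the band, which is why assuming a flat group delay is rarely safe.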