[time-nuts] 50 vs 75 ohm cables
jmfranke at cox.net
Fri May 11 15:43:55 EDT 2007
As the impedance goes up, the current for a given power level drops.
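That relation (P = I^2 * Z0 on a matched line) is easy to check numerically; a minimal sketch, with an arbitrary 1 W signal level chosen just for illustration:

```python
import math

def coax_current(power_w, z0_ohm):
    """Current on a matched line carrying a given power: P = I^2 * Z0."""
    return math.sqrt(power_w / z0_ohm)

# Hypothetical 1 W signal on 50-ohm and 75-ohm lines:
i50 = coax_current(1.0, 50.0)  # ~0.141 A
i75 = coax_current(1.0, 75.0)  # ~0.115 A
```

So the 75-ohm line carries about 18% less current for the same power, which matters when the dominant loss is I^2*R in the conductors.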
----- Original Message -----
From: "Hal Murray" <hmurray at megapathdsl.net>
To: "Discussion of precise time and frequency measurement"
<time-nuts at febo.com>
Sent: Friday, May 11, 2007 3:26 PM
Subject: Re: [time-nuts] 50 vs 75 ohm cables
> phk at phk.freebsd.dk said:
>> I can confirm that the choice of 75 Ohm for telecom use indeed is
>> because of the low attenuation. The first use of coax was for
>> "Carrier Frequency" systems, where a number of telephone conversations
>> were AM modulated on individual carriers, usually 4 kHz apart.
> What's the attenuation mechanism?
> I thought the old 10-megabit vampire-tap Ethernet picked 50 ohms because
> of its lower attenuation. The story I remember is that for a given outside
> diameter, the inside diameter was bigger at a lower impedance. The main
> losses were resistive on the center conductor due to skin effect. A
> bigger center conductor had more area at a given skin depth and hence lower
> loss.
> These are my opinions, not necessarily my employer's. I hate spam.
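The skin-effect argument quoted above can be sketched numerically. For an air-dielectric coax with fixed outer conductor radius b, the characteristic impedance is Z0 = (eta0 / 2*pi) * ln(b/a), and the skin-effect conductor loss per unit length scales as (1/a + 1/b) / ln(b/a). Minimizing that factor over the ratio b/a (a standard textbook result, not anything specific to this thread) lands near b/a ≈ 3.6, i.e. Z0 ≈ 77 ohms for air; a solid-polyethylene dielectric scales this down toward ~51 ohms, which is where both camps in the thread get their numbers:

```python
import math

ETA0 = 376.73  # impedance of free space, ohms

def z0_air(ratio):
    """Characteristic impedance of an air-dielectric coax with b/a = ratio."""
    return (ETA0 / (2 * math.pi)) * math.log(ratio)

def conductor_loss_factor(ratio):
    """Skin-effect loss per unit length, up to a constant factor:
    (1/a + 1/b) / ln(b/a), with the outer radius b held fixed at 1."""
    b = 1.0
    a = b / ratio
    return (1 / a + 1 / b) / math.log(ratio)

# Scan b/a ratios to find the loss minimum for a fixed outer diameter.
loss, ratio = min((conductor_loss_factor(1 + 0.001 * k), 1 + 0.001 * k)
                  for k in range(1, 10000))
z_min = z0_air(ratio)  # minimum-loss impedance, air dielectric
```

Running this gives a minimum near b/a ≈ 3.59 and Z0 ≈ 77 ohms, consistent with the claim that 75-ohm cable was chosen for low attenuation; the same loss factor with a polyethylene dielectric (Z0 divided by sqrt(2.25) = 1.5) gives roughly 51 ohms.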