[time-nuts] 50 vs 75 ohm cables

John Day johnday at wordsnimages.com
Fri May 11 15:58:29 EDT 2007


At 03:26 PM 5/11/2007, you wrote:

>phk at phk.freebsd.dk said:
> > I can confirm that the choice of 75 Ohm for telecom use indeed is
> > because of the low attenuation.  The first use of coax was for
> > "Carrier Frequency" systems, where a number of telephone conversations
> > were AM modulated on individual carriers, usually 4 kHz apart.
>
>What's the attenuation mechanism?
>
>I thought the old 10 megabit vampire-tap Ethernet picked 50 ohms because of
>lower attenuation.  The story I remember is that for a given outside
>diameter, the inside diameter was bigger at a lower impedance.  The main
>losses were resistive on the center conductor due to skin effect.  A bigger
>center conductor had more area at a given skin depth and hence lower losses.

In fact that is correct. The Bell Labs experiments were done with a 
constant size CENTRE conductor.
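
As a quick numerical sketch of that trade-off (not the Bell Labs data,
just an ideal coax with pure skin-effect conductor loss, the outer
diameter held fixed and the centre conductor swept; the frequency,
dimensions and dielectric constants below are illustrative assumptions):

# Conductor (skin-effect) loss of an ideal coax vs. characteristic
# impedance, outer diameter fixed, centre conductor swept.
# Illustrative numbers only, not the Bell Labs measurements.
import numpy as np

f     = 10e6                # frequency in Hz (assumed)
sigma = 5.8e7               # copper conductivity, S/m
mu0   = 4e-7 * np.pi
b     = 5e-3                # outer conductor inner radius, m (assumed)
Rs    = np.sqrt(np.pi * f * mu0 / sigma)    # surface resistance, ohm/square

for name, er in [("air dielectric", 1.0), ("solid PE, er ~ 2.3", 2.3)]:
    a     = np.linspace(0.05 * b, 0.95 * b, 2000)   # centre conductor radius
    Z0    = (59.96 / np.sqrt(er)) * np.log(b / a)   # characteristic impedance
    # series skin-effect resistance of both conductors per metre,
    # divided by 2*Z0, gives the conductor attenuation in nepers/m
    alpha = (Rs / (2 * np.pi)) * (1 / a + 1 / b) / (2 * Z0)
    i     = np.argmin(alpha)
    print(f"{name}: minimum conductor loss near Z0 = {Z0[i]:.1f} ohm")

With an air dielectric the minimum lands near 77 ohms; with solid
polyethylene it drops to roughly 50 ohms, which is at least consistent
with the story above.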

But there was an overriding concern for the Ethernet developers: if
they chose a 75 ohm cable, it was felt inevitable that somebody would
use 'F' connectors and cheapen the whole thing. If they stuck with
50 ohm, it was easy to stay with BNC.

In the Ethernet era you are referring to, though, it is the timing
that limits the length of the segment, not the loss. I recall having
some pretty horrible RG-58 style cable in a huge pile on the lab
floor, with as many taps as we could put onto it, and the losses were
not enough to stop it.
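
As a rough check on that timing limit (assuming the classic 10 Mb/s
rate, the 512-bit slot time and a velocity factor of about 0.66 for
RG-58-style cable, and ignoring the transceiver and repeater delays
that eat most of the real budget):

# Upper bound on collision-domain length from the Ethernet slot time,
# ignoring transceiver/repeater delays (which consume most of the budget).
bit_rate  = 10e6       # bits/s, classic Ethernet
slot_bits = 512        # slot time in bit times
vf        = 0.66       # assumed velocity factor for RG-58-style cable
c         = 3e8        # m/s

slot_time  = slot_bits / bit_rate       # 51.2 microseconds
round_trip = slot_time * vf * c         # metres of cable, out and back
print(f"one-way length bound: {round_trip / 2:.0f} m")

The practical per-segment limits (500 m for thick coax, 185 m for
thin) sat well inside that bound, set mostly by attenuation and the
overall delay budget.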

John