[time-nuts] 50 vs 75 ohm cables
John Day
johnday at wordsnimages.com
Thu May 10 12:26:05 EDT 2007
At 11:00 AM 5/10/2007, WB6BNQ wrote:
>Hi Peter,
>
><snipped>
>
>In an effort to standardize, the industry selected the mid point between
>the 35 Ohms and the 72 Ohms, that being 50 Ohms. This forced the antenna
>manufacturers to design their antennas for 50 Ohms or provide a matching
>network.
Nice thought, but in fact the comments made earlier are more correct.
Early coaxial connectors go back to Belling-Lee in the UK in the
early 1920s, with what has now become the IEC 169-2 or 'PAL' connector.
Although not a controlled-impedance connector, it made the use of
coaxial cable convenient. But like most coaxial components it needed
to wait for radar to become really useful.
During the war it was used for video and IF connections in radar equipment.
52 ohms was in fact the compromise. In 1929 experimental work at Bell
Laboratories found that the ideal impedances for air-dielectric coaxial
cable were about 30 ohms for maximum power handling, 60 ohms for maximum
voltage rating and 77 ohms for lowest attenuation. Thirty ohm cable is
very difficult to make, not very flexible, and expensive. So the Bell
folk decided that 52 ohms was the best compromise between the
power-handling and attenuation optima. This has become the
50 ohm cable we know today.
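Those three optima fall straight out of the air-line coax geometry. A quick numeric sketch (my own illustration, not from the Bell work: it assumes an air dielectric, a fixed shield diameter D, and the standard proportionalities for skin-effect loss, breakdown field and breakdown voltage) recovers the three figures:

```python
import numpy as np

ETA0 = 376.73  # impedance of free space, ohms


def z0(x):
    # Characteristic impedance of air-dielectric coax; x = D/d,
    # the ratio of shield to center-conductor diameter.
    return ETA0 / (2 * np.pi) * np.log(x)


x = np.linspace(1.05, 10, 200_000)

# Conductor (skin-effect) attenuation for fixed D: proportional to (1 + x)/ln(x).
# Minimum lands near 77 ohms.
z_min_loss = z0(x[np.argmin((1 + x) / np.log(x))])

# Peak power handling for fixed D and a fixed breakdown field at the
# center conductor: proportional to ln(x)/x**2. Maximum lands near 30 ohms.
z_max_power = z0(x[np.argmax(np.log(x) / x**2)])

# Breakdown voltage for fixed D: proportional to ln(x)/x.
# Maximum lands near 60 ohms.
z_max_volts = z0(x[np.argmax(np.log(x) / x)])

print(f"lowest attenuation:   Z0 = {z_min_loss:.1f} ohms")
print(f"highest peak power:   Z0 = {z_max_power:.1f} ohms")
print(f"highest voltage:      Z0 = {z_max_volts:.1f} ohms")
```

Note the numbers shift once a solid dielectric is introduced, which is part of how 77 became 75 in practice.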
Seventy-five ohms is also a compromise. A plain half-wave dipole has,
if I recall correctly because I can't check any books as I am away
from the office, a feed-point impedance of about 73 ohms. So 75 ohms
as we know it now is a compromise between the low-attenuation 77 ohms
and the 73 ohm dipole feed-point.
We also tend to think in terms of ONLY 50 and 75 ohms. But in fact
(again IIRC) RG-8 cable is actually 52 ohms, RG-59A is 73 ohms
(wonder why!) but RG-58B is 75 ohms. RG-11, which found use as video
cable during the war, is 75 ohms. But amongst the older cables there
are types at 52.5, 51 and 76 ohms as well as 50, 52 and 75. Newer
types tend to be either 50 or 75 ohms, with 93, 120, 125 and even
950 ohms available for special uses.
>The television world originally used 300 Ohms at the antenna, along with
>300 Ohm "Twin Lead" and the 300 Ohm input to the television itself. The
>reason for the 300 Ohms was due to the use of a folded dipole which has a
>characteristic impedance of 300 Ohms.
Well, again IIRC, it is not actually 300 ohms; if memory serves it is
in fact closer to 290 ohms (four times the roughly 73 ohm feed-point
of a plain dipole), plus or minus the effect of conductor sizes.
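The factor of four comes from the fold itself: with equal conductor diameters, a folded dipole steps the plain dipole's feed impedance up by the square of the number of conductors. A one-line check, assuming the usual ~73 ohm thin-dipole figure:

```python
Z_DIPOLE = 73.1  # approximate free-space feed-point resistance of a thin half-wave dipole
N_CONDUCTORS = 2  # conductors in the fold, equal diameters -> step-up of N**2 = 4

z_folded = N_CONDUCTORS**2 * Z_DIPOLE
print(z_folded)  # 292.4 ohms: close to, but not exactly, the nominal 300
```

Unequal conductor diameters or spacing shift the step-up ratio away from exactly four, which is the "effect of conductor sizes" mentioned above.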
> The folded dipole design could be
>tweaked to provide a wide frequency range needed to cover all the TV
>frequencies, especially with the UHF channels.
>
>The switch to coax for TV use came about in an effort to prevent or
>greatly reduce ghosting problems and for cable systems as a reliable
>means of transporting the signals to many locations. "Twin Lead" cannot
>tolerate being near metal objects and cannot be buried. "Coax"
>contains the signal entirely within its own shielded structure and
>therefore can be buried and laid next to other metal objects without
>degrading the signal quality.
>
>The reason 75 Ohms was selected for the TV world was because a simple,
>easily constructed, 4:1 balun (transformer) would transform 300 Ohms to
>75 Ohms.
That factor accelerated its acceptance. But in fact 75 ohms was well
established in the TV industry before coax found its way to the
receiving antenna: it was already the standard cable for carrying
video signals around buildings, because of its lower losses over long
runs. Although other cable types (notably shielded twisted pair) were
tried, manufacturers were already making lots of 75 ohm coax, so it
was cheap and readily available. The fact that blind Freddy could
wind the baluns cheaply just made it more economical. It is said by
some historians of television that 75 ohms was adopted early on
because of its low losses and because 75 ohm terminations and feed
resistors could easily be made by paralleling two 150 ohm units,
before the E-24 values became common.
> Trying to go from 300 Ohms to 50 Ohms would require a 6:1 ratio
>with increased I/R losses and greater difficulty in obtaining wide band
>operation in the early days of ferrite mixes.
Remember also that early television antenna baluns didn't use ferrite
at all! Many of them were reasonably large and used 'air' cores.
Baluns with ferrite cores only became common much later because they
could be wound by automatic machines and were better on UHF, although
many viewers would never have noticed the difference.
A few points of reference: coaxial cable was invented in 1884 and
patented in Germany by Werner von Siemens, the founder of Siemens.
But nothing much came of it, as no one could figure out a use for it.
About a decade later Tesla took out a US patent, which you should be
able to find at the USPTO website. Crudely made coaxial cable, or
perhaps more correctly shielded cable of coaxial construction, is
found in the UK in the early 1920s. In the US it was not until the
work at Bell Labs in 1929 that coax became widely known there.
Connectors were an issue, and whilst the Brits had Belling-Lee from
1922, the Americans didn't have a standard-ish style of connector
until the advent of the 'UHF' (PL-259/SO-239) connector during the
war, as a result of needing a connector for carrying radar video signals.
John