[time-nuts] 50 vs 75 ohm cables

WB6BNQ wb6bnq at cox.net
Thu May 10 13:50:59 EDT 2007




   John Day wrote:

     At 11:00 AM 5/10/2007, WB6BNQ wrote:
     >Hi Peter,
     >
     ><snipped>
     >
     >In an effort to standardize, the industry selected the midpoint
     >between the 35 Ohms and the 72 Ohms, that being 50 Ohms.  This
     >forced the antenna manufacturers to design their antennas for
     >50 Ohms or provide a matching network.

     Nice thought, but in fact the comments made earlier are more
     correct.  Early coaxial connectors go back to Belling-Lee in the
     UK in the early 1920s, with what has now become the IEC 169-2 or
     'PAL' connector.  Although not a controlled-impedance connector,
     it made the use of coaxial cable convenient.  But like most
     coaxial components, it needed to wait for radar to become really
     useful.

     During the war it was used for video and IF connections in radar
     equipment.

   The reason that coax was not widely used was cost.  It was only
   readily available in large quantities, so the only people able to
   get and use it were the Government and government-funded research.



     52 ohms was in fact the compromise.  In 1929, experimental work
     at Bell Laboratories found that the ideal impedances for coaxial
     cable were 30 ohms for high power, 60 ohms for high voltage and
     77 ohms for low attenuation.  Thirty-ohm cable is very difficult
     to make, not very flexible, and is expensive.  So the Bell folk
     decided that 52 ohms was the best compromise between 30 and 60
     ohms.  This has become the 50 ohm cable we know today.

   I do not directly know of the referenced Ma Bell experiments.
   However, if such claims were made, they were certainly taken out of
   context !  Without the context or application that was being
   researched, these numbers have little meaning.

   Why make coax for 30 Ohms when no systems were in existence that
   utilized such an impedance ?

   How does 60 Ohms differ from 77 Ohms when discussing high voltage ?

   How much high voltage are we talking about ?

   How does 77 Ohms provide lower attenuation than 60 Ohms ?

   The fact of the matter is that some concept was being studied by MA
   BELL utilizing what was currently available at the time of that
   study.  Those values could have been determined based upon the quality
   of available materials of that time frame.

   If the high voltage was "DC" then the cable impedance has little
   importance, except during the rise time from off to on,
   notwithstanding dielectric leakage.  If it was "AC" then :

   What was the applied frequency ?

   With no facts as to the targeted application or the design criteria,
   the reasons for the above quoted values have no merit.  What does
   make sense is to provide a medium that satisfies the maximum
   transfer of power between two points, no matter what the matching
   impedances are.  This would mean the cable impedance would match
   whatever the system (source and terminating) impedance is.



     Seventy-five ohms is also a compromise; in fact a folded dipole
     has, if I recall correctly (I can't check any books as I am away
     from the office), a feed point impedance of 73 ohms.

   SORRY, a folded dipole is referred to as having a 300 Ohm feed point.

     So 75 ohms as we know it now is a compromise between the low
     attenuation 77 ohms and the 73 ohm dipole feed-point.

     We also tend to think in terms of ONLY 50 and 75 ohm.  But in
     fact (again IIRC) RG-8 cable is actually 52 ohms, RG-59A is 73
     ohms (wonder why!) but RG-58B is 75 ohms.  RG-11, which found use
     for video cable during the war, is 75 ohms.  But amongst the
     older cables there are cables at 52.5, 51, 76 as well as 50, 52
     and 75 ohms.  Newer types tend to be either 50 or 75 ohm, with
     93, 120, 125 and even 950 ohm available for special uses.

     >The television world originally used 300 Ohms at the antenna,
     >along with 300 Ohm "Twin Lead" and the 300 Ohm input to the
     >television itself.  The reason for the 300 Ohms was due to the
     >use of a folded dipole which has a characteristic impedance of
     >300 Ohms.

     Well, again IIRC, it is not actually 300 ohms; if memory serves
     it is in fact more like 273 ohms - plus or minus the effect of
     conductor sizes.

   OK, if we are going to split hairs, then the folded dipole is a 4:1
   transformation and the nominal free-space feed impedance of a dipole
   is 72 Ohms.  So a 4:1 ratio would make the folded dipole, in
   actuality, 288 Ohms.  288 is an awkward number to roll off your
   tongue, which is why it is called 300 Ohms.



     >  The folded dipole design could be tweaked to provide a wide
     >frequency range needed to cover all the TV frequencies,
     >especially with the UHF channels.
     >
     >The switch to coax for TV use came about in an effort to prevent
     >or greatly reduce ghosting problems and, for cable systems, as a
     >reliable means of transporting the signals to many locations.
     >"Twin Lead" cannot tolerate being near metal objects and is
     >unable to be buried.  "Coax" contains the signal entirely within
     >its own shielded structure and therefore can be buried and laid
     >next to other metal objects without degrading the signal
     >quality.
     >
     >The reason 75 Ohms was selected for the TV world was because a
     >simple, easily constructed, 4:1 balun (transformer) would
     >transform 300 Ohms to 75 Ohms.

     That factor accelerated its acceptance.  But in fact 75 ohms was
     well established in the TV industry before coax found its way to
     the receiving antenna - as the cable for carrying video signals,
     due to the lower losses it exhibited when run all the way around
     buildings.

   SORRY, again this is simply not so !

   Baseband video is not "RF" at 60 MHz and up.  Baseband video barely
   made it to 5 MHz in the Black & White days.

   Again, "Twin Lead" was dirt cheap compared to the manufacturering cost
   of coax back in the 1950's and 1960's.  "Twin Lead," if made with
   quality material, has much lower loss then coax, especially at very
   high frequencies.  The TV transmitters were of lower power and their
   antennas were not all that high in gain in those early years.  So,
   with relatively insensitive TV's of the early years and the poor gain
   of the receiving TV antennas, you really needed all the help you could
   get.

     Although other cable types (a shielded twisted pair amongst them)
     were tried, manufacturers were already making lots of 75 ohm
     coax, so it was cheap and readily available.  The fact that blind
     Freddy could wind baluns cheaply just made it more economical.
     It is said by some historians of television that 75 ohms was
     adopted early on because of low losses and the fact that 75 ohm
     resistors (for terminations and feed resistors) could easily be
     made by paralleling two 150 ohm units, before the E-24 values
     became common.

     >  Trying to go from 300 Ohms to 50 Ohms would require a 6:1
     >ratio with increased I^2R losses and greater difficulty in
     >obtaining wide band operation in the early days of ferrite
     >mixes.

     Remember also that early television antenna baluns didn't use
     ferrite at all!  Many of them were reasonably large and used
     'air' cores.  Baluns with ferrite cores only became common much
     later because they could be wound by automatic machines and were
     better on UHF, although many viewers would never have noticed the
     difference.

     A few points of reference: Coaxial cable was invented in 1884 and
     patented in Germany by Werner von Siemens, the founder of
     Siemens.  But nothing really happened, as they couldn't figure
     out a use for it.  About a decade later Tesla took out a US
     patent, which you should be able to find at the USPTO website.
     Crudely made coaxial cable, or perhaps more correctly, shielded
     cable of coaxial construction, is found in the UK in the early
     1920s.  In the US it was not until the work at Bell Labs in 1929
     that coax became widely known there.  Connectors were an issue,
     and whilst the Brits had Belling-Lee from 1922, the Americans
     didn't have a standard-ish style of connector until the advent of
     the 'UHF' (PL-259, SO-239) connector during the war - as a result
     of needing a connector for carrying radar video signals.

     John

   OK, I am ready for round two.

   Bill....WB6BNQ


More information about the time-nuts mailing list