30 December 2024

A brief history of cable impedance

The earliest viable long-distance electrical communications system was the telegraph, and its introduction spawned a whole range of new studies, techniques and products intended to maximise its benefits and efficiency. New industries, producing long lengths of ductile, corrosion-resistant wire, termite-resistant wooden poles and ceramic insulators, were born, and their successors are amongst the largest corporations of the 21st century.

It is a plausible piece of folklore that the choice of wire thickness and the position of the insulators on the ubiquitous 'telegraph poles' were such that the characteristic impedance of the resulting transmission lines was 600 ohms. It is certainly true that the characteristic impedance of a wire-pair transmission line is a function of the wire thickness, the distance between the conductors and the permittivity of the insulation between them; insert the typical dimensions used on early telegraph poles into that function and a figure of around 600 ohms is a likely outcome.
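As an illustration only, a quick calculation of that function for a parallel-wire line in air, using assumed open-wire dimensions (the figures below are typical textbook values, not taken from this article), lands in the region of 600 ohms:

    import math

    # Characteristic impedance of a parallel-wire line:
    # Z0 = (120 / sqrt(er)) * acosh(D / d)
    er = 1.0      # relative permittivity of air
    d = 0.003     # conductor diameter in metres (assumed ~3 mm wire)
    D = 0.30      # centre-to-centre spacing in metres (assumed ~30 cm)

    z0 = (120 / math.sqrt(er)) * math.acosh(D / d)
    print(f"Z0 = {z0:.0f} ohms")   # roughly 640 ohms - of the order of 600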

Whether 'folklore' or not, 600 ohms was widely adopted as the 'standard' for telecommunications systems and, later, broadcast studio installations. Sending and receiving equipment, and the cables linking them, therefore all had to work at 600 ohms, providing optimum signal transfer and consistency in the operation of the hybrid transformers used to separate the relative sending and receiving levels experienced by telephone users. In practice, more modern multi-pair cables had a characteristic impedance closer to 140 ohms, resulting in the widespread use (in transmission line applications) of transformers, typically of 2:1 ratio, to convert this to 600 ohms.
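The arithmetic behind that conversion is the standard rule that a transformer transforms impedance by the square of its turns ratio; a rough sketch, using only the figures quoted above:

    # A transformer reflects impedance by the square of its turns ratio.
    turns_ratio = 2       # 2:1 step-up, as quoted above
    z_cable = 140         # ohms, typical multi-pair cable
    z_presented = z_cable * turns_ratio ** 2
    print(z_presented)    # 560 ohms - close to the nominal 600 ohm standard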

A 600 ohm sending and receiving impedance had been adopted by the broadcasters from the beginning of the industry in the 1920s and continued as the industry and its infrastructure grew. During the 1960s, a period which saw further growth in radio and television output and facilities, it became apparent that the sheer volume of cable installed in the main distribution frames of the large broadcasting centres had effectively generated a large self-capacitance which, in conjunction with the 600 ohm impedance, began to manifest itself as noticeably poor frequency response. As a quick fix, studio sending impedances were dropped, typically to 50 ohms, and eventually the practice of sending from 'low' impedance studio outputs into high-impedance or 'bridging' loads was universally adopted for subsequent analogue use, without, it should be noted, any change in the cable infrastructure. The widespread use of 600 ohm systems persisted for much longer in telecommunications.
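To see why dropping the sending impedance helped, note that the source impedance and the accumulated cable capacitance form a simple RC low-pass filter. The sketch below is illustrative only; the 100 nF capacitance is an assumed figure, not one taken from the article:

    import math

    c_frame = 100e-9      # farads of accumulated frame wiring capacitance (assumed)

    def cutoff_hz(r_source: float) -> float:
        """-3 dB point of the RC low-pass formed by source impedance and capacitance."""
        return 1 / (2 * math.pi * r_source * c_frame)

    print(f"600 ohm source: {cutoff_hz(600):.0f} Hz")   # ~2.7 kHz - audibly dull
    print(f"50 ohm source:  {cutoff_hz(50):.0f} Hz")    # ~32 kHz - above the audio band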

Given that characteristic impedance only has significance where cable distances are a significant fraction of the wavelengths of the signals being carried, it is perhaps odd that 600 ohm usage became so prevalent in broadcast studio centres, where the cable runs in general didn't come close to the distances at which characteristic impedance is an important factor. The adoption of 'unmatched load' practice, other than for specific long-distance circuits, reduced the importance of having to consider specific values for the characteristic impedance of cabling, although other factors, such as size, loss, noise resistance and cost, still influenced the physical characteristics of cables. The introduction of digital technology, however, revived the importance of characteristic impedance, as cables now had to demonstrate reliable and predictable performance at frequencies significantly beyond those of their analogue counterparts and were, of course, now operating with signal wavelengths closer to the run-lengths of the cables themselves. The AES digital audio specification defines an operating impedance of 110 ohms but also allows an operating tolerance that has enabled 100 ohm fixed structured cabling installations to be used for the distribution of digital audio with relative confidence.
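A rough indication of why that tolerance is workable: the mismatch between a 110 ohm interface and a 100 ohm line produces only a small reflection. The sketch below is a standard transmission-line calculation, not a figure from the AES specification itself:

    import math

    z_nominal = 110.0     # ohms, AES digital audio interface
    z_cable = 100.0       # ohms, structured-cabling pair

    gamma = (z_cable - z_nominal) / (z_cable + z_nominal)
    return_loss_db = -20 * math.log10(abs(gamma))

    print(f"reflection coefficient: {gamma:.3f}")    # about -0.048
    print(f"return loss: {return_loss_db:.1f} dB")   # roughly 26 dB - a modest mismatch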

The widespread adoption of 100 ohm cables in digital applications has led to the reverse question of whether such cables are suitable for traditional analogue applications. The simple answer is yes, with caveats: if the analogue application is, for example, the use of a quad-construction cable to minimise induced noise, or high-power transfer (such as loudspeaker use), it is less likely to be well served by a 100 ohm cable. In general terms, however, using cables such as Canford's D'n'A (Digital and Analogue) range in analogue applications will be perfectly acceptable.

In summary, at analogue frequencies, where the 'normal' cable-run length, usually well below a kilometre, is an insignificant fraction of the signal wavelength, the specific characteristic impedance of a cable in a modern 'bridging' application has very little relevance to performance. In digital applications it is vitally important to the reliability of that performance, particularly over longer run-lengths. To put this in context, a 1 kHz signal travelling along a typical cable has a wavelength of approximately 200 kilometres, whereas the frequency response needed to handle an AES/EBU digital signal comfortably could imply a wavelength of as little as 200 metres!
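Those closing figures follow from the usual wavelength relation, wavelength = velocity / frequency, with the signal travelling at some fraction of the speed of light in the cable. The velocity factor of 0.66 below is an assumed typical value, chosen to match the article's round numbers:

    C = 3.0e8                # free-space speed of light, m/s
    velocity_factor = 0.66   # assumed typical value for a plastic-insulated cable

    def wavelength_m(freq_hz: float) -> float:
        """Wavelength of a signal travelling along the cable, in metres."""
        return C * velocity_factor / freq_hz

    print(f"1 kHz: {wavelength_m(1e3) / 1000:.0f} km")   # ~200 km
    print(f"1 MHz: {wavelength_m(1e6):.0f} m")           # ~200 m - the AES/EBU ballpark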