In a quest to find inventor heroes, Nikola Tesla (1856 – 1943) is frequently credited as the inventor of alternating current (AC). Unfortunately, the world is a messy place, and a long list of contributors to the development of AC needs to be acknowledged. In 1831, Michael Faraday (1791 – 1867) demonstrated that rotary motion could generate electricity. Hippolyte Pixii (1808 – 1835) turned this principle into a practical machine in 1832.
“ZBD” (Károly Zipernowsky, Ottó Bláthy and Miksa Déri) invented a highly efficient transformer in 1885. Transformers are important because they allow different voltages to co-exist on a network. High voltages reduce transmission losses when transferring energy over long distances. Low voltages offer safer environments in domestic, commercial and industrial settings.
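The transmission-loss argument can be made concrete with a small calculation. The sketch below is a simplified model (it ignores reactance and transformer losses, and the 0.5 Ω line and 10 kW load are illustrative numbers, not values from the text): for a fixed power, the current falls as the voltage rises, and the resistive loss falls with the square of the current.

```python
def line_loss_watts(power_w: float, voltage_v: float, line_resistance_ohm: float) -> float:
    """Resistive line loss I^2 * R when delivering a given power at a given voltage."""
    current_a = power_w / voltage_v          # I = P / V
    return current_a ** 2 * line_resistance_ohm

# Same 10 kW load over the same 0.5 ohm line, at two transmission voltages:
low = line_loss_watts(10_000, 240, 0.5)      # ~868 W lost at 240 V
high = line_loss_watts(10_000, 11_000, 0.5)  # ~0.4 W lost at 11 kV
print(f"240 V loss: {low:.0f} W, 11 kV loss: {high:.2f} W")
```

Stepping the voltage up by a factor of ~46 cuts the loss by a factor of ~2,100, which is why transformers made long-distance AC transmission economic.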
Tesla did play a role in AC development, but is usually remembered for the invention of an AC motor, rather than anything to do with transformers or generators. The challenge at the end of the 19th century and the beginning of the 20th was to develop a safe, convenient electrical system that could be commoditized.
Perhaps one should go further back in time, with William Gilbert’s experiments on the relationship between static electricity and magnetism, recorded in De Magnete (1600). Benjamin Franklin is famous for his kite-in-lightning experiment of 1752. Alessandro Volta is credited as the inventor of the first electrical battery, the Voltaic pile, in 1799. Even if one regards Faraday’s experiment as the starting point, it took almost 50 years for the technology to reach a commercially viable stage. In 1878 the time was ripe. Joseph Swan, Thomas Edison and perhaps as many as fourteen others developed domestic light bulbs, which led to the first commercial power plant in 1881.
As AC systems rapidly expanded in the United States, at the expense of DC systems, a media war of the currents emerged in the late 1880s and early 1890s. Many see it as a propaganda campaign by the (DC oriented) Edison Electric Light Company to stifle commercial competition by raising electrical safety issues that put its rival, (AC oriented) Westinghouse Electric Company, in a bad light.
Unfortunately, one of the main challenges with DC is its inability to be transformed to lower or (especially) higher voltages, which was needed for the economic transmission of power over long distances. DC power conversion is not a hurdle today, and HVDC (high-voltage, direct current) systems always include at least one rectifier (converting AC to DC) and one inverter (converting DC to AC). HVDC systems can be less expensive to construct, and offer lower electrical losses compared to equivalent AC systems. HVDC notably allows transmission between unsynchronized AC transmission systems. ABB entered into a contract in China in 2016 to construct an ultra-high-voltage direct-current (UHVDC) line featuring 1.1 MV voltage, 3,000 km transmission length and 12 GW of power, which, when completed, would set world records for highest voltage, longest distance and largest transmission capacity.
There is a lot of uncertainty about whether reason played any role in the selection of an AC frequency. Since the main purpose of electricity was to provide lighting, the only consideration was to prevent flicker. Thus, a multitude of frequencies emerged in the period 1880 through 1900. Single-phase AC was common, and with typical generators being 8-pole machines operating at 2,000 RPM, a common frequency was 133 Hz.
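The 133 Hz figure follows directly from the machine geometry: a synchronous generator produces one electrical cycle per pole pair per revolution, so f = (poles / 2) × (RPM / 60). A minimal sketch (the 4-pole examples are added here for comparison, not taken from the text):

```python
def sync_frequency_hz(poles: int, rpm: float) -> float:
    """Electrical frequency of a synchronous machine: f = (poles / 2) * (rpm / 60)."""
    return (poles / 2) * (rpm / 60)

print(sync_frequency_hz(8, 2000))  # 133.33... Hz, the early lighting frequency
print(sync_frequency_hz(4, 1500))  # 50.0 Hz (European standard)
print(sync_frequency_hz(4, 1800))  # 60.0 Hz (North American standard)
```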
At the other extreme, 16.7 Hz is used in the AC railway electrification systems of Germany, Austria, Switzerland, Sweden and Norway. The low frequency was chosen to reduce energy losses in early 20th-century traction motors. The high voltage (15 kV) enables high power transmission. The preferred standard for new railway electrifications is 50 Hz with an even higher voltage (25 kV). Yet extensions to existing networks often use 15 kV, 16.7 Hz electrification. High conversion costs mean that older systems keep their voltage and frequency, even though the higher frequency would allow on-board step-down transformers weighing about one third as much as those on the older systems.
In 1877, Charles Renard proposed a set of preferred numbers, later adopted as international standard ISO 3 in 1952. This system divides the interval from 1 to 10 into 5, 10, 20 or 40 steps, leading to the R5, R10, R20 and R40 scales, respectively. For some applications, even the R5 series is too fine a graduation. Often a 1, 2, 5 series is used instead: an R3 series rounded to one significant digit, a pseudo preferred number.
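Renard series are geometric: each decade is divided into n equal steps of ratio 10^(1/n). A short sketch of the raw values (note that ISO 3 rounds these to the familiar 1, 1.6, 2.5, 4, 6.3, 10 for R5; the rounding rules themselves are not reproduced here):

```python
def renard_series(steps_per_decade: int, digits: int = 2) -> list[float]:
    """Raw Renard preferred numbers: geometric steps of 10**(1/n) from 1 to 10."""
    return [round(10 ** (i / steps_per_decade), digits)
            for i in range(steps_per_decade + 1)]

print(renard_series(5))  # [1.0, 1.58, 2.51, 3.98, 6.31, 10.0]
```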
Myth has it that when AEG built their European generating facility, its engineers decided to fix the frequency at 50 Hz because the number 60 wasn’t an “R3” preferred number. This standard spread to the rest of the continent, and to Britain after World War II.
Lower frequencies have a number of negative characteristics. Not only is 50 Hz 20% less effective in generation, it is 10-15% less efficient in transmission, and requires up to 30% larger windings and magnetic core materials in transformer construction. Electric motors are much less efficient at the lower frequency, and must also be made more robust to handle the electrical losses and the extra heat generated. But there are advantages too, such as lower impedance losses.
Yet, there are enlightened countries with the insight to follow Tesla’s advice and use the 60 Hz frequency together with a voltage of 220-240 V: Antigua, Guyana, the Leeward Islands, Peru, the Philippines and South Korea.
Originally Europe used 110 V too, just like Japan and North America today. Voltages were increased to get more power with less voltage drop (power loss) from the same copper wire diameter. The US also wanted to change at the time, but because of the cost involved in replacing all electric appliances, decided not to. At the time (the 1950s and 60s) the average US household already had a fridge, a washing machine and so on, which was not yet the situation in Europe.
The end result is that the US system now seems static; it appears much the same as it was in the 1950s and 1960s. It still has to cope with transformer-related problems, such as light bulbs that burn out rather quickly when they are close to the transformer (too high a voltage), or insufficient voltage at the end of the line when too far away (a 105 to 127 volt spread!).
Most new North American buildings provide a 240 volt residential service in the form of two 120 volt conductors and a neutral conductor. A load connected from either 120 volt conductor to the neutral uses 120 volts. A load connected from one 120 volt conductor to the other, without using the neutral, uses 240 volts, which is useful for air conditioners, clothes dryers, electric furnaces, stoves, water heaters and other high-power appliances.
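The trick behind this split-phase service is that the two 120 V legs are 180 degrees out of phase, so their difference is a 240 V waveform. A minimal sketch of the instantaneous voltages (idealized sine waves, ignoring load and impedance):

```python
import math

V_RMS = 120.0  # leg-to-neutral RMS voltage
F = 60.0       # line frequency, Hz

def leg_a(t: float) -> float:
    """Instantaneous voltage of the first 120 V leg relative to neutral."""
    return V_RMS * math.sqrt(2) * math.sin(2 * math.pi * F * t)

def leg_b(t: float) -> float:
    """Second leg: 180 degrees out of phase with the first."""
    return -leg_a(t)

t = 1 / 240  # quarter cycle, where the sine peaks
print(leg_a(t))           # ~169.7 V peak, leg to neutral (120 V RMS)
print(leg_a(t) - leg_b(t))  # ~339.4 V peak, leg to leg (240 V RMS)
```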
There is some confusion about North American voltages. It is 120 V, not 110 V. This was increased starting in the 1950s. The historic reason for 110 V was Thomas Edison’s DC power systems, which probably used 110 V because that was the optimal voltage for his light bulbs. These systems converted to AC, but the voltage wasn’t changed, so existing lighting didn’t need to be replaced.
North Americans could get a better system than the Europeans, with no infrastructure changes except inside buildings. Since houses already receive 240 V, wall outlets could be supplied with it too, offering the lower-current, higher-power advantage of the European system.
In terms of safety, it is current (amperes) that kills, not electrical potential (volts). Even so, 240 V is regarded as more dangerous than a 120 V system. To compensate, Europeans use high-quality insulation and wiring methods, including a Ground Fault Circuit Interrupter (GFCI) or Residual Current Device (RCD) in the breaker box to cut the supply instantaneously if any significant difference is detected between the currents flowing in the live (hot) and neutral wires. This saves lives.
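The detection logic itself is simple: in a healthy circuit every milliamp going out on the live wire comes back on the neutral, so any imbalance must be leaking to earth, possibly through a person. A minimal sketch of the trip decision (the 30 mA threshold is a common rating for personal protection, used here as an illustrative default):

```python
def rcd_should_trip(live_current_a: float, neutral_current_a: float,
                    threshold_a: float = 0.03) -> bool:
    """Trip when the residual current (live minus neutral) exceeds the threshold.

    30 mA (0.03 A) is a typical rated residual operating current for
    personal protection; real devices trip within tens of milliseconds.
    """
    return abs(live_current_a - neutral_current_a) > threshold_a

print(rcd_should_trip(10.00, 10.00))  # False: all current returns via neutral
print(rcd_should_trip(10.05, 10.00))  # True: 50 mA leaking to earth
```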