Origin of Electrical Power

The era of large-scale electric power distribution arguably began on August 26, 1895, when water flowing over Niagara Falls was diverted through a pair of high-speed turbines that were coupled to two 5,000-horsepower generators. The bulk of the electricity was produced at about 2,200 volts and used locally for the manufacture of aluminum and carborundum. But the following year a portion was raised to 11,000 volts and transmitted twenty miles by wire to the city of Buffalo, where it was used for lighting and street cars. This remarkable achievement was made possible by a series of inventions and discoveries made during the preceding two centuries involving the generation and transportation of electric energy. Some were conceptual, some were technical, and some involved developing a technology to the point where it was economically practical.

In the case of producing electricity, practice came first; the concepts followed. Perhaps this is not surprising, given that the only natural source was lightning--impressive enough to capture the imagination but unavailable for close observation or experimentation. By the beginning of the eighteenth century, however, there was agreement among a group of experimenters that rubbing various materials--most commonly glass--produced a condition that they called electric; and associated with this condition were sparks and properties of attraction and repulsion. The Englishman Stephen Gray demonstrated that this condition could be conducted over wet threads or metal wires. In the 1740s the German invention of a machine that rotated glass spheres so that they could be more easily rubbed made it much easier to generate the electric condition. And the discovery, in Leyden, of a means of apparently accumulating electricity in a jar made it possible to store electricity for future use and even to carry it from one place to another.
Armed with this conceptual base and with improved machines, a wide range of experimenters tested the medical, biological, and chemical effects of the electric fluid. They found, or thought they found, that it could cure a variety of diseases, decompose water into two gases, and cause muscles to twitch. This last discovery, made by the Italian Luigi Galvani in the early 1790s, was significant mainly because it caused his countryman, Alessandro Volta, to propose an alternative concept (involving electric differences between pairs of metals), which led him to invent the electric pile (so named because it consisted of a pile of pairs of metal discs separated by moistened cloth). Volta made the details of his pile, or battery, known in a publication in 1800.
For our purposes, though, of particular importance was the connection that would be made between electricity and magnetism. And here concept would be the initiating factor. Magnetic rocks (lodestones) occur naturally and had been known for centuries; since the seventeenth century they had even been used for the production of compass needles. But there was no significant theory regarding their means of operation, and certainly none that linked magnetism to electricity. However, Hans Christian Oersted, a Dane, believed, as did many of his contemporaries, that there should be a certain unity to nature's forces. On this basis he embarked on a series of experiments to find a connection between electricity and magnetism. In 1820 he discovered that an electric current in a wire produced a magnetic effect. Almost immediately, in 1821, Michael Faraday, in London, showed how this relationship could be used to produce motion (a primitive electric motor); and William Sturgeon, another Englishman, showed in 1825 that winding a wire around an iron bar concentrated the magnetic effect. Joseph Henry, an American, went on to use many windings of insulated wire to build electromagnets capable of lifting a thousand pounds and more. Finally, Faraday, driven by the same notions of unity that had inspired Oersted, discovered in 1831 that an electric current was produced in a wire moving near a magnet--the principle of the generator.
The first practical consequence of all this activity was the electromagnetic telegraph, developed in the late 1830s by Samuel Morse and Alfred Vail in America and by Charles Wheatstone and William Cooke in England. They used batteries to produce electric current, wires to conduct it over substantial distances, and electromagnets to produce an effect at one end when a switch was closed at the other. By using magnets with many windings they were able to operate with low electric currents. This was important because the long conducting wire had an appreciable resistance to the flow of the current, resulting in lost energy. They understood, with some help from Henry and from Georg Simon Ohm, that in a long electric line it was better to have relatively high voltage and low current. They used batteries because the generating effect discovered by Faraday was still very inefficient. Even the best generators up through the middle of the century were impractical except for very specialized applications. Typically the effect was achieved by rotating coils of wire (the armature) across the faces of permanent magnets (the field). In 1866, Wheatstone and Werner Siemens independently constructed generators with electromagnets in the field, letting some portion of the generated current be fed back into these field magnets. They correctly reasoned that there would be enough residual magnetism in these magnets so that the generators would function even from a dead start and quickly build up to full capacity.
Large-scale electricity distribution also provided a new incentive to develop a less powerful lamp--an incandescent lamp--that could be used indoors. Thomas Edison achieved this goal in 1879, and in September 1882 he exploited his invention by establishing a central generating station at Pearl Street in lower Manhattan. By the mid-1880s towns across America were vying to be first in their area to be electrified, primarily for lighting. (Pictured is Edison's paper-filament incandescent lamp used at Menlo Park ca. 1879. Smithsonian negative #13,369.)

But all of these were relatively small stations, delivering power over short distances. The reason was clear. There was general agreement that safe operation in the home meant voltages of no more than about 100 volts. But transmission at this voltage (or, for a three-wire DC system, at 200 volts, arranged as +100 and -100 around a neutral wire) was efficient only for a half mile or so. Conceptually there was an easy answer: transmit at high voltage and convert it at the receiving end to low voltage. One could use a high-voltage motor to drive a low-voltage generator, but such a solution was expensive. For alternating current a much simpler mechanism--called a transformer--was available. But no one had invented a practical AC motor, and it made little sense to create a system that was good only for lighting.

In the United States, in 1888, Nikola Tesla made the breakthrough. His best motor designs called for two- or three-phase operation. Existing AC systems were single-phase, with voltage and current undergoing regular reversals. In a multiple-phase generator, which Tesla also designed, two or more currents were produced at the same time, with their phases offset from one another. It was with Tesla's patents that George Westinghouse won the contract to construct the generators at Niagara Falls. By the end of the century Westinghouse had supplied ten two-phase generators, operating at a frequency of 25 hertz, thus completing the first Niagara power station. General Electric would build the next eleven units, completing a second station by 1904. Although changes would be made in the number of phases and frequency, and certainly in the power of individual units, Niagara demonstrated clearly that large-scale generation and transmission of electricity was conceptually sound, technically feasible, and economically practical. It set the stage for developments for the century to come. (Pictured is a Tesla Motor, 1888. SI negative #79-9471-4.)
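Why Tesla's motors needed two or more phases can be seen numerically. Two currents offset by a quarter cycle, fed to coils set at right angles, produce a net magnetic field of constant strength whose direction rotates steadily--something a single pulsing phase cannot do. The following idealized sketch (not a model of the actual Niagara machines) checks this at the 25 hertz used there:

```python
import math

def field_vector(t, freq_hz=25.0):
    """Net field of two perpendicular coils driven 90 degrees out of phase.

    Coil A contributes cos(wt) along x; coil B contributes sin(wt) along y.
    """
    w = 2 * math.pi * freq_hz
    return math.cos(w * t), math.sin(w * t)

# The magnitude stays constant while the direction sweeps around.
# At 25 Hz the period is 40 ms, so 5 ms steps advance the field 45 degrees:
for t in (0.0, 0.005, 0.010, 0.015):
    x, y = field_vector(t)
    print(round(math.hypot(x, y), 6), round(math.degrees(math.atan2(y, x)), 1))
```

The constant-magnitude rotating field is what drags the rotor of an induction motor around; a single-phase field merely reverses in place, which is why the single-phase systems of the 1880s could light lamps but not turn practical motors.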