Energy Crises of the 1970s

The good times for the utility industry would not last forever. In the 1970s, a combination of macroeconomic events outside the realm of the industry and the apparent end of progress with traditional technology within it challenged the grow-and-build strategy.

[Cartoon: "Energy Crisis 1973." PEANUTS © United Feature Syndicate, Inc.; reprinted with permission of UFS.]

 

Hints of Energy Problems

Most people who think back to the energy problems of the 1970s point to the Arab oil embargo of 1973 as the event that triggered economic and energy malaise for the United States. While the embargo certainly began what became known as the "energy crisis," it was not the first event that suggested problems with the American energy system.

 

Growth Spurts and Problems Meeting Demand

Hints of problems appeared in the late 1960s, when demand for electricity in some parts of the country exceeded the traditional 7-to-8% annual growth rate and some utilities could not build capacity fast enough. (Virginia Electric Power Company, for example, saw demand spurt 14% per year in the late 1960s as a result of rapid population and industrial growth in its service territory.) Although the companies ordered equipment from manufacturers, a large backlog of orders for huge turbine-generators from other companies, combined with delays during plant construction, meant that some firms could not meet demand. During the blistering hot summer days of 1967 through 1969, some utilities on the East Coast reduced voltage--causing "brownouts"--as one way to deal with increased demand, while also asking customers to reduce power consumption.

 

Fuel Shortages before 1973

Compounding the problem, utilities sometimes could not buy enough fuel to heat the boilers that powered their turbine-generators. Although coal was the most abundant fuel in the United States, it fell into short supply because mine operators, convinced that utilities would soon adopt nuclear power plants fueled by small amounts of uranium, had opened few new mines in the 1960s. Shortages of natural gas also occurred in the early 1970s, forcing some factories in the industrial Midwest to shut down. At the same time, electric utilities increased their dependence on oil, because it burned more cleanly than coal and because the public and Congress had become concerned about the environmental effects of producing electricity. The Clean Air Act of 1970 put further environmental pressure on utilities, and they increasingly bought oil from the Middle East, whose crude contained less pollution-causing sulfur. Unfortunately for utilities, as American oil production peaked and then declined, the Middle Eastern countries, which had created the Organization of Petroleum Exporting Countries (OPEC) in 1960, began flexing their muscles and demanded increased royalty payments from oil companies, thereby sending prices up. The overall energy situation became so precarious that, in 1971, President Nixon called on Congress to pass legislation enhancing production of energy resources while also encouraging conservation.

 

The Energy Crunch Begins

But the country's energy problems truly rose to a high state of public consciousness after the OPEC countries embargoed oil shipments in October 1973 to nations that supported Israel during the Arab-Israeli War. Begun on 6 October, when Egyptian and Syrian forces attacked Israel on Yom Kippur, the holiest day of the year in the Jewish state, the war ended with Israel occupying territory formerly held by Egypt and Syria. To punish western supporters of Israel, Arab members of OPEC stopped shipping oil to them. When they lifted the embargo in early 1974, they raised the price of oil dramatically, from under $2 per barrel before the October war to more than $12.

The United States depended on OPEC oil for under 15% of the country's petroleum needs, but that oil proved critical for transportation and industry. The price hikes rippled throughout the energy industries and caused alternative fuels, such as coal, to rise in price as well. As one manifestation of the energy crisis, Americans waited impatiently in lines at gas stations to pay higher prices for rationed petroleum products. As another, the higher energy prices drove up the cost of other goods, spurring the inflation rate to about 10% by early 1975. Investors in the stock market had little to cheer about either. As consumers spent more money on energy and less on other goods, business activity slowed, and the Dow Jones Industrial Average dropped 45% from its 1973 high to its 1974 low. President Nixon urged the country to conserve resources and develop new sources of energy so that it would become energy self-sufficient by 1980. Incapacitated politically by the Watergate scandal, however, the President could not lobby Congress effectively to pass emergency legislation in 1973 or 1974.

 

Technological Problems

In previous decades, the utility industry had successfully mitigated the higher costs of construction, fuel, and other necessities by exploiting improved thermal efficiencies and scale economies of power plants. But by the time of the energy crisis, the traditional sources of technological progress appeared to have been tapped out.

 

Limit to Improvements in Thermal Efficiency

From Edison's day through the 1960s, utilities enjoyed the ability to obtain more kilowatt-hours from each unit of fossil fuel. In the 1950s, manufacturers drove up steam temperatures and pressures and could achieve, with the best units, about 40% thermal efficiency. But even at this level, problems occurred that made utility managers wary of pushing further. The hot, super-pressurized steam pressing against the turbine's parts induced metallurgical fatigue that decreased reliability and increased maintenance costs, even with the specialized (and expensive) alloys in use. As a result, utility economists realized that lower-efficiency plants might be more economical overall: though they would not eke the last kilowatt-hour out of a barrel of oil, they would be cheaper to build and less costly to maintain. While making economic sense, the decision to stop pushing thermal efficiencies higher also meant that utilities could no longer expect cost declines from this source of technological advance.
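To make the efficiency figures concrete, the industry's standard "heat rate" measure relates fuel input to electrical output. The arithmetic below is an illustrative sketch, using only the universal conversion factor of 3,412 Btu per kWh rather than data from any particular plant:

```latex
% Heat rate: fuel energy required per kilowatt-hour generated,
% where 1 kWh of electricity is equivalent to 3,412 Btu.
\[
\text{heat rate} = \frac{3{,}412\ \text{Btu/kWh}}{\eta}
\]
% At the roughly 40% efficiency of the best units:
\[
\frac{3{,}412}{0.40} \approx 8{,}530\ \text{Btu per kWh}
\]
% Halving the efficiency doubles the fuel burned per kWh, which is
% why each incremental efficiency gain had translated so directly
% into lower operating costs.
```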

 

Economies of Scale Barriers

The other source of cost reductions came from increasing the capacity of turbine-generators. Though utilities ordered power units in ever-larger sizes, they found in the late 1960s and early 1970s that economies of scale were no longer automatic. Metallurgical problems appeared in the huge blades attached to the massive turbine shafts as units reached and exceeded the 1,000 MW mark. Moreover, as the big units became more complex, they seemed to suffer more failures than their smaller counterparts, partly because of faulty design strategies at the manufacturing companies. Some economists posited that the optimum capacity of power units remained in the 500-to-900 MW range, and utilities after the early 1970s consequently stopped ordering gigantic units. (The slower growth of customer usage after the energy crisis also suggested that utilities did not need as many huge units.)

 

Woes with Nuclear Plants

Of course, some people pointed to nuclear power plants as the answer to the energy crisis. After all, the plants heated water in a nuclear boiler (with the rest of the plant essentially similar to fossil-fuel plants), but they used a fuel--uranium--that OPEC did not control and that produced a great deal of heat from small amounts of material. Unfortunately for nuclear's proponents, this technology had problems as well, as its opponents made increasingly clear to the public. First of all, nuclear power plants emitted more thermal waste than fossil plants, and that thermal pollution became subject to environmental protection laws. Complying with the laws raised the cost of the plants as manufacturers developed means to cool the output before it re-entered rivers and oceans. Perhaps more importantly, the redundant safety systems of the plants, designed to prevent leaks of dangerous radioactive materials and by-products, were shown to be inadequate in some circumstances. When challenged by the Union of Concerned Scientists in 1972, for example, the Atomic Energy Commission admitted that a vital safety feature of nuclear plants--the emergency core-cooling system--may not have been as reliable as previously believed. Accidents at the Tennessee Valley Authority's Browns Ferry nuclear plant in 1975 and the near-meltdown of the Three Mile Island nuclear unit in 1979 further convinced many people that the risks of nuclear power might not be worth the supposed benefits of inexhaustible supplies of nuclear electricity.

At the same time, these plants became increasingly complex and difficult to construct, raising their cost dramatically. Even if the fuel costs of nuclear plants were near zero, the capital costs were huge--sometimes as high as five times the average construction cost of more conventional plants in the 1970s and 1980s. That feature alone suggested to many people--nuclear opponents and utility executives alike--that nuclear power would not pull the country out of the energy crisis by itself.

Net Effects

 

Higher Prices

As energy prices increased and as technological progress in generating electricity failed to offset them, customers and others quickly became disenchanted with the utility system. Having paid only 2.2 cents per kWh in 1969 (in current dollars), the average residential customer in 1977 paid almost double that rate--just over 4 cents. As rates increased, customers made efforts to reduce consumption. During the crisis period between 1973 and 1974, electricity usage actually dropped 0.1%. The next year it grew 1.9%, and it rose for the next several years by about 2.5% annually, compared with the 7-to-8% rate before the crisis.
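The drop from 7-to-8% to 2.5% annual growth mattered more than the small numbers suggest, as a back-of-the-envelope doubling-time calculation (illustrative arithmetic only, using the standard rule of 70) shows:

```latex
% Rule of 70: a quantity growing at r percent per year doubles in
% roughly 70/r years.
\[
T_{\text{doubling}} \approx \frac{70}{r}
\]
% Pre-crisis growth of about 7% per year versus post-crisis 2.5%:
\[
\frac{70}{7} = 10\ \text{years} \qquad \text{vs.} \qquad
\frac{70}{2.5} = 28\ \text{years}
\]
% Demand that had doubled every decade -- the premise of the
% grow-and-build strategy -- now took nearly three decades to double.
```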

 

Customer Unrest

As can be imagined, customers did not enjoy watching their rates and bills increase, and they complained to state regulators. Unfortunately, regulators found themselves in an unusual position, ill-prepared to deal with the new stresses resulting from the energy crisis. On one hand, they had an obligation to ensure that the public obtained a reliable supply of electricity at "fair" prices. On the other hand, they needed to protect the financial integrity of power companies so that power would always be available to customers. To accomplish the latter task, commissions usually acceded to utility requests for higher rates to pay for the greater costs of constructing power plants and of the fuel that fed them.

 

Regulation Unable to Cope

At the same time, regulators generally had few tools and skills for resolving such a difficult situation quickly. During the golden years of the utility business, regulators had little to do but approve rate decreases, so commission work did not attract the best and brightest individuals. In many states where governors had the prerogative, they appointed commission members in return for political favors; rarely was a commissioner appointed for a recognized ability to understand issues of public utility finance or management. Moreover, with limited funding from state legislatures, commissions could not hire the expert, inquisitive staff members who might develop innovative solutions to difficult problems. As a result, regulation--even after the energy crisis hit--proved fairly unimaginative and acquiescent to the demands of utility companies, often to the chagrin of customers who paid higher rates.

 

An Unforgiving Economy Troubles the Utility Industry

The higher fuel prices passed on to utility customers were just one source of ratepayer complaints. Utility managers, for their part, disliked watching their business--which they honestly believed provided an essential product for social and economic progress--be criticized by customers, politicians, and anyone else who needed an easy target. But the executives also fretted because other costs had begun to skyrocket as well, and these costs too would ultimately be reflected in higher rates.

 

Interest Rates Soar

One such cost was the growing interest payments on bonds that utilities floated to pay for construction of new plants. As inflation rose, partly because of the energy crisis, so did the cost of borrowing money. Through the 1950s, utilities could borrow money at annual rates of around 5%. As inflation soared in the 1970s and early 1980s, the companies were forced to pay 15% or more, and that was if they enjoyed good credit ratings. At such rates, a bondholder received the equivalent of his or her invested capital back in interest payments in under seven years. While building the delayed and troubled Shoreham nuclear power plant, the Long Island Lighting Company spent $1.5 million a day on interest payments in 1984, which contributed to the firm's brush with bankruptcy.
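Two quick calculations, both illustrative sketches rather than figures from the historical record, show the scale of these interest burdens:

```latex
% Simple payback: the years needed for cumulative coupon payments
% to equal the invested principal is t = 1/r (ignoring compounding).
\[
t_{5\%} = \frac{1}{0.05} = 20\ \text{years}, \qquad
t_{15\%} = \frac{1}{0.15} \approx 6.7\ \text{years}
\]
% Annualizing Shoreham's cited burn rate of $1.5 million per day:
\[
\$1.5\ \text{million/day} \times 365\ \text{days} \approx
\$548\ \text{million per year in interest alone}
\]
```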

 

Escalating Construction Costs

Overall, the cost of building large power plants became a headache for utility companies. Having lost the benefit of economies of scale, new plants carried costs that reflected the staggering price increases of an inflationary economy. Once operating, the plants would not benefit from increased thermal efficiencies; hence, rates would continue to rise along with fuel costs. Because of their increased complexity, the large plants also took longer to construct. Coal-burning plants ordered in the late 1960s and 1970s took an average of eight years to complete; nuclear plants, because of changing safety and environmental regulations--especially after the near-meltdown of a Three Mile Island unit in 1979--took about 12 years to finish. Throughout those years, interest payments to bondholders had to be maintained. Since most states did not allow utilities to recover the cost of plants until they were up and running, the power companies had to pay out huge sums of money years before earning a cent of return on their investments.
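A rough compound-interest sketch, combining the borrowing rates and construction times cited above (and simplifying away the details of actual utility financing, such as allowance-for-funds accounting), suggests how long construction periods multiplied capital costs:

```latex
% A dollar borrowed at annual rate r and carried for n years of
% construction, before any revenue, compounds to (1 + r)^n dollars.
\[
(1 + 0.15)^{8} \approx 3.1 \qquad\text{(8-year coal plant)}
\]
\[
(1 + 0.15)^{12} \approx 5.4 \qquad\text{(12-year nuclear plant)}
\]
% At 15% money, every construction dollar roughly tripled for a coal
% plant and more than quintupled for a nuclear plant before the
% first kilowatt-hour earned any return.
```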

 

Stock Market Reflects Utilities' Woes

The stock market accurately reflected the poor financial condition of utilities. Utility stocks peaked in 1965, at the height of the "Golden Years," reaching levels not seen since before the Great Depression began in 1929. A decade later, many investors had discarded utility stocks, seeing only dim prospects ahead for a beleaguered industry.