In the mid-1950s, Lewis Strauss, chairman of the U.S. Atomic Energy Commission, predicted that electricity produced by nuclear power would soon be "too cheap to meter." There seemed to be good grounds for Strauss's optimism. In the ten years since nuclear energy was used to end World War II, scientists had been developing ways to harness the power of the nucleus with great success. From nuclear-powered submarines to radioactive battery cells, progress proceeded at such a pace that the vast technological leaps promising to make our lives easier and healthier seemed right at hand.

This promise, though, has never come to fruition. Far from being the savior of our energy needs, nuclear power has never accounted for much more than 20% of total electricity use in the U.S., which is less than 10% of total energy needs. Nuclear energy has also proved more costly than originally thought, both in the construction of plants and disposal facilities and in environmental damage. The erosion of public opinion on nuclear power that began in the early 1970s was accelerated by the accidents at Three Mile Island in the late 1970s and Chernobyl in the mid-1980s. For the last 20 years, the nuclear power industry in the U.S. has been in stasis: no new plants have been proposed, and several that were proposed have been converted to other forms of energy.

The situation in the U.S. is not universal, however. Other countries, most notably France, rely heavily on nuclear power for their energy needs. They argue that nuclear energy's advantages (no greenhouse gas emissions, reliable fuel supply, etc.) outweigh its drawbacks (waste disposal, the possibility of radiation leaks, etc.). Recent concerns over global warming and our country's reliance on imported oil have led some in the U.S. to give nuclear energy a fresh look.

This module will look at the science behind nuclear energy and the methods used to generate electricity from it.
