As mentioned above, nuclear energy is released either when an atom is split (fission) or when atoms are joined (fusion). While fusion is still being developed on an industrial scale, we will start by looking at how nuclear fission works…
Nuclear fission occurs when the nucleus of an atom is split into two or more smaller nuclei. For example, an atom of uranium-235 splits into nuclei of barium and krypton along with two or three neutrons. These additional neutrons hit other uranium-235 atoms, causing them to also split and generate more neutrons in a chain reaction that also releases energy in the form of heat and radiation. The heat from this release of energy can be converted into electricity at a nuclear power plant, usually by heating water to produce steam that then spins turbines, activating a generator and creating low carbon electricity. This process can be broken down into eight steps, as follows:
1. The Reaction
Sealed metal cylinders containing uranium are placed inside the steel reactor vessel, where neutrons are fired at uranium atoms. These atoms split and release more neutrons that, in turn, hit other atoms, creating a chain reaction that generates heat.
2. Water is Heated
Water flows through the reactor vessel, where the chain reaction heats it to around 300°C. To prevent the water from boiling away, a pressuriser keeps it at around 155 times atmospheric pressure.
3. The Hot Water Circulates
The heated, pressurised water is then circulated from the reactor vessel to a steam generator.
4. Steam Generation
The hot, pressurised water flows through a series of looped pipes while a second stream of water passes around the outside of the pipes. This second stream is under much lower pressure, so it boils into steam.
5. Steam Becomes Electrical Energy
The steam that has been generated in step four is then passed through a series of turbines, causing them to spin and creating mechanical energy. The turbines are connected by a shaft to a generator that uses an electromagnetic field to turn the mechanical energy into electrical energy.
6. Electrical Energy goes to Grid
The electrical energy is stepped up to a high voltage by a transformer so it can be fed into the grid system.
7. Electricity Sent to Homes
The high voltage electricity is transmitted through power lines to homes, businesses and services where it is transformed back to a more usable level.
8. Generator Steam is Cooled and Recycled
Meanwhile, back at the power station, the steam that powered the turbines is passed over pipes of cold water pumped in from the sea. This cools the steam and condenses it back to water so it can be reused to turn the turbines and generate electricity again.
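The scale of the reaction in step one can be illustrated with a rough calculation. The sketch below assumes textbook figures that are not from this article: roughly 200 MeV released per uranium-235 fission and a typical steam-cycle efficiency of around 33%.

```python
# Back-of-envelope sketch of a fission plant's energy chain.
# Assumed figures (typical textbook values, not plant-specific):
#   ~200 MeV released per U-235 fission
#   ~33% thermal-to-electric efficiency for the steam cycle

MEV_TO_JOULES = 1.602e-13      # 1 MeV expressed in joules
ENERGY_PER_FISSION_MEV = 200   # approximate energy per U-235 fission
THERMAL_EFFICIENCY = 0.33      # typical steam-cycle efficiency

def fissions_per_second(electric_output_watts):
    """Fissions per second needed to sustain a given electrical output."""
    thermal_watts = electric_output_watts / THERMAL_EFFICIENCY
    return thermal_watts / (ENERGY_PER_FISSION_MEV * MEV_TO_JOULES)

# A 1 GW(e) plant needs on the order of 1e20 fissions every second.
print(f"{fissions_per_second(1e9):.2e} fissions per second")
```

Under these assumptions, a single large reactor sustains roughly a hundred billion billion fissions every second, which gives a sense of why even a small mass of fuel produces so much heat.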
Nuclear fusion has the potential to revolutionise energy production, promising no greenhouse gas emissions, no long-lived radioactive waste and a virtually limitless source of energy. Where nuclear fission splits atoms to generate energy, nuclear fusion combines atomic nuclei to release energy. This is the same process that occurs in the Sun’s core, but creating the right conditions for fusion has posed challenges for decades, including how to overcome the natural repulsion between atomic nuclei and how to release more energy than the process consumes.
Ventures in China, Europe, Australia, the United States and the UK have been working to solve the technical challenges of nuclear fusion. This includes the ITER Project, where a tokamak reactor uses a gas, usually the hydrogen isotope deuterium, which can be extracted from seawater. The deuterium is subjected to intense heat and pressure, stripping electrons from the atoms and creating a plasma. Plasma is a superheated, ionised gas that can reach temperatures of 100,000,000°C or more and has to be contained by strong magnetic fields. These temperatures are close to ten times those found at the Sun’s core, and are required because the Sun’s immense gravitational pressure cannot be recreated on Earth. Alternative reactor designs being tested use lasers to heat and compress hydrogen fuel to trigger fusion.
Fusion power plants operate in a similar manner to fission plants, in that heat created by the atomic reaction is used to produce steam and drive turbines to generate electricity. However, nuclear fusion for regular energy generation seems to be some years away yet.
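The energy released by a single fusion reaction can be worked out from the mass lost when the nuclei combine. The sketch below assumes the deuterium-tritium reaction planned for tokamaks such as ITER, using standard atomic mass values; none of these numbers come from the article itself.

```python
# Mass-defect sketch for the deuterium-tritium fusion reaction
#   D + T -> He-4 + n
# using standard atomic mass values in atomic mass units (u).
U_TO_MEV = 931.494  # energy equivalent of one atomic mass unit, in MeV

m_deuterium = 2.014102
m_tritium   = 3.016049
m_helium4   = 4.002602
m_neutron   = 1.008665

# The products weigh slightly less than the reactants...
mass_defect = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)

# ...and the "lost" mass reappears as ~17.6 MeV of kinetic energy.
energy_mev = mass_defect * U_TO_MEV
print(f"{energy_mev:.1f} MeV released per D-T fusion")
```

That 17.6 MeV per reaction, multiplied across the enormous number of reactions in a plasma, is the heat a fusion plant would use to raise steam.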
Nuclear energy offers a number of advantages, particularly related to climate change:
- A low carbon energy source with a small carbon footprint
- Produces electricity around the clock, without relying on environmental factors such as wind to generate power
- Cost-effective to run
There are also a number of disadvantages associated with nuclear energy, including:
- Nuclear plants are expensive to build
- Danger of potential accidents and the associated security threat
- Produces radioactive waste that needs to be stored without harming the environment
- Can release pollutants into the environment
Nuclear energy has uses beyond providing low-carbon electricity. Nuclear powers space exploration, provides fresh water through desalination, sterilises medical equipment and supplies radioisotopes for treating cancer. For example, the isotope cobalt-60 can be produced by commercial nuclear plants and used for cancer treatment, medical imaging and sterilising medical equipment. Nuclear radiation is used to treat food, killing illness-causing bacteria, insects and parasites. Radiation is also used in small amounts in smoke detectors, photocopiers and other consumer products. In addition, nuclear power drives submarines and aircraft carriers as well as space exploration craft like the Cassini-Huygens probe and the Mars rover, Perseverance.
Decades of early work into understanding radioactivity and nuclear physics led to the discovery of nuclear fission in 1938. This led, in 1939, to the discovery that nuclear fission can create a self-sustaining chain reaction of further nucleus fissions. The outbreak of the Second World War in 1939 led scientists to turn their attention to fission research for the development of nuclear weapons rather than for energy production.
In the United States, this nuclear weapons research – known as the Manhattan Project – led to the creation of the first nuclear reactor, the Chicago Pile-1, which achieved criticality on 2 December 1942. From here, larger single-purpose production reactors were built to produce weapons-grade plutonium, with the first nuclear weapon test, the Trinity Test, taking place in July 1945, followed by the bombings of Hiroshima and Nagasaki one month later.
While the first nuclear devices were military in nature, there was a strong feeling during the 1940s and 50s that nuclear power could be harnessed to provide cheap energy. This belief came to fruition on 20 December, 1951, when the EBR-1 experimental station near Arco, Idaho became the first nuclear reactor to produce electricity. In 1953, U.S. President Dwight Eisenhower spoke on developing “peaceful” applications for nuclear energy in his “Atoms for Peace” speech. The U.S. Atomic Energy Act in 1954 followed this speech, allowing for the rapid declassification of U.S. reactor technology, encouraging development by the private sector.
However, the first organisation to develop practical nuclear energy use was the U.S. Navy with the S1W reactor, which was used to create energy for submarines and aircraft carriers. This reactor was a Pressurised Water Reactor (PWR), designed to be simple, compact and easy to operate, making it suitable for use in submarines. As a result, the PWR became the chosen method for power production for the civilian market. The first nuclear-powered submarine, the USS Nautilus, put to sea in January 1954 and became the first vessel to reach the North Pole in 1958.
Meanwhile, on 27 June 1954, the USSR’s Obninsk Nuclear Power Plant became the first in the world to generate electricity for a power grid, with the production of 5 megawatts of electric power. Around two years later, on 27 August 1956, Calder Hall at Windscale in England was connected to the national power grid, becoming the world’s first commercial nuclear power station. However, Calder Hall didn’t just produce electricity, as it was also used to produce plutonium-239 for the British nuclear weapons programme.
It was around this time that the first nuclear accidents occurred, including a fire at Windscale and the Kyshtym disaster in the Soviet Union in 1957. An uncontrolled chain reaction at the U.S. Army’s experimental SL-1 reactor at the Idaho National Laboratory caused a steam explosion in 1961, killing three people and causing a meltdown. In 1968, two liquid-metal-cooled reactors on the Soviet submarine K-27 suffered a fuel element failure, causing gaseous fission products to leak, killing nine crew and injuring 83 more.
These accidents fed into the opposition to nuclear power that began in the United States during the early 1960s. By the late 1960s, members of the scientific community had added their own concerns around potential nuclear accidents, nuclear proliferation and terrorism, and the disposal of nuclear waste. Anti-nuclear activism grew through the 1970s, and the growing public hostility led to lengthier licensing processes, regulations and safety requirements that all increased the cost of new construction. Projects began to be cancelled and, although it caused no fatalities, the 1979 Three Mile Island accident only served to further slow the global appetite for new plant construction.
This trend did not carry to France and Japan, which began to invest in nuclear following the 1973 oil crisis, and by the 1980s the nuclear industry was beginning to see a renaissance. However, the Chernobyl disaster of 1986 in the USSR created another tragic turning point in the history of nuclear power. Considered the worst nuclear disaster in history, Chernobyl led to 56 direct deaths and clean-up costs in the billions. Because of this disaster, nuclear safety and regulation were improved and the World Association of Nuclear Operators (WANO) was created to promote safety awareness and the professional development of nuclear operators. The disaster also saw a reduction in the number of plant constructions over the following years, and Italians voted against nuclear power in a 1987 referendum, phasing it out in Italy by 1990.
The early 2000s saw nuclear energy grow once more as concerns grew over carbon dioxide emissions and a new generation of reactors began construction. However, a 2011 tsunami triggered by the Tōhoku earthquake caused three core meltdowns after the emergency cooling system failed due to a lack of electricity supply at the Fukushima Daiichi Nuclear Power Plant. Again, the safety of nuclear power reactors was questioned and a number of countries began to review or even close their nuclear power programmes.
Revised criteria for operations and a series of safety checks in the following years, coupled with an awareness of the importance of low-carbon power generation in mitigating climate change and the retirement of old assets, saw a new generation of power stations come online from 2015. The largest growth is expected in the United States and Asia, with China forecast to become the world's largest generator of nuclear electricity.
What are three types of nuclear energy?
Nuclear energy can be produced through nuclear fusion or nuclear fission. Fusion involves the joining of atomic nuclei, while fission is the splitting of atoms to produce energy. Fission can be either spontaneous or induced, giving three types of nuclear energy: fusion, spontaneous fission and induced fission.
What is the most common nuclear energy?
Fission is the most common method of producing nuclear energy, with uranium being the most common nuclear fuel. Uranium is an abundant metal that is mined and processed into enriched uranium, in which the proportion of the fissile isotope U-235 is increased so that its atoms can be readily split in a reactor. Uranium is around 100 times more common than silver, but U-235 is much rarer, making up just over 0.7% of natural uranium.
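The rarity of U-235 can be made concrete with a quick calculation. The sketch below assumes the standard natural abundance figure of about 0.72%, consistent with the "just over 0.7%" quoted above; the enrichment range in the comment is a typical industry figure, not taken from this article.

```python
# Sketch of how little fissile material natural uranium contains.
U235_NATURAL_FRACTION = 0.0072   # ~0.72% of natural uranium is U-235

def u235_kg(total_uranium_kg, fraction=U235_NATURAL_FRACTION):
    """Mass of U-235 contained in a given mass of uranium."""
    return total_uranium_kg * fraction

# A tonne of natural uranium holds only ~7 kg of fissile U-235;
# reactor fuel is typically enriched to around 3-5% U-235.
print(f"{u235_kg(1000):.1f} kg of U-235 per tonne of natural uranium")
```

This is why enrichment is a central step in the fuel cycle: the reactor-ready fuel has to concentrate a fraction that nature supplies only sparsely.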
Can nuclear energy replace fossil fuels?
Nuclear power is already generating nearly one-third of the world’s carbon free electricity and is an important part of a wider energy mix that could replace the use of fossil fuels and meet climate change goals. This energy mix would include a range of energy sources, including renewables like wind power, solar and geothermal.
Can nuclear energy be stored?
Nuclear energy, like other energy sources, can be stored on the grid using a variety of techniques, including:
- Lithium-ion batteries: Similar to the battery in your smartphone, except much larger, these batteries store electricity as chemical energy that can be discharged back into the system as required.
- Pumped hydroelectric storage: This common form of energy storage involves the pumping of water uphill to a reservoir. This water can then be released as required, turning turbines and generating electricity.
- Hydrogen storage: Hydrogen can be produced by splitting water molecules and then burned to generate electricity again or fed into fuel cells that produce electricity and water.
Can nuclear energy be used for space travel?
Nuclear energy has been used for space travel, including powering the Cassini probe on its journey of well over a billion miles to study Saturn. Other equipment, such as the Mars rover Perseverance, uses nuclear energy as a source of power.
Can nuclear energy save the planet?
Nuclear energy can play a part in saving the planet as part of a lower carbon energy mix, including wind, solar, hydro and geothermal power. A single uranium pellet, which is the size of a peanut, can produce as much energy as 800kg of coal. With low running costs, nuclear is a reliable source of power that can help save the planet.
Can nuclear energy solve climate change?
Nuclear energy is unlikely to solve climate change on its own, but it can play a role as part of a wider strategy. Already generating nearly one-third of the world’s carbon free energy, nuclear is an important part of the green energy mix that could help solve climate change.
Can nuclear energy be used for transportation?
Nuclear energy can be used for transportation, either through energy production or more directly, for example, with spacecraft and submarines.
Will nuclear energy run out?
Although mined uranium is expected to run out in around two hundred years, nuclear energy will not run out as uranium can be extracted from seawater, offering an almost inexhaustible supply.
Uranium oxide is soluble in water so, as rain hits the ground, it dissolves small quantities of uranium that then flow down rivers and into the sea. The sun then evaporates the water away, concentrating the uranium left behind. This process has been occurring naturally for the last 4 billion years, leading to a current concentration of uranium in seawater of 3.3 parts per billion.
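A parts-per-billion concentration sounds negligible, but the oceans are vast. The sketch below combines the 3.3 ppb figure above with an assumed total ocean mass of roughly 1.4 × 10²¹ kg, a standard order-of-magnitude estimate that does not come from this article.

```python
# Scale check on the seawater uranium figure above.
# Assumed ocean mass: ~1.4e21 kg (standard order-of-magnitude estimate).
OCEAN_MASS_KG = 1.4e21
URANIUM_PPB = 3.3          # parts per billion by mass, as quoted above

uranium_kg = OCEAN_MASS_KG * URANIUM_PPB * 1e-9
uranium_billion_tonnes = uranium_kg / 1e12   # 1 billion tonnes = 1e12 kg

print(f"~{uranium_billion_tonnes:.1f} billion tonnes of uranium "
      f"dissolved in the oceans")
```

Even at 3.3 ppb, the oceans hold on the order of four to five billion tonnes of uranium, far more than all known land-based reserves.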
This uranium can be extracted from seawater, although the process currently costs around four times the market price of mined uranium. Experts believe it should be possible to absorb this increase in cost, as the current price of uranium is low. There is also the option of using thorium for nuclear energy production, which is around 4.5 times as plentiful in the Earth’s crust as uranium.
Is nuclear renewable?
Although nuclear is a clean energy source it is not, strictly speaking, renewable. This is because mined uranium is a finite fuel source. However, if extracted from seawater, as discussed above, uranium will be close to renewable as the resource is replenished by the flow of dissolved uranium into seawater.
Is nuclear energy safe?
The safety of nuclear power plants has been highlighted by high-profile accidents such as the Chernobyl and Fukushima disasters, leading many to wonder whether they are a safe option for energy production. There are three distinct characteristics that affect the safety of these plants: the radioactive materials in the reactor could prove hazardous if leaked to the environment; the highly radioactive fission products continue to decay and release heat, which can lead to overheating and radioactive leakage; and a criticality accident can occur if the nuclear fission chain reaction cannot be controlled.
However, modern reactors have been designed to prevent an uncontrolled increase in reactor power through the use of a negative void coefficient of reactivity. This means that the fission rate will decrease as the temperature or amount of steam in the reactor increases. In addition, the nuclear chain reaction can be stopped manually through the insertion of control rods into the reactor core. Emergency core cooling systems (ECCS) work to remove decay heat should the normal cooling systems fail, while physical barriers including the containment building itself also serve to limit the release of radioactive materials into the environment in the case of an accident.
Statistically speaking, nuclear power has a death rate of 0.07 per TWh of energy produced, fewer deaths per unit of energy than result from accidents and air pollution associated with coal, petroleum, natural gas and hydropower. In fact, nuclear power is estimated to have prevented 1.8 million deaths between 1971 and 2009 by reducing the amount of energy generated from fossil fuels.
Despite this, there is still concern over nuclear accidents, which have been shown to have social and psychological effects as people are evacuated from areas where accidents have occurred. One study found that poor mental health was the largest public health impact of the Chernobyl disaster, and the American scientist Frank N. von Hippel has stated that a disproportionate fear of ionising radiation (radiophobia) could have long-term psychological effects on the population of areas contaminated by the Fukushima disaster.
Where was nuclear energy discovered?
Nuclear energy was discovered in Berlin, Germany, by radiochemists Otto Hahn and Fritz Strassman as they worked in their laboratory in December 1938.
Who discovered nuclear energy?
As mentioned above, Otto Hahn and Fritz Strassman are widely credited with discovering nuclear energy in 1938. However, this discovery built upon centuries of scientific thought and innovation.
Ancient Greek philosophers came up with the idea that matter is composed of invisible particles, which are called atoms from the Greek word ‘atomos,’ meaning indivisible. Scientists working in the 18th and 19th centuries developed this idea and, by 1900, physicists had worked out that atoms contain large amounts of energy. Writing in 1904, British physicist Ernest Rutherford (called the father of nuclear science), noted, “If it were ever possible to control at will the rate of disintegration of the radio elements, an enormous amount of energy could be obtained from a small amount of matter.” One year later, Albert Einstein developed his theory for the relationship between mass and energy, with the formula E=mc2, or “energy equals mass times the speed of light squared.”
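Einstein's formula makes Rutherford's intuition quantitative: a tiny mass corresponds to an enormous energy. The sketch below works the equation for a single gram of matter; the 1 GW plant comparison in the comment is an outside illustration, not a figure from this article.

```python
# A worked example of E = mc^2: the energy equivalent of one gram of mass.
C = 2.998e8   # speed of light in metres per second

mass_kg = 0.001                 # one gram
energy_joules = mass_kg * C**2  # Einstein's mass-energy relation

# Convert to gigawatt-hours: 1 GWh = 3.6e12 J.
# ~25 GWh is roughly a day's output from a 1 GW(e) power plant.
energy_gwh = energy_joules / 3.6e12
print(f"{energy_joules:.2e} J = {energy_gwh:.0f} GWh")
```

In practice, fission converts only a small fraction of the fuel's mass to energy, but even that fraction dwarfs the chemical energy released by burning fossil fuels.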
In 1934, Enrico Fermi’s experiments in Rome showed that neutrons could split many kinds of atoms. However, when he bombarded uranium with neutrons, he got elements that were much lighter than expected.
In December 1938, Otto Hahn and Fritz Strassman fired neutrons from a source containing radium and beryllium into uranium (atomic number 92) at their laboratory in Berlin. They were surprised to find lighter elements, such as barium (atomic number 56), in the leftover materials, which had only about half the atomic mass of uranium.
Hahn and Strassman contacted their Austrian colleague Lise Meitner, who was living in Sweden after fleeing Nazi Germany. Working with Niels Bohr and her nephew Otto R. Frisch, Meitner found that the combined atomic masses of the fission products did not match the mass of uranium. She determined, through Einstein’s theory, that the lost mass had been converted to energy, proving that fission had occurred. Bohr travelled to the United States in 1939, where he met Einstein and shared the discovery, as well as discussing the possibility of sustaining a chain reaction with Fermi at a theoretical physics conference in Washington, D.C.
In 1942, Fermi and his colleague Leo Szilard devised a design for a uranium chain reactor, with uranium placed in a stack of graphite blocks to create a cube-like lattice of fissionable material.
Following discussions at the University of Chicago, Fermi and a group of scientists were ready to test the theory with the construction of the world’s first nuclear reactor, Chicago Pile-1, in November 1942. Erected on the floor of a squash court under the University’s athletic stadium, the reactor contained uranium, graphite and cadmium rods, which helped control the reaction by absorbing neutrons. The cadmium rods were slowly withdrawn from the pile during a demonstration on 2 December 1942, allowing the reaction to speed up until, at 3:25pm Chicago time, the nuclear reaction became self-sustaining. This marked the start of the nuclear age. However, nuclear power generation would remain a secondary concern compared to atomic research to develop a weapon for use in World War II as part of the Manhattan Project.
Nuclear energy can be produced through either fission or fusion. Nuclear reactors have used fission, or the splitting of atoms, to produce energy for decades. However, the use of fusion, the joining of atoms, is still in the experimental stage for energy production. Fusion occurs naturally in the core of the Sun, where hydrogen atoms fuse together to form helium.
While nuclear fusion is still being developed, engineers are also working on small modular reactors (SMRs), which offer a simpler and more cost-effective route to nuclear power production.
Whether through fission or fusion, nuclear power offers a clean energy source that is sure to be part of the energy mix of the future as we work to reduce emissions and tackle climate change.