Photo courtesy of highlander411 via Flickr.
What is nuclear fusion, and what could it mean for the future of energy?
Though nuclear fusion reactors are not yet a reality, an experiment by US scientists (published in the journal Nature) marked a big step forward for the science. Using lasers, the process generated more energy than the fuel absorbed, a first in fusion history.
Fusion energy, which powers our sun and other stars, was first described in 1938 and has been researched by scientists since the 1940s, according to Encyclopedia Britannica. Though American, British, and Soviet research was classified until 1958, after that point it became a collaborative effort. In the '70s, it became a "big science" fundable only through international cooperation.
According to the World Nuclear Association, “fusion power offers the prospect of an almost inexhaustible source of energy for future generations, but it also presents so far insurmountable scientific and engineering challenges.”
How does it work?
Fusion occurs when the nuclei of two forms of hydrogen are heated to the point at which the gas becomes a plasma, with ions colliding at high speeds. Two nuclei then fuse together, releasing energy.
Gravitational forces and conditions inside the sun make this process possible there, but it is difficult to replicate on Earth, the World Nuclear Association says, with required temperatures reaching 100 million degrees Celsius or more.
Instead, scientists test fusion inside vacuum chambers, controlling the superheated plasma with strong magnetic fields, a technique called "magnetic confinement."
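A rough sketch of why magnetic confinement works: a charged particle in a magnetic field spirals around the field lines with a small "gyroradius" r = mv/(qB), so a strong field keeps the hot plasma away from the chamber walls. The field strength below is an assumed tokamak-scale value, not a figure from the article.

```python
import math

K_BOLTZMANN = 1.380649e-23   # J/K
CHARGE = 1.602176634e-19     # C, charge of a deuterium ion
MASS_DEUTERON = 3.3436e-27   # kg

temperature = 100e6          # K, the ~100 million degrees cited above
field = 5.0                  # T, an assumed tokamak-like field strength

# Typical thermal speed of a deuteron at this temperature
speed = math.sqrt(2 * K_BOLTZMANN * temperature / MASS_DEUTERON)

# Radius of the particle's circular orbit around a field line
gyroradius = MASS_DEUTERON * speed / (CHARGE * field)
print(f"thermal speed ~ {speed:.2e} m/s, gyroradius ~ {gyroradius * 1000:.1f} mm")
```

Even at 100 million degrees, the ion's orbit is only millimeters across, which is what lets a magnetic field bottle up a plasma no material container could touch.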
The hydrogen isotopes deuterium (found in seawater) and tritium (produced in nuclear fission reactors) are generally used in this process; their reaction yields energy in the form of a high-speed neutron and a helium nucleus.
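A back-of-the-envelope check on the energy released by that deuterium-tritium reaction, using the mass defect and E = mc². The atomic masses are standard tabulated values in atomic mass units (u), not figures from the article.

```python
# Masses of the reactants and products, in atomic mass units
M_DEUTERIUM = 2.014102   # u
M_TRITIUM   = 3.016049   # u
M_HELIUM4   = 4.002602   # u
M_NEUTRON   = 1.008665   # u
U_TO_MEV    = 931.494    # energy equivalent of 1 u, in MeV

mass_before = M_DEUTERIUM + M_TRITIUM
mass_after  = M_HELIUM4 + M_NEUTRON

# The tiny missing mass comes out as kinetic energy of the products
energy_mev = (mass_before - mass_after) * U_TO_MEV
print(f"D + T -> He-4 + n releases about {energy_mev:.1f} MeV")
```

The result, roughly 17.6 MeV per reaction, is millions of times the energy of a typical chemical reaction, which is why such a small pellet of fuel is worth so much effort.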
What has been accomplished?
So what is there to show for 70-plus years of research? While fusion itself has been achieved, a way to commercialize it as a clean, virtually limitless energy source has always seemed decades away, even as the decades passed.
One false alarm occurred in 1989, when two electrochemists, Stanley Pons and Martin Fleischmann, announced that they had achieved "cold fusion" (nuclear fusion at room temperature), a claim for which peer review found no actual evidence.
While there have been viable tokamaks and reactors capable of producing energy, the released energy has never significantly matched or exceeded the amount absorbed, until recently.
How? Scientists in California used 192 lasers to compress a pellet of fuel (containing deuterium and tritium gas) and generate more energy than was put into the fuel, Scientific American reports.
The spherical pellet was cooled to –254.55 degrees Celsius, freezing the gas into a layer of ice. The lasers were then fired into the pellet's gold container, where they generated X-rays that imploded the pellet and created a hot spot of 50 million degrees Celsius, denser than the center of the sun, where the fusion took place.
Though the reaction did produce more energy than was deposited in the fuel, NPR notes that only about 1% of the lasers' energy even reached the fuel.
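The arithmetic behind that 1% caveat can be sketched with illustrative round numbers (the laser energy and fusion yield below are assumed for the example, not values reported in the article). "Fuel gain" compares fusion output to the energy the fuel actually absorbed; overall gain compares it to everything the lasers delivered.

```python
laser_energy_j = 1.8e6    # J, assumed scale of a large laser facility
coupling = 0.01           # ~1% of laser energy reaches the fuel, per NPR
fusion_yield_j = 20e3     # J, an assumed illustrative fusion yield

energy_into_fuel = laser_energy_j * coupling

fuel_gain = fusion_yield_j / energy_into_fuel      # > 1: the milestone
overall_gain = fusion_yield_j / laser_energy_j     # << 1: far from a power plant

print(f"fuel gain ~ {fuel_gain:.2f}, overall gain ~ {overall_gain:.3f}")
```

With numbers like these, the fuel puts out more than it takes in, yet the facility as a whole recovers only around 1% of the laser energy, which is why the result is a milestone rather than a working power source.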
What are the possibilities?
Scientists have dreamed since the time of Albert Einstein of a functioning fusion reactor, one that could run on hydrogen from seawater while emitting minimal nuclear waste and zero carbon: what some believe is the ultimate sustainable fuel. Now, they have taken a tiny, yet simultaneously enormous, step in that direction.
Though Congress granted $500 million in funding to nuclear fusion research in 2014 (an increase of $100 million), more may still be needed to sustain this small but very significant bit of momentum.