One of my favorite bar signs is the one that promises “Free beer tomorrow.” That’s how I’ve always thought of nuclear fusion—a (theoretically) cheap, pollution-free and inexhaustible energy source, the promise of which has pretty much been a decade away ever since the technology was first tested 70 years ago.
When “nuclear energy” is discussed, it’s almost always in reference to nuclear fission, which generates energy by splitting atoms—and is the source of power for nuclear weapons and all of the nuclear generators in operation today.
Nuclear fusion, on the other hand, occurs when two positively charged nuclei merge. It’s the same kind of reaction that powers our sun—sparked by the star’s massive size, heat and gravitational fields. Recreating that reaction on Earth requires heating gases to more than 100 million degrees Celsius and holding them in place with lasers or powerful magnets. That heat and compression overcome the electrostatic repulsion that would otherwise keep the positively charged nuclei apart, and they fuse together. The fusion releases energy, and if the reaction is sustained, it can create more energy than it consumes.
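The energy bookkeeping behind that last sentence comes from Einstein’s mass–energy relation: the fused nucleus weighs slightly less than the two nuclei that made it, and the missing mass leaves as energy. As a rough sketch—using the deuterium–tritium reaction most experiments pursue (the article doesn’t name a specific fuel mix, so that choice is an assumption):

```latex
% Deuterium-tritium fusion: the products weigh slightly less
% than the reactants, and the mass deficit is released as energy.
{}^{2}\mathrm{H} + {}^{3}\mathrm{H} \;\rightarrow\;
{}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + n\,(14.1\ \mathrm{MeV})

% The released energy follows from the mass deficit:
E = \Delta m \, c^{2}
  \approx 0.0188\,\mathrm{u} \times 931.5\ \mathrm{MeV/u}
  \approx 17.6\ \mathrm{MeV}
```

Per kilogram of fuel, that works out to millions of times the energy of burning fossil fuels—which is why the payoff, if it ever arrives, would be so large.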
A version of this story first appeared in the Climate Is Everything newsletter.
Electricity generated through fusion produces no carbon emissions and minimal waste, and carries no risk of an out-of-control meltdown like the one at Chernobyl. The fuel, derived from isotopes of hydrogen (and, in some designs, helium), is cheap and plentiful.
That’s the theory.
In practice, no one really knows. Forcing two nuclei to merge takes enormous amounts of heat and energy, and in the rare instances where it has worked, the energy produced has more often than not been less than the amount required to launch, and maintain, the fusion action in the first place. In order to generate power, the reaction would need to be self-perpetuating. For the scientists pursuing the atomic equivalent of …read more
Source: Time – Science