TL;DR
Fusion holds out the hope of nearly limitless energy, but progress has been held back by the tremendous challenge of sustaining fusion reactions. In today’s issue, we review some of the leading technical approaches to fusion and discuss computational strategies that could help accelerate the advent of practical fusion reactors.
A Brief Overview of the Science of Fusion
Nuclear fusion occurs when lighter nuclei combine to form a heavier nucleus. The diagram below models the combination of deuterium and tritium to form helium (along with a free neutron).
Nuclear fusion releases energy because larger nuclei are more energetically stable (see diagram below) than smaller nuclei, up to a point (roughly iron). As a result, there is a net energy gain when small nuclei combine to form larger ones. Such a combination can only occur at extremely high temperatures, since the electrostatic repulsion between the positively charged nuclei must be overcome before the short-range nuclear force can bind them together.
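As a concrete bit of arithmetic (using standard tabulated atomic masses, not values read off the diagram), the energy released by the deuterium-tritium reaction follows directly from the mass difference between reactants and products:

$$ \mathrm{D} + \mathrm{T} \;\rightarrow\; {}^{4}\mathrm{He} + n, \qquad Q = \left(m_\mathrm{D} + m_\mathrm{T} - m_{{}^{4}\mathrm{He}} - m_n\right)c^2 \approx 0.0188\,\mathrm{u} \times 931.5\,\tfrac{\mathrm{MeV}}{\mathrm{u}} \approx 17.6\ \mathrm{MeV}. $$

Most of that energy is carried off by the fast neutron, which is why capturing neutron energy is central to proposed D-T reactor designs.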
In nature, fusion powers stars. Harnessing fusion here on Earth has been a longstanding dream for scientists and governmental planners. However, maintaining the extremely high temperatures needed for sustained fusion reactions has proven extraordinarily difficult. One of the leading approaches for achieving fusion, the tokamak, confines high-temperature plasma with magnetic fields in a donut-shaped (toroidal) chamber, as the diagram below shows. The ITER project is building the world’s largest tokamak reactor (see diagram below) and aims to produce more power than it consumes once fully operational.
Fusion science has improved steadily over the last several decades, with today’s fusion reactors considerably closer to sustainable power production than reactors from a few decades ago (see diagram below). ITER is currently the leading effort to build a functional fusion reactor, but as a large multinational consortium, ITER faces complex political and operational issues that have slowed construction (source). ITER does not expect to reach full operation until 2035. The slow pace of planned development has encouraged a number of fusion startups to attempt faster, cheaper, and more innovative approaches to practical fusion (source).
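A common yardstick for this kind of progress is the Lawson triple product of plasma density, temperature, and energy confinement time. For a deuterium-tritium plasma, the commonly quoted ignition threshold is roughly

$$ n \, T \, \tau_E \;\gtrsim\; 3 \times 10^{21}\ \mathrm{keV \cdot s \cdot m^{-3}}, $$

and successive generations of tokamaks have climbed steadily toward this threshold.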
Computational Plasma Models
Plasma modeling can be done at several levels of theory that capture the underlying structure of the plasma with increasing fidelity. The diagrams below (sourced from this excellent blog post) illustrate these levels. The simplest approach models the plasma as an electrically charged ideal gas. Magnetohydrodynamics (MHD) adds bulk fluid interactions, and two-fluid models treat ions and electrons as separate fluids.
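For concreteness, here is a standard textbook form of the ideal (single-fluid) MHD equations, with resistivity and boundary conditions omitted; the two-fluid level replaces the single momentum equation with separate equations for ions and electrons:

$$
\begin{aligned}
\frac{\partial \rho}{\partial t} + \nabla \cdot (\rho \mathbf{v}) &= 0 &&\text{(mass conservation)}\\
\rho\left(\frac{\partial \mathbf{v}}{\partial t} + \mathbf{v}\cdot\nabla\mathbf{v}\right) &= \mathbf{J}\times\mathbf{B} - \nabla p &&\text{(momentum balance)}\\
\frac{\partial \mathbf{B}}{\partial t} &= \nabla\times(\mathbf{v}\times\mathbf{B}) &&\text{(ideal induction)}\\
\mu_0 \mathbf{J} &= \nabla\times\mathbf{B} &&\text{(Ampère's law)}
\end{aligned}
$$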
The three approaches above have the strong benefit of simplicity, but struggle to capture more sophisticated interactions. The Particle In Cell (PIC) method breaks the plasma up into a lattice of small cells and models it through “super particles”, each of which represents some number of actual particles (see diagram below, and the sketch that follows). PIC methods are considerably more computationally expensive than the earlier approaches since they resolve much finer-grained interactions. Blob In Cell (BIC) models take a somewhat more sophisticated approach by modeling each cell through an empirical charge distribution that is iteratively updated.
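To make the PIC idea concrete, here is a minimal sketch of a 1D electrostatic PIC loop in Python. Everything here (the normalized units, grid size, particle count, and initial sinusoidal perturbation) is an illustrative assumption rather than a setting from any production fusion code; real PIC codes are three-dimensional, electromagnetic, and massively parallel.

```python
# Minimal 1D electrostatic Particle-In-Cell sketch (normalized units:
# electron charge/mass = -1/1, fixed ion background density = 1).
import numpy as np

L = 2 * np.pi          # periodic domain length
num_cells = 64
num_particles = 10_000
dx = L / num_cells
dt = 0.05

rng = np.random.default_rng(0)
x = rng.uniform(0, L, num_particles)      # super-particle positions
v = 0.1 * np.sin(x)                       # small velocity perturbation
weight = L / num_particles                # charge carried by each super particle

def deposit_charge(x):
    """Scatter particle charge onto the grid with linear (cloud-in-cell) weighting."""
    rho_e = np.zeros(num_cells)
    cell = (x / dx).astype(int) % num_cells
    frac = x / dx - np.floor(x / dx)
    np.add.at(rho_e, cell, (1 - frac) * weight / dx)
    np.add.at(rho_e, (cell + 1) % num_cells, frac * weight / dx)
    return 1.0 - rho_e  # net charge: ion background minus electron density

def solve_field(rho):
    """Solve d^2(phi)/dx^2 = -rho with FFTs, then E = -d(phi)/dx."""
    k = np.fft.fftfreq(num_cells, d=dx) * 2 * np.pi
    rho_hat = np.fft.fft(rho)
    phi_hat = np.zeros_like(rho_hat)
    phi_hat[1:] = rho_hat[1:] / k[1:] ** 2
    return np.real(np.fft.ifft(-1j * k * phi_hat))

def gather_field(E, x):
    """Interpolate the grid electric field back to particle positions."""
    cell = (x / dx).astype(int) % num_cells
    frac = x / dx - np.floor(x / dx)
    return (1 - frac) * E[cell] + frac * E[(cell + 1) % num_cells]

# Time stepping: deposit, solve, gather, push.
for step in range(200):
    rho = deposit_charge(x)
    E = solve_field(rho)
    v += -1.0 * gather_field(E, x) * dt   # electron charge-to-mass ratio = -1
    x = (x + v * dt) % L

print("final field energy:", 0.5 * np.sum(E ** 2) * dx)
```

The essential loop (deposit charge to a grid, solve for the field, interpolate the field back to the particles, push the particles) is the same in production codes; the cost comes from doing this for billions of super particles in three dimensions.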
Discussion
One of the most surprising things I learned writing today’s post is that most computational models of fusion reactors operate purely within the realm of classical physics. In actuality, fusion is a profoundly quantum phenomenon: at reactor temperatures, nuclei fuse by tunneling through the Coulomb barrier that classical physics says should keep them apart. However, computational modeling of nuclear forces is still in its infancy; lattice quantum chromodynamics (QCD) has only very recently been used to study the nuclear interactions that occur in fusion reactions (source). The inability of modern simulation techniques to capture the true physics contributes to the challenge of building fusion reactors. New techniques based on scientific machine learning and quantum computers could play a role in accelerating fusion modeling (source).
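One way to see the quantum character of fusion: the probability for two nuclei with charges $Z_1$ and $Z_2$ and relative velocity $v$ to tunnel through their mutual Coulomb barrier is governed, to leading order, by the Gamow factor,

$$ P \;\sim\; e^{-2\pi\eta}, \qquad \eta = \frac{Z_1 Z_2 e^2}{4\pi\varepsilon_0\,\hbar v}, $$

which is part of why fusion rates depend so steeply on temperature even though classical physics would essentially forbid the reaction at these energies.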
Highlights for the Week
https://spectrum.ieee.org/energywise/energy/batteries-storage/evs-to-drive-a-lithium-supply-crunch: Analysts are warning of a coming supply crunch for lithium. American planners should make sure that lithium supply chains aren’t monopolized by PRC (People’s Republic of China) companies.
The rate of PRC military buildup is alarming, as the excellent Twitter thread linked below highlights. The US could very conceivably lose a conflict with the PRC given the PRC’s rapidly growing naval capabilities. While global military escalation is dangerous, refusing to acknowledge the reality of the PRC’s dangerous arms buildup could leave the US painfully exposed.
Subscription, Feedback and Comments
If you liked this post, please consider subscribing! We have weekly subscriber-only posts on Tuesdays.
Please feel free to email me directly (bharath@deepforestsci.com) with your feedback and comments!
About
Deep Into the Forest is a newsletter by Deep Forest Sciences, Inc. We’re a deep tech R&D company specializing in the use of AI for deep tech development. We do technical consulting and joint development partnerships with deep tech firms. Get in touch with us at partnerships@deepforestsci.com! We’re always open to new ideas!
Credits
Author: Bharath Ramsundar, Ph.D.
Editor: Sandya Subramanian
Great article! There was a workshop at last year's NeurIPS, Machine Learning and Physical Sciences, where I reviewed a submitted paper related to fusion modeling: "Training neural networks under physical constraints using a stochastic augmented Lagrangian approach." It was an interesting read, as I didn't know much about the computational modeling techniques in this domain. It looks like the work was eventually published by Argonne Labs in various places, including https://arxiv.org/pdf/2009.07330.pdf