New AI models of plasma heating lead to important corrections in computer code used for fusion research

Equilibrium configurations used for (a) NSTX and (b) WEST databases, corresponding to shots 138 506 and 56 898, respectively. On the left, the magnetic equilibrium is shown with highlighted LCFS (red) and plasma facing surface wall (black). On the right, the toroidal magnetic field amplitude Bt (colorbar) is shown, as well as the relevant resonance layers (white) in (a) NSTX (i.e. deuterium harmonic resonances, n > 1) and (b) WEST (i.e. fundamental resonance for hydrogen). Credit: Nuclear Fusion (2024). DOI: 10.1088/1741-4326/ad645d

New artificial intelligence (AI) models for plasma heating can do more than was previously thought possible, not only increasing the prediction speed 10 million times while preserving accuracy, but also correctly predicting plasma heating in cases where the original numerical code failed. The models will be presented on October 11 at the 66th Annual Meeting of the American Physical Society Division of Plasma Physics in Atlanta.

"With our intelligence, we can train the AI to go even beyond the limitations of available numerical models," said Álvaro Sánchez-Villar, an associate research physicist at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL). Sánchez-Villar is the lead author on a new journal article in Nuclear Fusion about the work. It was part of a project that spanned five research institutions.

The models use machine learning, a type of AI, to predict how electrons and ions in a plasma behave when ion cyclotron range of frequency (ICRF) heating is applied in fusion experiments. The models are trained on data generated by a computer code. While much of the data agreed with past results, in some extreme scenarios the data was not what the researchers expected.
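The core idea, training a fast surrogate on input-output pairs generated by a slow numerical code, can be sketched in a few lines. This is a purely illustrative toy (a polynomial fit standing in for the machine learning model, and a simple analytic function standing in for the ICRF code); the paper's actual models and physics are far more involved.

```python
import numpy as np

# Stand-in for the expensive numerical code: maps a single plasma
# parameter to a heating value (illustrative only, not ICRF physics).
def numerical_code(x):
    return np.exp(-x) * np.sin(4 * x)

# Generate training data by running the "code", then fit a cheap surrogate.
x_train = np.linspace(0, 2, 200)
y_train = numerical_code(x_train)
coeffs = np.polyfit(x_train, y_train, deg=12)  # surrogate = polynomial fit
surrogate = np.poly1d(coeffs)

# The surrogate reproduces the code's output at an unseen input,
# but evaluating it is just a polynomial, i.e. vastly cheaper.
x_test = 1.3
print(abs(surrogate(x_test) - numerical_code(x_test)) < 1e-3)  # True
```

Once trained, every call to the surrogate replaces a full run of the numerical code, which is where the orders-of-magnitude speedup in the article comes from.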

"We observed a parametric regime in which the heating profiles featured erratic spikes in rather arbitrary locations," said Sánchez-Villar. "There was nothing physical to explain those spikes."

Because these scenarios were unphysical, Sánchez-Villar identified and removed the problematic data, known as outliers, from the training set. "We biased our model by eliminating the spikes in the training dataset, and we were still able to predict the physics," Sánchez-Villar said.
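Curating a training set this way amounts to flagging profiles with unphysical spikes and dropping them before training. Below is a minimal sketch of one such filter; the spike criterion (point-to-point jumps far larger than the profile's typical variation) is an assumption for illustration, not the paper's actual procedure.

```python
import numpy as np

def has_spike(profile, threshold=8.0):
    """Flag a heating profile whose largest point-to-point jump is an
    extreme outlier relative to its typical jump size.
    (Illustrative criterion only; the published curation method
    may differ.)"""
    jumps = np.abs(np.diff(profile))
    scale = np.median(jumps) + 1e-12  # robust scale; avoids divide-by-zero
    return np.max(jumps) / scale > threshold

# Keep only spike-free profiles for training.
profiles = [np.sin(np.linspace(0, 3, 50)),             # smooth -> keep
            np.r_[np.ones(25), 50.0, np.ones(24)]]     # spiked -> drop
clean = [p for p in profiles if not has_spike(p)]
print(len(clean))  # 1
```

The key point from the article is that a model trained only on the clean profiles still generalized correctly, even to the cases that had been removed.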

Heating profiles for deuterium are shown for (d) minor, (e) major and (f) critical outlier cases. In black, the original numerical code is shown with outlier features (spikes). In red, the predictions of the AI model are shown. In green, the predictions of the corrected code are shown, which were anticipated by the AI models, even predicting the higher heating in the highlighted region. Credit: Álvaro Sánchez-Villar / PPPL

"As can be observed, the code correctly removes the spikes but anticipates higher heating in the highlighted region. However, there was nothing that would guarantee these predictions were physical," said Sánchez-Villar.

Then, the team went a step further. After months of research, Sánchez-Villar identified the cause, a limitation of the numerical model, resolved it, and ran the corrected version of the code on the outlier cases that had originally shown the random spikes.

Not only were the new solutions free of spikes in all problematic cases but, to his surprise, they were almost identical to the solutions one of the machine learning models had predicted months earlier, even in the critical outlier scenarios.

"This means that, practically, our surrogate implementation was equivalent to fixing the original code, just based on a careful curation of the data," said Sánchez-Villar. "As with every technology, with an intelligent use, AI can help us solve problems not only faster, but better than before, and overcome our own human constraints."

As expected, the models also improved the computation times for ICRF heating. Those times fell from roughly 60 seconds to 2 microseconds, enabling faster simulations without notably impacting the accuracy. This improvement will help scientists and engineers explore the best ways to make fusion a practical power source.

More information: Á. Sánchez-Villar et al, Real-time capable modeling of ICRF heating on NSTX and WEST via machine learning approaches, Nuclear Fusion (2024). DOI: 10.1088/1741-4326/ad645d

Provided by Princeton Plasma Physics Laboratory