For fusion to become viable within the next two decades, researchers must solve some of the major issues affecting reactor design and operation, chief among them engineering reactor walls that can tolerate high heat while maintaining good core performance. This requires dissipating the heat and particles flowing toward the wall without degrading the performance of the core.
Physicists refer to this challenge as core-edge integration. If unmitigated, heat and particle flows at the edge of the plasma are so intense that material surfaces would melt.
A team led by Zinkle Fellow and Assistant Professor Livia Casali recently developed a pathway to improved core-edge integration by injecting impurities at the edge of the plasma while simultaneously optimizing the divertor, a device that removes excess heat and particles from the plasma’s edge. Once injected, the impurities convert the heat flux into electromagnetic radiation, allowing the heat to dissipate before it contacts the walls.
The new findings were recently published in Nuclear Fusion in her article called “Impurity leakage and radiative cooling in the first nitrogen and neon seeding study in the closed DIII-D SAS configuration.”
While nitrogen is an excellent divertor radiator in present devices, it may pose a problem for operation of ITER, the fusion experiment under construction in France, which led Casali’s research team to study neon as a replacement seeding gas in power exhaust applications. Neon is chemically inert and radiates at temperatures relevant for ITER and future reactors.
Casali and the research team compared nitrogen and neon seeding in the small angle slot (SAS) divertor at the DIII-D National Fusion Facility, operated by General Atomics for the Department of Energy. The experiments showed that plasma cooling and impurity leakage depend strongly on divertor geometry and the type of impurity injected at the edge. These insights were made possible by state-of-the-art diagnostics as well as edge simulation tools that, for the first time at DIII-D, included impurities and a full treatment of certain plasma flows known as drifts. The research indicates that the feasibility of fusion reactors can be improved by combining an appropriate radiative impurity species, an optimized divertor geometry, and tailored drifts for particle control.
This experiment marked the first impurity seeding studies performed in the SAS graphite divertor at DIII-D. The work demonstrates the impurity leakage mechanism in a closed divertor structure and its consequent impacts, and it also shows the different roles carbon plays in the nitrogen- versus neon-seeded cases, both in the experiments and in the numerical modeling.