Publications

A semi-supervised learning method to produce explainable radioisotope proportion estimates for NaI-based synthetic and measured gamma spectra

Van Omen, Alan J.; Morrow, Tyler M.

Quantifying the radioactive sources present in gamma spectra is an ever-present and growing national security mission, and a time-consuming process for human analysts. While machine learning models exist that are trained to estimate radioisotope proportions in gamma spectra, few address the eventual need to provide explanatory outputs beyond the estimation task. In this work, we develop two machine learning models for NaI detector measurements: one to perform the estimation task, and the other to characterize the first model's ability to provide reasonable estimates. To ensure the first model exhibits behavior that can be characterized by the second model, the first model is trained using a custom, semi-supervised loss function that constrains proportion estimates to be explainable in terms of a spectral reconstruction. The second, auxiliary model is an out-of-distribution detection function (a type of meta-model) that leverages the proportion estimates of the first model to identify when a spectrum is sufficiently distinct from the training domain and thus out-of-scope for the model. In demonstrating the efficacy of this approach, we encourage the use of meta-models to better explain ML outputs used in radiation detection and to increase trust.
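
The spectral-reconstruction constraint described above can be illustrated with a short sketch. The function below is a hypothetical stand-in for the paper's custom semi-supervised loss: it combines a supervised term on known proportions (when labels are available) with an unsupervised term that rebuilds the spectrum from a proportion-weighted mixture of source templates and penalizes the mismatch. The function name, the template representation, and the weighting factor beta are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def semi_supervised_loss(spectrum, est_props, templates, true_props=None, beta=1.0):
    """Hypothetical loss sketch: proportion estimates must also explain the
    spectrum via a reconstruction, mirroring the constraint described above.

    spectrum:   measured or synthetic gamma spectrum, shape (n_channels,)
    est_props:  estimated source proportions, shape (n_sources,), sums to 1
    templates:  unit-normalized source templates, shape (n_sources, n_channels)
    true_props: known proportions for labeled samples, or None if unlabeled
    beta:       assumed weight on the reconstruction term
    """
    # Unsupervised term: reconstruct the (normalized) spectrum as a
    # proportion-weighted mixture of templates and penalize the mismatch.
    reconstruction = est_props @ templates
    recon_loss = np.mean((spectrum / spectrum.sum() - reconstruction) ** 2)

    # Supervised term: applied only when ground-truth proportions exist,
    # which is what makes the objective semi-supervised.
    sup_loss = 0.0 if true_props is None else np.mean((est_props - true_props) ** 2)

    return sup_loss + beta * recon_loss
```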

Controlling radioisotope proportions when randomly sampling from Dirichlet distributions in PyRIID

Van Omen, Alan J.; Morrow, Tyler M.

As machine learning models for radioisotope quantification become more powerful, the need for high-quality synthetic training data grows as well. For problem spaces that involve estimating the relative isotopic proportions of various sources in gamma spectra, it is necessary to generate training data that accurately represents the variance of proportions encountered. In this report, we aim to provide guidance on how to target a desired variance of proportions that are randomly sampled when using the PyRIID Seed Mixer, which samples from a Dirichlet distribution. We provide a method for properly parameterizing the Dirichlet distribution in order to maintain a constant variance across an arbitrary number of dimensions, where each dimension represents a distinct source template being mixed. We demonstrate that our method successfully parameterizes the Dirichlet distribution to target a specific variance of proportions, provided that several conditions are met. This provides a principled technique for controlling how random mixture proportions are generated, which are then used downstream in the synthesis process to produce the final, noisy gamma spectra.
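
As a concrete illustration of the kind of parameterization discussed above, the sketch below uses the standard variance formula for a symmetric Dirichlet distribution, Var(p_i) = (k - 1) / (k^2 (k*alpha + 1)), to solve for the concentration parameter alpha that yields a chosen per-component variance across k mixed templates. It is a sketch of the general idea using NumPy's generic Dirichlet sampler rather than the PyRIID Seed Mixer itself, and the helper name alpha_for_target_variance is an assumption.

```python
import numpy as np

def alpha_for_target_variance(k, target_var):
    """Concentration for a symmetric Dirichlet Dir(alpha, ..., alpha) over k
    sources such that Var(p_i) == target_var for every component.

    Each marginal has mean 1/k and Var(p_i) = (k - 1) / (k**2 * (k*alpha + 1)),
    so a valid (positive) alpha requires 0 < target_var < (k - 1) / k**2 --
    one example of the kind of condition that must be met.
    """
    max_var = (k - 1) / k**2
    if not 0 < target_var < max_var:
        raise ValueError(f"target_var must be in (0, {max_var:.4g}) for k={k}")
    return (max_var / target_var - 1) / k

# Example: mixture proportions for 5 source templates with per-component
# variance 0.01; the same call targets the same variance for any k.
k, target_var = 5, 0.01
alpha = alpha_for_target_variance(k, target_var)          # -> 3.0
rng = np.random.default_rng(42)
proportions = rng.dirichlet(np.full(k, alpha), size=10_000)
print(proportions.var(axis=0))  # each component's variance is close to 0.01
```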

Machine learning predictions of transition probabilities in atomic spectra

Atoms

Michalenko, Joshua J.; Clemenson, Michael D.; Murzyn, Christopher M.; Wermer, Lydia; Zollweg, Joshua D.; Van Omen, Alan J.

Forward modeling of optical spectra with absolute radiometric intensities requires knowledge of the individual transition probabilities for every transition in the spectrum. In many cases, these transition probabilities, or Einstein A-coefficients, quickly become practically impossible to obtain through either theoretical or experimental methods. Complicated electronic orbitals with higher-order effects will reduce the accuracy of theoretical models. Experimental measurements can be prohibitively expensive and are rarely comprehensive due to physical constraints and the sheer volume of required measurements. Due to these limitations, spectral predictions for many element transitions are not attainable. In this work, we investigate the efficacy of using machine learning models, specifically fully connected neural networks (FCNNs), to predict Einstein A-coefficients using data from the NIST Atomic Spectra Database. For simple elements where closed-form quantum calculations are possible, the data-driven modeling workflow performs well but can still have lower precision than theoretical calculations. For more complicated nuclei, deep learning emerged as comparable to theoretical predictions, such as Hartree–Fock. Unlike experiment or theory, the deep learning approach scales favorably with the number of transitions in a spectrum, especially if the transition probabilities are distributed across a wide range of values. It is also capable of being trained on both theoretical and experimental values simultaneously. In addition, model performance improves when training on multiple elements prior to testing. The scalability of the machine learning approach makes it a potentially promising technique for estimating transition probabilities in previously inaccessible regions of the spectral and thermal domains on a significantly reduced timeline.
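
To make the modeling workflow concrete, the sketch below shows a minimal fully connected regression network of the general kind described, assuming numeric transition features (e.g., level energies and quantum numbers) as inputs and log-scaled A-coefficients as targets to handle their wide dynamic range. The architecture, feature count, and training details are illustrative assumptions written in PyTorch, not the paper's configuration.

```python
import torch
from torch import nn

class EinsteinANet(nn.Module):
    """Hypothetical FCNN regressing log10 Einstein A-coefficients from
    numeric transition features (assumed feature set and layer sizes)."""

    def __init__(self, n_features: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # predict log10(A) to span many decades
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

model = EinsteinANet(n_features=8)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One illustrative training step on random stand-in data; real inputs would be
# transition features parsed from the NIST Atomic Spectra Database.
features = torch.randn(64, 8)
log_a_true = torch.randn(64)
optimizer.zero_grad()
loss = loss_fn(model(features), log_a_true)
loss.backward()
optimizer.step()
```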
