Publications

Results 1–25 of 69


Tunnel ionization within a one-dimensional, undriven plasma sheath

AIP Advances

Hooper, Russell H.; Patel, Nishant B.; Pacheco, Jose L.

In high-density, high-temperature plasmas, the sheath that develops can sustain extremely high electric fields, on the order of tens to hundreds of V/nm. Under the right conditions, these fields become strong enough to increase the probability of electron tunneling ionization, producing one or more electron-ion pairs. Tunneling ionization can in turn modify the development of the plasma sheath, as well as properties such as the ion and electron densities and the plasma potential. In this work, the tunnel ionization process for hydrogen atoms is demonstrated as implemented in Aleph, a Sandia National Laboratories particle-in-cell code. Results are presented for the application of the tunnel ionization process to a one-dimensional, undriven plasma sheath. Additional results are discussed for cases that consider warm ions and neutrals, the inclusion of electron-neutral collisions, and the injection of neutral particles, as well as the application to various plasma devices.
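For context, the quasi-static tunneling ionization rate of ground-state hydrogen has a well-known closed form, w(E) = 4 ω_a (E_a/E) exp(−2E_a/3E) (Landau & Lifshitz), and particle-in-cell codes commonly convert such a rate into a per-timestep Monte Carlo acceptance test. The Python sketch below illustrates that general scheme; it is an illustration under stated assumptions, not the actual Aleph implementation:

```python
import math
import random

E_A = 5.142e11      # atomic unit of electric field, V/m
OMEGA_A = 4.134e16  # atomic unit of frequency, 1/s

def tunnel_rate_hydrogen(E):
    """Quasi-static tunneling ionization rate (1/s) of ground-state
    hydrogen in a static field E (V/m), per the Landau-Lifshitz form."""
    if E <= 0.0:
        return 0.0
    r = E_A / E
    return 4.0 * OMEGA_A * r * math.exp(-2.0 * r / 3.0)

def ionizes(E, dt, rng=random):
    """Monte Carlo acceptance test for one particle over one timestep:
    ionize with probability 1 - exp(-w * dt)."""
    p = 1.0 - math.exp(-tunnel_rate_hydrogen(E) * dt)
    return rng.random() < p
```

At tens of V/nm the rate is modest, but it grows extremely steeply with field strength, which is why sheath fields in the stated range can make tunneling ionization relevant.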

A software environment for effective reliability management for pulsed power design

Reliability Engineering and System Safety

Robinson, Allen C.; Swan, Matthew S.; Smith, Thomas M.; Bennett, Nichelle L.; Drake, Richard R.; Hooper, Russell H.; Laity, George R.

The reliable design of magnetically insulated transmission lines (MITLs) for very high-current pulsed power machines will require a variety of sophisticated modeling tools. The models are complex, and the number of sub-models and approximations is large. The potential for significant analyst error with any single tool is therefore substantial, whether from reliability issues in the plasma modeling tools themselves or from the analyst's chosen approach to a given problem. We report on a software infrastructure design that provides a workable framework for building self-consistent models and for constraining feedback to limit analyst error. The framework and associated tools aid the development of physical intuition, the development of increasingly sophisticated models, and the comparison of performance results. The work lays the computational foundation for designing state-of-the-art pulsed-power experiments. The design and useful features of this environment are described. We discuss the utility of the Git source code management system and a GitLab interface for project management that extends beyond software development tasks.

Towards Predictive Plasma Science and Engineering through Revolutionary Multi-Scale Algorithms and Models (Final Report)

Laity, George R.; Robinson, Allen C.; Cuneo, M.E.; Alam, Mary K.; Beckwith, Kristian B.; Bennett, Nichelle L.; Bettencourt, Matthew T.; Bond, Stephen D.; Cochrane, Kyle C.; Criscenti, Louise C.; Cyr, Eric C.; Laros, James H.; Drake, Richard R.; Evstatiev, Evstati G.; Fierro, Andrew S.; Gardiner, Thomas A.; Goeke, Ronald S.; Hamlin, Nathaniel D.; Hooper, Russell H.; Koski, Jason K.; Lane, James M.; Larson, Steven R.; Leung, Kevin L.; McGregor, Duncan A.; Miller, Philip R.; Miller, Sean M.; Ossareh, Susan J.; Phillips, Edward G.; Simpson, Sean S.; Sirajuddin, David S.; Smith, Thomas M.; Swan, Matthew S.; Thompson, Aidan P.; Tranchida, Julien G.; Bortz-Johnson, Asa J.; Welch, Dale R.; Russell, Alex M.; Watson, Eric D.; Rose, David V.; McBride, Ryan D.

This report describes the high-level accomplishments of the Plasma Science and Engineering Grand Challenge LDRD at Sandia National Laboratories. The Laboratory needs to demonstrate predictive capabilities for modeling plasma phenomena in order to rapidly accelerate engineering development in several mission areas. The purpose of this Grand Challenge LDRD was to advance the fundamental models, methods, and algorithms, along with the supporting electrode science foundation, to enable a revolutionary shift towards predictive plasma engineering design principles. This project integrated the SNL knowledge base in computer science, plasma physics, materials science, applied mathematics, and relevant application engineering to establish new cross-laboratory collaborations on these topics. As an initial exemplar, this project focused efforts on improving the multi-scale modeling capabilities used to predict electrical power delivery on large-scale pulsed power accelerators. Specifically, this LDRD was structured into three primary research thrusts that, when integrated, enable complex simulations of these devices: (1) the exploration of multi-scale models describing the desorption of contaminants from pulsed power electrodes, (2) the development of improved algorithms and code technologies to treat the multi-physics phenomena required to predict device performance, and (3) the creation of a rigorous verification and validation infrastructure to evaluate the codes and models across a range of challenge problems. These components were integrated into initial demonstrations of the largest simulations of multi-level vacuum power flow completed to date, executed on the leading HPC machines available in the NNSA complex today. These preliminary studies indicate that relevant pulsed power engineering design simulations can now be completed in on the order of several days, a significant improvement over pre-LDRD performance.

Modeling a ring magnet in ALEGRA

Niederhaus, John H.; Pacheco, Jose L.; Wilkes, John; Hooper, Russell H.; Siefert, Christopher S.; Goeke, Ronald S.

We show here that Sandia's ALEGRA software can be used to model a permanent magnet in 2D and 3D, with accuracy matching that of the open-source software FEMM. This is done by conducting simulations and experimental measurements for a commercial-grade N42 neodymium alloy ring magnet with a measured magnetic field strength of approximately 0.4 T in its immediate vicinity. Transient simulations using ALEGRA and static simulations using FEMM are conducted. Comparisons are made between simulations and measurements, and amongst the simulations, for sample locations in the steady-state magnetic field. The comparisons show that all models capture the data to within 7%. The FEMM and ALEGRA results agree to within approximately 2%. The most accurate solutions in ALEGRA are obtained using quadrilateral or hexahedral elements. In the case where iron shielding disks are included in the magnetized space, ALEGRA simulations are considerably more expensive because of the increased magnetic diffusion time, but FEMM and ALEGRA results remain in agreement. The magnetic field data are portable to other software interfaces using the Exodus file format.
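As a quick analytic cross-check on simulations like these, the on-axis field of an axially magnetized ring magnet can be computed in closed form by superposing two uniformly magnetized cylinders (outer body minus bore). The Python sketch below uses this standard textbook formula; the ring dimensions and the remanence Br = 1.32 T are illustrative assumptions for an N42 grade, not values taken from the paper:

```python
import math

def cylinder_axis_B(z, R, L, Br):
    """On-axis field (T) of a uniformly axially magnetized cylinder of
    radius R and length L (m), at distance z (m) from one pole face,
    with remanence Br (T)."""
    return (Br / 2.0) * ((z + L) / math.sqrt((z + L)**2 + R**2)
                         - z / math.sqrt(z**2 + R**2))

def ring_axis_B(z, R_out, R_in, L, Br):
    """Ring magnet field as outer cylinder minus the (demagnetized) bore."""
    return cylinder_axis_B(z, R_out, L, Br) - cylinder_axis_B(z, R_in, L, Br)

# Hypothetical N42 ring: 25 mm OD, 12 mm ID, 10 mm thick, Br = 1.32 T.
B_near = ring_axis_B(0.005, 0.0125, 0.006, 0.010, 1.32)
```

A closed-form check like this is useful for validating numerical solvers at a handful of axis points before trusting the full 2D/3D field solution.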

A Novel use of Direct Simulation Monte-Carlo to Model Dynamics of COVID-19 Pandemic Spread

Pacheco, Jose L.; Echo, Zakari S.; Hooper, Russell H.; Finley, Melissa F.; Manginell, Ronald P.

In this report, we evaluate a novel method for modeling the spread of the COVID-19 pandemic. In this new approach we leverage methods and algorithms developed for fully kinetic plasma physics simulations using Particle-In-Cell (PIC) Direct Simulation Monte-Carlo (DSMC) models. This approach leverages Sandia-unique simulation capabilities, High-Performance Computing (HPC) resources, and expertise in modeling particle-particle interactions with stochastic processes. Our hypothesis is that this approach provides a more efficient platform, with assumptions based on physical data, that enables the user to assess the impact of mitigation strategies and forecast different phases of infection. This work addresses key scientific questions about the assumptions this new approach must make to model the interactions of people using algorithms typically applied to particle interactions in physics codes (kinetic plasma, gas dynamics). The model uses rational, physical inputs while also providing critical insight; its results could serve as inputs to, or alternatives for, existing models. The model presented was developed over a four-week time frame and thus far shows promising results, along with many ways in which this model and approach could be improved. This work aims to provide a proof of concept for this new pandemic modeling approach, which could have an immediate impact on COVID-19 pandemic modeling while laying a basis for modeling future pandemic scenarios in a timely and efficient manner. Additionally, this new approach provides new visualization tools to help epidemiologists comprehend and articulate the spread of this and other pandemics, as well as a more general tool for determining the key parameters needed to better predict pandemics in the future.
In the report we describe our pandemic model, apply it to COVID-19 data for New York City (NYC), assess model sensitivities to different inputs and parameters, and, finally, propagate the model forward under different conditions to assess the effects of mitigation and its timing. Our approach will also help in understanding the role of asymptomatic cases, and could be extended to elucidate the role of recovered individuals in a second round of infection, which current models ignore.
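The DSMC analogy can be illustrated with a toy stochastic susceptible-infected-recovered model in which randomly sampled agent pairs play the role of particle collisions. Everything below (state labels, contact counts, probabilities) is a hypothetical sketch, not the model described in the report:

```python
import random

def step(states, contacts_per_step, p_transmit, p_recover, rng):
    """One DSMC-style step: sample random pairs ("collisions"); an S-I
    pair transmits with probability p_transmit; each infected agent then
    recovers with probability p_recover."""
    n = len(states)
    for _ in range(contacts_per_step):
        i, j = rng.randrange(n), rng.randrange(n)
        if {states[i], states[j]} == {"S", "I"} and rng.random() < p_transmit:
            states[i] = states[j] = "I"
    for k in range(n):
        if states[k] == "I" and rng.random() < p_recover:
            states[k] = "R"

# Toy population: 10 initially infected among 1000 agents.
rng = random.Random(1)
states = ["I"] * 10 + ["S"] * 990
for _ in range(100):
    step(states, contacts_per_step=2000, p_transmit=0.05, p_recover=0.1, rng=rng)
```

The appeal of the pair-sampling structure is that, like DSMC collision selection, it scales with the number of sampled contacts rather than all O(n²) agent pairs, which is what makes the approach attractive on HPC resources.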

Dakota, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.12 User's Manual

Adams, Brian M.; Bohnhoff, William J.; Dalbey, Keith D.; Ebeida, Mohamed S.; Eddy, John P.; Eldred, Michael S.; Hooper, Russell H.; Hough, Patricia D.; Hu, Kenneth H.; Jakeman, John D.; Khalil, Mohammad K.; Maupin, Kathryn A.; Monschke, Jason A.; Ridgway, Elliott M.; Rushdi, Ahmad R.; Seidl, Daniel T.; Stephens, John A.; Swiler, Laura P.; Winokur, Justin W.

The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
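As an illustration of the kind of ensemble study that Dakota's sampling methods automate, the Python sketch below draws a small Latin hypercube sample and evaluates a stand-in model at each point. The `model` function and all parameter bounds are hypothetical, and this is a simplified sketch of the technique, not Dakota's implementation:

```python
import random

def latin_hypercube(n_samples, bounds, rng):
    """Simple Latin hypercube sample: each dimension is split into
    n_samples equal strata, one point per stratum, with the strata
    shuffled independently per dimension."""
    cols = []
    for lo, hi in bounds:
        strata = list(range(n_samples))
        rng.shuffle(strata)
        cols.append([lo + (hi - lo) * (s + rng.random()) / n_samples
                     for s in strata])
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

def model(x1, x2):
    # Hypothetical stand-in for an external simulation driver.
    return (x1 - 0.3)**2 + 2.0 * x2

rng = random.Random(0)
samples = latin_hypercube(20, [(0.0, 1.0), (0.0, 1.0)], rng)
responses = [model(*s) for s in samples]
```

In practice Dakota manages this loop itself, dispatching each sample to a user-supplied analysis driver and collecting the responses for statistics, surrogates, or sensitivity analysis.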

User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

Adams, Brian M.; Coleman, Kayla; Hooper, Russell H.; Khuwaileh, Bassam A.; Lewis, Allison; Smith, Ralph C.; Swiler, Laura P.; Turinsky, Paul J.; Williams, Brian W.

Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers, enabling them to enhance understanding of risk, improve products, and assess simulation credibility.
