Publications


Ultrafast Electron Microscopy for Spatial-Temporal Mapping of Charge Carriers

Ellis, Scott R.; Chandler, D.W.; Michael, Joseph R.; Nakakura, Craig Y.

This LDRD supported efforts to significantly advance the scanning ultrafast electron microscope (SUEM) for spatial-temporal mapping of charge carrier dynamics in semiconductor materials and microelectronic devices. Sandia's SUEM capability in Livermore, CA, was built and demonstrated with previous LDRD funding; however, the stability and usability of the tool limited the throughput for analyzing samples. A new laser alignment strategy improved the stability of the SUEM, and the design and characterization of a new micro-channel plate (MCP)-based detector improved the signal-to-noise of the SUEM signal detection. These enhancements to the SUEM system improved throughput by over two orders of magnitude (before, a single time series of SUEM measurements would take several days to several weeks to acquire; now, the same measurements can be completed in ~90 minutes in an automated fashion). The SUEM system can now be routinely used as an analytical instrument and will be a central part of several multi-year projects starting in FY22.


DNS/LES Study of Representative Wall-Bounded Turbulent Flows using SIERRA/Fuego

Koo, Heeseok; Hewson, John C.; Brown, Alexander B.; Knaus, Robert C.; Kurzawski, Andrew K.; Clemenson, Michael D.

This report summarizes a series of SIERRA/Fuego validation efforts of turbulent flow models on canonical wall-bounded configurations. In particular, direct numerical simulation (DNS) and large eddy simulation (LES) turbulence models are tested on a periodic channel, a periodic pipe, and an open jet, for which results are compared to velocity profiles obtained theoretically or experimentally. Velocity inlet conditions for channel and pipe flows are developed for application to practical simulations. To demonstrate this capability, LES is performed over complex terrain in the form of two natural hills and the results are compared with those of other flow solvers. The practical purpose of the report is to document the creation of inflow boundary conditions of fully developed turbulent flows for other LES calculations where the role of inflow turbulence is critical.


Spatio-temporal Estimates of Disease Transmission Parameters for COVID-19 with a Fully-Coupled, County-Level Model of the United States

Cummings, Derek; Hart, William E.; Garcia-Carreras, Bernardo; Lanning, Carl D.; Lessler, Justin; Staid, Andrea S.

Sandia National Laboratories has developed a capability to estimate parameters of epidemiological models from case reporting data to support responses to the COVID-19 pandemic. A differentiating feature of this work is the ability to simultaneously estimate county-specific disease transmission parameters in a nationwide model that considers mobility between counties. The approach is focused on estimating parameters in a stochastic SEIR model that considers mobility between model patches (i.e., counties) as well as additional infectious compartments. The inference engine developed by Sandia includes (1) reconstruction and (2) transmission parameter inference. Reconstruction involves estimating current population counts within each of the compartments in a modified SEIR model from reported case data. Reconstruction produces input for the inference formulations, and it provides initial conditions that can be used in other modeling and planning efforts. Inference involves the solution of a large-scale optimization problem to estimate the time profiles for the transmission parameters in each county. These provide quantification of changes in the transmission parameters over time (e.g., due to the impact of intervention strategies). This capability has been implemented in a Python-based software package, epi_inference, that makes extensive use of Pyomo [5] and IPOPT [10] to formulate and solve the inference formulations.
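For illustration only, a heavily simplified single-county, deterministic analogue of such an inference formulation can be written with Pyomo and solved with IPOPT. The compartment names, rates, and synthetic case counts below are generic textbook choices, not epi_inference's actual API or formulation.

```python
# Hypothetical, heavily simplified single-county analogue of the inference
# problem: fit a time-varying transmission parameter beta_t in a deterministic
# SEIR model to reported cases via least squares, solved with IPOPT.
import pyomo.environ as pyo

cases = [5, 8, 13, 20, 29, 41, 55, 70]    # illustrative reported new cases
N, sigma, gamma = 100000.0, 1/5.2, 1/4.3  # population, latent and recovery rates

m = pyo.ConcreteModel()
m.t = pyo.RangeSet(0, len(cases) - 1)
m.beta = pyo.Var(m.t, bounds=(0.0, 2.0), initialize=0.3)
m.S = pyo.Var(m.t, bounds=(0, N), initialize=N - 20)
m.E = pyo.Var(m.t, bounds=(0, N), initialize=10)
m.I = pyo.Var(m.t, bounds=(0, N), initialize=10)
m.S[0].fix(N - 20); m.E[0].fix(10); m.I[0].fix(10)  # illustrative initial state

def s_bal(m, t):  # susceptibles depleted by new transmissions
    if t == 0:
        return pyo.Constraint.Skip
    return m.S[t] == m.S[t-1] - m.beta[t-1]*m.S[t-1]*m.I[t-1]/N
m.s_bal = pyo.Constraint(m.t, rule=s_bal)

def e_bal(m, t):  # exposed gain transmissions, progress to infectious at rate sigma
    if t == 0:
        return pyo.Constraint.Skip
    return m.E[t] == m.E[t-1] + m.beta[t-1]*m.S[t-1]*m.I[t-1]/N - sigma*m.E[t-1]
m.e_bal = pyo.Constraint(m.t, rule=e_bal)

def i_bal(m, t):  # infectious gain from exposed, recover at rate gamma
    if t == 0:
        return pyo.Constraint.Skip
    return m.I[t] == m.I[t-1] + sigma*m.E[t-1] - gamma*m.I[t-1]
m.i_bal = pyo.Constraint(m.t, rule=i_bal)

# least-squares match of modeled transmissions to reported cases
m.obj = pyo.Objective(expr=sum(
    (m.beta[t]*m.S[t]*m.I[t]/N - cases[t])**2 for t in m.t))

pyo.SolverFactory('ipopt').solve(m)
print([round(pyo.value(m.beta[t]), 3) for t in m.t])
```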


AXIOM Unfold 0.7.0, Users Manual

Radtke, Gregg A.

The AXIOM-Unfold application is a computational code for performing spectral unfolds along with uncertainty quantification of the photon spectrum. While this code was principally designed for spectral unfolds on the Saturn source, it is also relevant to other radiation sources such as Pithon. This code is a component of the AXIOM project, which was undertaken to measure the time-resolved spectrum of the Saturn source; to support this, the AXIOM-Unfold code can process time-dependent dose measurements to obtain a time-resolved spectrum. This manual contains a full description of the algorithms used by the code. The code features are fully documented along with several worked examples.


A new equation of state for copper

Carpenter, John H.

A new copper equation of state is developed utilizing the available experimental data in addition to recent theoretical calculations. Semi-empirical models are fit to the data and the results are tabulated in the SNL SESAME format. Comparisons to other copper EOS tables are given, along with recommendations of which tables provide the best accuracy.


Visible emission spectra of thermographic phosphors under x-ray excitation

Measurement Science and Technology

Westphal, Eric R.; Brown, Alex D.; Quintana, Enrico C.; Kastengren, Alan L.; Son, Steven F.; Meyer, Terrence R.; Hoffmeister, Kathryn N.

Thermographic phosphors have been employed for temperature sensing in challenging environments, such as on surfaces or within solid samples exposed to dynamic heating, because of the high temporal and spatial resolution that can be achieved using this approach. Typically, UV light sources are employed to induce temperature-sensitive spectral responses from the phosphors. However, it would be beneficial to explore x-rays as an alternate excitation source to facilitate simultaneous x-ray imaging of material deformation and temperature of heated samples and to reduce UV absorption within solid samples being investigated. The phosphors BaMgAl10O17:Eu (BAM), Y2SiO5:Ce, YAG:Dy, La2O2S:Eu, ZnGa2O4:Mn, Mg3F2GeO4:Mn, Gd2O2S:Tb, and ZnO were excited in this study using incident synchrotron x-ray radiation. These materials were chosen to include conventional thermographic phosphors as well as x-ray scintillators (with crossover between these two categories). X-ray-induced thermographic behavior was explored through the measurement of visible spectral response with varying temperature. The incident x-rays were observed to excite the same electronic energy level transitions in these phosphors as UV excitation. Similar shifts in the spectral response of BAM, Y2SiO5:Ce, YAG:Dy, La2O2S:Eu, ZnGa2O4:Mn, Mg3F2GeO4:Mn, and Gd2O2S:Tb were observed when compared to their response to UV excitation found in the literature. Some phosphors were observed to thermally quench in the temperature ranges tested here, while the response from others did not rise above background noise levels. This may be attributed to the increased probability of non-radiative energy release from these phosphors due to the high energy of the incident x-rays. These results indicate that x-rays can serve as a viable excitation source for phosphor thermometry.


Fuel Fabrication and Single Stage Aqueous Process Modeling

Laros, James H.; Taconi, Anna M.; Honnold, Philip H.; Cipiti, Benjamin B.

The Material Protection, Accounting, and Control Technologies program utilizes modeling and simulation to assess Material Control and Accountability (MC&A) concerns for a variety of nuclear facilities. Single analyst tools allow for rapid design and evaluation of advanced approaches for new and existing nuclear facilities. A low enriched uranium (LEU) fuel conversion and fabrication facility simulator is developed to assist with MC&A for existing facilities. Measurements are added to the model (consistent with current best practices). Material balance calculations and statistical tests are also added to the model. In addition, scoping work is performed for developing a single stage aqueous reprocessing model. Preliminary results are presented and discussed, and next steps are outlined.


Engineering Forisome Scaffolds: Elucidating Spatial Self-Assembly Patterning of Bio-inorganic Complexes

Smallwood, Chuck R.; Podlevsky, Joshua P.; Snow, Todd; Ryan, Emmarie C.

Organisms can synthesize biomaterials incorporating an array of naturally occurring elements while overcoming challenges and insults. Although it is known that most cellular biomaterials are synthesized in specialized cellular compartments, there are knowledge gaps about how organic/inorganic biomaterial synthesis is orchestrated inside cells. In addition, there is great potential in understanding how individual monomers can self-assemble into organized patterns to form responsive biomaterials. Forisomes are a natural responsive biomaterial found in legume plants, where they serve as plugs for sieve elements in the plant phloem; they undergo rapid (<1 s), ATP-independent anisotropic conformational changes from a condensed spindle to a plug-like form, triggered by the influx of Ca2+. Addressing principles of forisome synthesis and assembly will determine how biomaterials containing inorganic elements self-assemble and undergo chemical modification to produce biomaterials or undergo biomineralization. We employ cell-free transcription and translation (TXTL) expression systems for forisome monomer expression, self-assembly, and pattern probing. We conducted experiments to precisely control the synthesis of the forisome monomer proteins SEO1, SEO2, SEO3, and SEO4 to explore self-assembly. We demonstrate that forisome self-assembly from the SEO monomers is possible and identify unique monomer fluorescent labeling patterns that require additional analysis. We also investigated locations and linkers for adding tetracysteine-tag fluorophore probes to determine their impacts on self-assembly and anisotropic conformational changes.


NMR spectroscopy of coin cell batteries with metal casings

Science Advances

Walder, Brennan W.; Conradi, Mark S.; Borchardt, John J.; Merrill, Laura C.; Sorte, Eric G.; Deichmann, Eric J.; Anderson, Travis M.; Alam, Todd M.; Harrison, Katharine L.

Battery cells with metal casings are commonly considered incompatible with nuclear magnetic resonance (NMR) spectroscopy because the oscillating radio-frequency magnetic fields ("rf fields") responsible for excitation and detection of NMR active nuclei do not penetrate metals. Here, we show that rf fields can still efficiently penetrate nonmetallic layers of coin cells with metal casings provided "B1 damming" configurations are avoided. With this understanding, we demonstrate noninvasive high-field in situ 7Li and 19F NMR of coin cells with metal casings using a traditional external NMR coil. This includes the first NMR measurements of an unmodified commercial off-the-shelf rechargeable battery in operando, from which we detect, resolve, and separate 7Li NMR signals from elemental Li, anodic β-LiAl, and cathodic LixMnO2 compounds. Real-time changes of β-LiAl lithium diffusion rates and variable β-LiAl 7Li NMR Knight shifts are observed and tied to electrochemically driven changes of the β-LiAl defect structure.


Trajectory Optimization via Unsupervised Probabilistic Learning On Manifolds

Safta, Cosmin S.; Najm, H.N.; Grant, Michael J.; Sparapany, Michael J.

This report investigates the use of unsupervised probabilistic learning techniques for the analysis of hypersonic trajectories. The algorithm first extracts the intrinsic structure in the data via a diffusion map approach. Using the diffusion coordinates on the graph of training samples, the probabilistic framework augments the original data with samples that are statistically consistent with the original set. The augmented samples are then used to construct conditional statistics that are ultimately assembled in a path-planning algorithm. In this framework the controls are determined stage by stage during the flight to adapt to changing mission objectives in real time. A 3DOF model was employed to generate optimal hypersonic trajectories that comprise the training datasets. The diffusion map algorithm identified that the data reside on manifolds of much lower dimensionality compared to the high-dimensional state space that describes each trajectory. In addition to the path-planning workflow we also propose an algorithm that utilizes the diffusion map coordinates along the manifold to label and possibly remove outlier samples from the training data. This algorithm can be used both to identify edge cases for further analysis and to remove them from the training set to create a more robust set of samples to be used for the path-planning process.
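As a rough illustration of the diffusion map step (not the project's implementation), the embedding can be sketched in a few lines of numpy; the synthetic data, Gaussian kernel, and median bandwidth heuristic below are illustrative assumptions.

```python
# Minimal diffusion-map sketch: embed high-dimensional trajectory samples
# into a few diffusion coordinates. Synthetic data and bandwidth are
# illustrative only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))           # 500 trajectories x 40 state features

# pairwise squared distances and Gaussian kernel
d2 = ((X[:, None, :] - X[None, :, :])**2).sum(-1)
eps = np.median(d2)                      # common bandwidth heuristic
K = np.exp(-d2 / eps)

# row-normalize to a Markov transition matrix
P = K / K.sum(axis=1, keepdims=True)

# eigendecomposition; leading nontrivial eigenvectors give diffusion coordinates
vals, vecs = np.linalg.eig(P)
order = np.argsort(-vals.real)
vals, vecs = vals.real[order], vecs.real[:, order]

# skip the trivial constant eigenvector; keep the first few coordinates
diffusion_coords = vecs[:, 1:4] * vals[1:4]
print(diffusion_coords.shape)            # (500, 3): low-dimensional manifold
```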


Preliminary Radioisotope Screening for Off-site Consequence Assessment of Advanced Non-LWR Systems

Andrews, Nathan C.; Laros, James H.; Taconi, Anna M.; Leute, Jennifer E.

Currently, a set of 71 radionuclides is accounted for in off-site consequence analysis for LWRs. Radionuclides of dose consequence are expected to change for non-LWRs, with radionuclides of interest being type-specific. This document identifies an expanded set of radionuclides that may need to be accounted for in multiple non-LWR systems: high temperature gas reactors (HTGRs); fluoride-salt-cooled high-temperature reactors (FHRs); thermal-spectrum fluoride-based molten salt reactors (MSRs); fast-spectrum chloride-based MSRs; and liquid metal fast reactors with metallic fuel (LMRs). Specific considerations are provided for each reactor type in Chapter 2 through Chapter 5, and a summary of all recommendations is provided in Chapter 6. All identified radionuclides are already incorporated within the MACCS software, yet the development of tritium-specific and carbon-specific chemistry models is recommended.


A New Route to Quantum-Scale Structures through a Novel Enhanced Germanium Diffusion Mechanism

Wang, George T.; Lu, Ping L.; Sapkota, Keshab R.; Baczewski, Andrew D.; Campbell, Quinn C.; Schultz, Peter A.; Jones, Kevin S.; Turner, Emily M.; Sharrock, Chappel J.; Law, Mark E.; Yang, Hongbin

This project sought to develop a fundamental understanding of the mechanisms underlying a newly observed enhanced germanium (Ge) diffusion process in silicon germanium (SiGe) semiconductor nanostructures during thermal oxidation. Using a combination of oxidation-diffusion experiments, high resolution imaging, and theoretical modeling, a model for the enhanced Ge diffusion mechanism was proposed. Additionally, a nanofabrication approach utilizing this enhanced Ge diffusion mechanism was shown to be applicable to arbitrary 3D shapes, leading to the fabrication of stacked silicon quantum dots embedded in SiGe nanopillars. A new wet etch-based method for preparing 3D nanostructures for high-resolution imaging free of obscuring material or damage was also developed. These results enable a new method for the controlled and scalable fabrication of on-chip silicon nanostructures with sub-10 nm dimensions needed for next generation microelectronics, including low energy electronics, quantum computing, sensors, and integrated photonics.


Seismic Shake Table Test Plan

Kalinina, Elena A.; Ammerman, Douglas J.; Lujan, Lucas A.

This report is a preliminary test plan for the seismic shake table test. The final report will be developed when all decisions regarding the test hardware, instrumentation, and shake table inputs are made. A new revision of this report will be issued in spring of 2022. The preliminary test plan documents the free-field ground motions that will be used as inputs to the shake table, the test hardware, and the instrumentation. It also describes the facility at which the test will take place in late summer of 2022.


Efficient prompt scintillation and fast neutron-gamma ray discrimination using amorphous blends of difluorenylsilane organic glass and in situ polymerized vinyltoluene

IEEE Transactions on Nuclear Science

Myllenbeck, Nicholas M.; Feng, Patrick L.; Benin, Annabelle L.; Tran, Huu T.; Carlson, Joseph S.; Hunter, McKenzie A.

High-performance radiation detection materials are an integral part of national security, medical imaging, and nuclear physics applications. Those that offer compositional and manufacturing versatility are of particular interest. Here, we report a new family of radiological particle-discriminating scintillators containing bis(9,9-dimethyl-9H-fluoren-2-yl)diphenylsilane (compound 'P2') and in situ polymerized vinyltoluene (PVT) that is phase stable and mechanically robust at any blend ratio. The gamma-ray light yield increases nearly linearly across the composition range, to 16 400 photons/MeV at 75 wt.% P2. These materials are also capable of performing γ/n pulse shape discrimination (PSD), and at P2 loadings between 20% and 50% their PSD quality is competitive with that of commercially available plastic scintillators. The 137Cs scintillation rise and decay times are sensitive to P2 loading and approach the values for 'pure' P2. Additionally, the radiation detection performance of P2-PVT blends can be made stable in 60 °C air for at least 1.5 months with the application of a thin film of poly(vinyl alcohol) to the scintillator surfaces.


Integrated System and Application Continuous Performance Monitoring and Analysis Capability (Final)

Schwaller, Benjamin S.

Scientific applications run on high-performance computing (HPC) systems are critical for many national security missions within Sandia and the NNSA complex. However, these applications often face performance degradation and even failures that are challenging to diagnose. To provide unprecedented insight into these issues, the HPC Development, HPC Systems, Computational Science, and Plasma Theory & Simulation departments at Sandia crafted and completed their FY21 ASC Level 2 milestone entitled "Integrated System and Application Continuous Performance Monitoring and Analysis Capability." The milestone created a novel integrated HPC system and application monitoring and analysis capability by extending Sandia's Kokkos application portability framework, Lightweight Distributed Metric Service (LDMS) monitoring tool, and scalable storage, analysis, and visualization pipeline. The extensions to Kokkos and LDMS enable collection and storage of application data during run time, as it is generated, with negligible overhead. This data is combined with HPC system data within the extended analysis pipeline to present relevant visualizations of derived system and application metrics that can be viewed at run time or post-run. This new capability was evaluated using several week-long, 290-node runs of Sandia's ElectroMagnetic Plasma In Realistic Environments (EMPIRE) modeling and design tool, resulting in 1 TB of application data and 50 TB of system data. EMPIRE developers remarked that this capability was incredibly helpful for quickly assessing application health and performance alongside system state. In short, this milestone work built the foundation for an expansive HPC system and application data collection, storage, analysis, visualization, and feedback framework that will increase the total scientific output of Sandia's HPC users.


Overvoltage prevention and curtailment reduction using adaptive droop-based supplementary control in smart inverters

Applied Sciences (Switzerland)

Maharjan, Manisha; Tamrakar, Ujjwol; Ni, Zhen; Bhattarai, Bishnu; Tonkoski, Reinaldo

Recent developments in the renewable energy sector have seen an unprecedented growth in residential photovoltaic (PV) installations. However, high PV penetration levels often lead to overvoltage problems in low-voltage (LV) distribution feeders. Smart inverter control such as active power curtailment (APC)-based overvoltage control can be implemented to overcome these challenges. The APC technique utilizes a constant droop-based approach which curtails power rigidly, which can lead to significant energy curtailment in LV distribution feeders. In this paper, different variations of the APC technique with linear, quadratic, and exponential droops have been analyzed from the point of view of energy curtailment for an LV distribution network in North America. Further, a combinatorial approach using various droop-based APC methods in conjunction with adaptive dynamic programming (ADP) as a supplementary control scheme has also been proposed. The proposed approach minimizes energy curtailment in the LV distribution network by adjusting the droop gains. Simulation results show that ADP in conjunction with exponential droop reduces the energy curtailment to approximately 50% compared to using the standard linear droop.
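To make the droop variants concrete, a minimal sketch of linear, quadratic, and exponential curtailment curves is given below; the voltage thresholds, gains, and per-unit ranges are invented for illustration and are not the paper's settings.

```python
# Illustrative APC droop curves: active power curtailed as a function of
# per-unit voltage above a threshold. Thresholds and gains are made-up values.
import numpy as np

V_THRESH, V_MAX = 1.03, 1.05   # p.u. voltage where curtailment starts/saturates

def linear_droop(v, p_avail):
    frac = np.clip((v - V_THRESH) / (V_MAX - V_THRESH), 0.0, 1.0)
    return p_avail * frac                      # rigid, constant slope

def quadratic_droop(v, p_avail):
    frac = np.clip((v - V_THRESH) / (V_MAX - V_THRESH), 0.0, 1.0)
    return p_avail * frac**2                   # gentler near the threshold

def exponential_droop(v, p_avail, k=4.0):
    frac = np.clip((v - V_THRESH) / (V_MAX - V_THRESH), 0.0, 1.0)
    return p_avail * (np.exp(k * frac) - 1.0) / (np.exp(k) - 1.0)

v = np.linspace(1.00, 1.06, 7)                 # sample feeder voltages, p.u.
for f in (linear_droop, quadratic_droop, exponential_droop):
    print(f.__name__, np.round(f(v, 5.0), 2))  # kW curtailed at each voltage
```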


Locating Seismic Events with Local-Distance Data

Davenport, Kathy D.

As the seismic monitoring community advances toward detecting, identifying, and locating ever-smaller natural and anthropogenic events, the need is constantly increasing for higher resolution, higher fidelity data, models, and methods for accurately characterizing events. Local-distance seismic data provide robust constraints on event locations, but also introduce complexity due to the significant geologic heterogeneity of the Earth’s crust and upper mantle, and the relative sparsity of data that often occurs with small events recorded on regional seismic networks. Identifying the critical characteristics for improving local-scale event locations and the factors that impact location accuracy and reliability is an ongoing challenge for the seismic community. Using Utah as a test case, we examine three data sets of varying duration, finesse, and magnitude to investigate the effects of local earth structure and modeling parameters on local-distance event location precision and accuracy. We observe that the most critical elements controlling relocation precision are azimuthal coverage and local-scale velocity structure, with tradeoffs based on event depth, type, location, and range.


A targeted opsonization platform for programming innate immunity against rapidly evolving novel viruses

Cahill, Jesse L.

Recent work has shown that artificial opsonins stimulate the targeted destruction of bacteria by phagocyte immune cells. Artificial opsonization has the potential to direct the innate immune system to target novel antigens, potentially even viral pathogens. Furthermore, the engagement of innate immunity presents a potential solution to the spread of pandemics in scenarios where a vaccine is unavailable or ineffective. Funded by the LDRD late start bioscience pandemic response program, we tested whether artificial opsonins can be developed to target viral pathogens using phage MS2 and a SARS-CoV-2 surrogate. To direct opsonization against these viruses, we purified antibody-derived viral-targeting motifs and attempted the same chemical conjugation strategies that produced bacteria-targeting artificial opsonins. However, the viral-targeting motifs proved challenging to conjugate using these methods, frequently resulting in precipitation and loss of product. Future studies may be successful with this approach if a smaller and more soluble viral-targeting peptide could be used.


Sensitivity Analysis Comparisons on Geologic Case Studies: An International Collaboration

Swiler, Laura P.; Becker, Dirk-Alexander; Brooks, Dusty M.; Govaerts, Joan; Koskinen, Lasse; Plischke, Elmar; Rohlig, Klaus-Jurgen; Saveleva, Elena; Spiessl, Sabine M.; Stein, Emily S.; Svitelman, Valentina

Over the past four years, an informal working group has developed to investigate existing sensitivity analysis methods, examine new methods, and identify best practices. The focus is on the use of sensitivity analysis in case studies involving geologic disposal of spent nuclear fuel or nuclear waste. To examine ideas and have applicable test cases for comparison purposes, we have developed multiple case studies. Four of these case studies are presented in this report: the GRS clay case, the SNL shale case, the Dessel case, and the IBRAE groundwater case. We present the different sensitivity analysis methods investigated by various groups, the results obtained by different groups and different implementations, and summarize our findings.
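The report itself compares several methods and implementations; purely as a generic illustration of variance-based sensitivity analysis (not any group's actual code), a Sobol-index computation with the SALib package might look like the sketch below, with made-up input names and bounds standing in for repository model parameters.

```python
# Generic variance-based sensitivity analysis sketch using SALib.
# The stand-in function and bounds are illustrative, not a repository model.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["permeability", "solubility", "porosity"],  # illustrative inputs
    "bounds": [[1e-20, 1e-16], [1e-7, 1e-4], [0.01, 0.3]],
}

X = saltelli.sample(problem, 1024)                    # Saltelli sampling design
Y = X[:, 0] * 1e18 + np.log10(X[:, 1]) + 5 * X[:, 2]  # stand-in model output

Si = sobol.analyze(problem, Y)
print(Si["S1"])   # first-order Sobol indices per input
print(Si["ST"])   # total-order indices (include interactions)
```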


Multimode Metastructures: Novel Hybrid 3D Lattice Topologies

Boyce, Brad B.; Garland, Anthony G.; White, Benjamin C.; Jared, Bradley H.; Conway, Kaitlynn; Adstedt, Katerina; Dingreville, Remi P.; Robbins, Joshua R.; Walsh, Timothy W.; Alvis, Timothy A.; Branch, Brittany A.; Kaehr, Bryan J.; Kunka, Cody; Leathe, Nicholas L.

With the rapid proliferation of additive manufacturing and 3D printing technologies, architected cellular solids including truss-like 3D lattice topologies offer the opportunity to program the effective material response through topological design at the mesoscale. The present report summarizes several of the key findings from a 3-year Laboratory Directed Research and Development program. The program set out to explore novel lattice topologies that can be designed to control, redirect, or dissipate energy from one or multiple insult environments relevant to Sandia missions, including crush, shock/impact, vibration, thermal, etc. In the first four sections, we document four novel lattice topologies stemming from this study: coulombic lattices, multi-morphology lattices, interpenetrating lattices, and pore-modified gyroid cellular solids, each with unique properties that had not been achieved by existing cellular/lattice metamaterials. The fifth section explores how unintentional lattice imperfections stemming from the manufacturing process, primarily surface roughness in the case of laser powder bed fusion, serve to cause stochastic response, but that in some cases, such as elastic response, the stochastic behavior is homogenized through the adoption of lattices. In the sixth section we explore a novel neural network screening process that allows such stochastic variability to be predicted. In the last three sections, we explore considerations for computational design of lattices. Specifically, in section 7 we use a novel generative optimization scheme to design Pareto-optimal lattices for multi-objective environments. In section 8, we use computational design to optimize a metallic lattice structure to absorb impact energy for a 1000 ft/s impact. And in section 9, we develop a modified micromorphic continuum model to solve wave propagation problems in lattices efficiently.


Instantaneous Three-Dimensional Temperature Measurements via Ultrafast Laser Spectroscopy with Structured Light

Richardson, Daniel R.

Detonations and flames are characterized by three-dimensional (3D) temperature fields, yet state-of-the-art temperature measurement techniques yield information at a point or along a line. The goal of the research documented here was to combine ultrafast laser spectroscopy and structured illumination to deliver an unprecedented measurement capability—three-dimensional, instantaneous temperature measurements in a gas-phase volume. To achieve this objective, different parts of the proposed technique were developed and tested independently. Structured illumination was used to image particulate matter (soot) in a turbulent flame at multiple planes using a single laser pulse and a single camera. Emission spectroscopy with structured detection was demonstrated for emission-based measurements of explosives with enhanced dimensionality. Finally, an instrument for a multi-planar laser-based temperature measurement technique was developed. Structured illumination techniques will continue to be developed for multi-dimensional and multi-parameter measurements. These new measurement capabilities will be important for heat transfer and fluid dynamics research areas.


A Projected Network Model of Online Disinformation Cascades

Emery, Benjamin F.; Ting, Christina T.; Johnson, Nicholas; Tucker, James D.

Within the past half-decade, it has become overwhelmingly clear that suppressing the spread of deliberately false and misleading information is of the utmost importance for protecting democratic institutions. Disinformation has been found to come from both foreign and domestic actors, but the effects from either can be disastrous. From the simple encouragement of unwarranted distrust to conspiracy theories promoting violence, the results of disinformation have put the functionality of American democracy under direct threat. Present scientific challenges posed by this problem include detecting disinformation, quantifying its potential impact, and preventing its amplification. We present a model on which we can experiment with possible strategies toward the third challenge: the prevention of amplification. This is a social contagion network model, which is decomposed into layers to represent physical, "offline" interactions as well as virtual interactions on a social media platform. Along with the topological modifications to the standard contagion model, we use state-transition rules designed specifically for disinformation and distinguish between contagious and non-contagious infected nodes. We use this framework to explore the effect of grassroots social movements on the size of disinformation cascades by simulating these cascades in scenarios where a proportion of the agents remove themselves from the social platform. We also test the efficacy of strategies that could be implemented at the administrative level by the online platform to minimize such spread. These top-down strategies include banning agents who disseminate false information, or providing corrective information to individuals exposed to false information to decrease their probability of believing it. We find an abrupt transition to smaller cascades when a critical number of random agents are removed from the platform, as well as steady decreases in the size of cascades with increasingly more convincing corrective information. Finally, we compare simulated cascades on this framework with real cascades of disinformation recorded on WhatsApp surrounding the 2019 Indian election. We find a set of hyperparameter values that produces a distribution of cascades matching the scaling exponent of the distribution of actual cascades recorded in the dataset. We note future directions for improving the performance of the framework and validation methods, as well as ways to extend the model to capture additional features of social contagion.
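A bare-bones sketch of the simulation idea is given below; it collapses the layered offline/online structure and disinformation-specific transition rules into a simple probabilistic cascade on a synthetic graph, with all parameters invented for illustration.

```python
# Minimal contagion-cascade sketch on a social graph: seed a false story,
# spread it probabilistically, and measure cascade size as agents are
# removed from the platform. Parameters are illustrative.
import random
import networkx as nx

def cascade_size(G, p_spread=0.1, seeds=5):
    infected = set(random.sample(list(G.nodes), seeds))
    frontier = set(infected)
    while frontier:                      # breadth-first probabilistic spread
        nxt = set()
        for u in frontier:
            for v in G.neighbors(u):
                if v not in infected and random.random() < p_spread:
                    infected.add(v)
                    nxt.add(v)
        frontier = nxt
    return len(infected)

random.seed(1)
G = nx.barabasi_albert_graph(5000, 3)    # heavy-tailed synthetic social graph
for removed_frac in (0.0, 0.2, 0.4):
    H = G.copy()
    drop = random.sample(list(H.nodes), int(removed_frac * H.number_of_nodes()))
    H.remove_nodes_from(drop)            # agents leaving the platform
    sizes = [cascade_size(H) for _ in range(20)]
    print(removed_frac, sum(sizes) / len(sizes))  # mean cascade size
```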


Multi-Resolution Characterization of the Coupling Effects of Molten Salts, High Temperature and Irradiation on Intergranular Fracture

Dingreville, Remi P.; Bielejec, Edward S.; Chen, Elton Y.; Deo, C.; Kim, E.; Spearot, D.E.; Startt, Jacob K.; Stewart, James A.; Sugar, Joshua D.; Vizoso, D.; Weck, Philippe F.; Young, Joshua M.

This project focused on providing a fundamental physico-chemical understanding of the coupling mechanisms of corrosion- and radiation-induced degradation at material-salt interfaces in Ni-based alloys operating in emulated Molten Salt Reactor (MSR) environments through the use of a unique suite of aging experiments, in-situ nanoscale characterization experiments on these materials, and multi-physics computational models. The technical basis and capabilities described in this report bring us a step closer to accelerating the deployment of MSRs by closing knowledge gaps related to materials degradation in harsh environments.


Marine Atmospheric Corrosion of Additively Manufactured Stainless Steels

Corrosion

Duran, Jesse D.; Taylor, Jason M.; Presuel-Moreno, Francisco; Schaller, Rebecca S.; Schindelholz, Eric J.; Melia, Michael A.

Additively manufactured (AM) stainless steels (SSs) exhibit numerous microstructural differences compared to their wrought counterparts, such as Cr-enriched dislocation cell structures. The influence these unique features have on an SS's corrosion resistance is still under investigation, with most current work limited to laboratory experiments. The work herein presents the first documented study of AM 304L and 316L exposed to a severe marine environment on the eastern coast of Florida, with comparisons made to wrought counterparts. Coupons were exposed for 21 months, and significant pitting corrosion initiated after 1 month of exposure for all conditions. At all times, the AM coupons exhibited lower average and maximum pit depths than their wrought counterparts. After 21 months, pits on average were 4 μm deep for the AM 316L specimens and 8 μm deep for the wrought specimens. Pits on the wrought samples tended to be nearly hemispherical and polished, with some pits showing crystallographic attack, while pits on AM coupons exhibited preferential attack at melt pool boundaries and the cellular microstructure.


Quantum Sensed Electron Spin Resonance Discovery Platform (Final Report)

Lilly, Michael L.; Saleh Ziabari, Maziar S.; Titze, Michael T.; Henshaw, Jacob D.; Bielejec, Edward S.; Huber, Dale L.; Mounce, Andrew M.

The properties of materials can change dramatically at the nanoscale, where new and useful properties can emerge. An example is found in the paramagnetism of iron oxide magnetic nanoparticles. Using magnetically sensitive nitrogen-vacancy (NV) centers in diamond, we developed a platform to study electron spin resonance of nanoscale materials. To implement the platform, diamond substrates were prepared with nitrogen-vacancy centers near the surface. Nanoparticles were placed on the surface using a drop casting technique. Using optical and microwave pulsing techniques, we demonstrated T1 relaxometry and double electron-electron resonance techniques for measuring the local electron spin resonance. The diamond NV platform developed in this project provides a combination of good magnetic field sensitivity and high spatial resolution and will be used for future investigations of nanomaterials and quantum materials.


Deep-learning-enabled Bayesian inference of fuel magnetization in magnetized liner inertial fusion

Physics of Plasmas

Laros, James H.; Knapp, Patrick K.; Slutz, Stephen A.; Schmit, Paul S.; Chandler, Gordon A.; Gomez, Matthew R.; Harvey-Thompson, Adam J.; Mangan, Michael M.; Ampleford, David A.; Beckwith, Kristian B.

Fuel magnetization in magneto-inertial fusion (MIF) experiments improves charged burn product confinement, reducing requirements on fuel areal density and pressure to achieve self-heating. By elongating the path length of 1.01 MeV tritons produced in a pure deuterium fusion plasma, magnetization enhances the probability for deuterium-tritium reactions producing 11.8−17.1 MeV neutrons. Nuclear diagnostics thus enable a sensitive probe of magnetization. Characterization of magnetization, including uncertainty quantification, is crucial for understanding the physics governing target performance in MIF platforms, such as magnetized liner inertial fusion (MagLIF) experiments conducted at Sandia National Laboratories' Z facility. We demonstrate a deep-learned surrogate of a physics-based model of nuclear measurements. A single model evaluation is reduced from CPU hours on a high-performance computing cluster down to milliseconds on a laptop. This enables a Bayesian inference of magnetization, rigorously accounting for uncertainties from surrogate modeling and noisy nuclear measurements. The approach is validated by testing on synthetic data and comparing with a previous study. We analyze a series of MagLIF experiments systematically varying preheat, resulting in the first ever systematic experimental study of magnetic confinement properties of the fuel plasma as a function of fundamental inputs on any neutron-producing MIF platform. We demonstrate that the magnetization BR decreases from ∼0.5 MG cm as the deposited laser preheat energy Epreheat increases from ∼460 J to ∼1.4 kJ. This trend is consistent with 2D LASNEX simulations showing Nernst advection of the magnetic field out of the hot fuel and diffusion into the target liner.


Site Environmental Report for 2020, Sandia National Laboratories, California

Robinson, Christina R.

Sandia National Laboratories, California (SNL/CA) is a Department of Energy (DOE) facility. The management and operations of the facility are under a contract with the DOE's National Nuclear Security Administration (NNSA). On May 1, 2017, the name of the management and operating contractor changed from Sandia Corporation to National Technology & Engineering Solutions of Sandia, LLC (NTESS). The DOE/NNSA Sandia Field Office administers the contract and oversees contractor operations at the site. DOE and its management and operating contractor for Sandia are committed to safeguarding environmental protection, compliance, and sustainability and to ensuring the validity and accuracy of the monitoring data presented in this Annual Site Environmental Report. This Site Environmental Report for 2020 was prepared in accordance with DOE Order 231.1B, Environment, Safety and Health Reporting (DOE 2012). The report provides a summary of environmental monitoring information and compliance activities that occurred at SNL/CA during calendar year 2020, unless noted otherwise. General site and environmental program information is also included.


Optical Imaging on Z LDRD: Design and Development of Self-Emission and Debris Imagers

Yager-Elorriaga, David A.; Montoya, Michael M.; Bliss, David E.; Ball, Christopher R.; Atencio, Phillip M.; Carpenter, Brian C.; Fuerschbach, Kyle H.; Fulford, Karin W.; Lamppa, Derek C.; Lowinske, Michael C.; Lucero, Larry M.; Patel, Sonal P.; Romero, Anthony R.; Laros, James H.; Breznik-Young, Bonnie B.

We present an overview of the design and development of optical self-emission and debris imaging diagnostics for the Z Machine at Sandia National Laboratories. These diagnostics were designed and implemented to address several gaps in our understanding of visibly emitting phenomena on Z and the post-shot debris environment. Optical emission arises from plasmas that form on the transmission line that delivers energy to Z loads and on the Z targets themselves; however, the dynamics of these plasmas are difficult to assess without imaging data. Addressing this, we developed a new optical imager called SEGOI (Self-Emission Gated Optical Imager) that leverages the eight gated optical imagers and two streak cameras of the Z Line VISAR system. SEGOI is a low-cost, side-on imager with a 1 cm field of view and 30-50 µm spatial resolution, sensitive to green light (540-600 nm). This report outlines the design considerations and development of this diagnostic and presents an overview of the first diagnostic data acquired from four experimental campaigns. SEGOI was fielded on power flow experiments to image plasmas forming on and between transmission lines, on an inertial confinement fusion experiment called the Dynamic Screw Pinch to image low density plasmas forming on return current posts, on an experiment designed to measure the magneto-Rayleigh-Taylor instability to image the instability bubble trajectory and self-emission structures, and finally on a Magnetized Liner Inertial Fusion (MagLIF) experiment to image the emission from the target. The second diagnostic developed, called DINGOZ (Debris ImagiNG on Z), was designed to improve our understanding of the post-shot debris environment. DINGOZ is an airtight enclosure that houses electronics and batteries to operate a high-speed (10-400 kfps) camera in the Z Machine center section. We report on the design considerations of this new diagnostic and present the first high-speed imaging data of the post-shot debris environment on Z.


Mapping Stochastic Devices to Probabilistic Algorithms

Aimone, James B.; Safonov, Alexander M.

Probabilistic and Bayesian neural networks have long been proposed as a method to incorporate uncertainty about the world (both in training data and operation) into artificial intelligence applications. One approach to making a neural network probabilistic is to leverage a Monte Carlo sampling approach that samples a trained network while incorporating noise. Such sampling approaches for neural networks have not been extensively studied due to the prohibitive requirement of many computationally expensive samples. While the development of future microelectronics platforms that make this sampling more efficient is an attractive option, it has not been immediately clear how to sample a neural network and what the quality of random number generation should be. This research aimed to start addressing these two fundamental questions by examining how basic “off-the-shelf” neural networks can be sampled through a few different mechanisms (including synapse “dropout” and neuron “dropout”) and how these sampling approaches can be evaluated, both in terms of algorithm effectiveness and the required quality of random numbers.
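A minimal numpy sketch of the neuron-dropout sampling mechanism is shown below; the tiny network, its random stand-in weights, and the keep probability are illustrative assumptions, not the networks studied in the project.

```python
# Monte Carlo sampling of a small MLP with neuron "dropout": repeated
# stochastic forward passes yield a predictive distribution. Weights are
# random stand-ins for a trained network.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 8)), np.zeros(8)   # stand-in trained weights
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

def sample_forward(x, p_keep=0.8):
    h = np.maximum(x @ W1 + b1, 0.0)             # ReLU hidden layer
    mask = rng.random(h.shape) < p_keep          # neuron dropout at inference
    h = h * mask / p_keep                        # inverted-dropout scaling
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())            # stable softmax
    return e / e.sum()

x = rng.normal(size=16)
samples = np.stack([sample_forward(x) for _ in range(1000)])
print(samples.mean(axis=0))   # predictive mean over classes
print(samples.std(axis=0))    # sampling spread ~ predictive uncertainty
```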


Application of Refractory High-Entropy Alloys for Higher-Reliability and Higher-Efficiency Brayton Cycles and Advanced Nuclear Reactors

Rodriguez, Salvador B.

An exceptional set of newly discovered advanced superalloys known as refractory high-entropy alloys (RHEAs) can provide near-term solutions for wear, erosion, corrosion, high-temperature strength, creep, and radiation issues associated with supercritical carbon dioxide (sCO2) Brayton cycles and advanced nuclear reactors. In particular, these superalloys can significantly extend their durability, reliability, and thermal efficiency, thereby making them more cost-competitive, safer, and more reliable. For this project, we endeavored to manufacture and test certain RHEAs to solve technical issues impacting the Brayton cycle and advanced nuclear reactors. This was achieved by leveraging Sandia's patents, technical advances, and previous experience working with RHEAs. Herein, three RHEA manufacturing methods were applied: laser engineered net shaping, spark plasma sintering, and spray coating. Two promising RHEAs were selected, HfNbTaZr and MoNbTaVW. To demonstrate their performance, erosion, structural, radiation, and high-temperature experiments were conducted on the RHEAs, stainless steel (SS) 316L, SS 1020, and Inconel 718 test coupons, as well as bench-top components. The experimental data are presented and analyzed, and confirm the superior performance of the HfNbTaZr and MoNbTaVW RHEAs vs. SS 316L, SS 1020, and Inconel 718. In addition, to gain more insights for larger-scale RHEA applications, the erosion and structural capabilities of the two RHEAs were simulated and compared with the experimental data. Most importantly, the erosion and coating material experimental data show that erosion in sCO2 Brayton cycles can be eliminated completely if RHEAs are used. The experimental suite and validations confirm that HfNbTaZr is suitable for harsh environments that do not include nuclear radiation, while MoNbTaVW is suitable for harsh environments that include radiation.


A Comprehensive Open-Source R Software For Statistical Metrology Calculations: From Uncertainty Evaluation To Risk Analysis

NCSLI Measure

Delker, Collin J.

Whether calibrating equipment or inspecting products on the factory floor, metrology requires many complicated statistical calculations to achieve a full understanding and evaluation of measurement uncertainty and quality. In order to assist its workforce in performing these calculations in a consistent and rigorous way, the Primary Standards Lab at Sandia National Laboratories (SNL) has developed a free and open-source software package for computing various metrology calculations, from uncertainty propagation to risk analysis. In addition to propagating uncertainty through a measurement model using the well-known Guide to the Expression of Uncertainty in Measurement (GUM) or Monte Carlo approaches, evaluating the individual Type A and Type B uncertainty components that go into the measurement model often requires other statistical methods, such as analysis of variance or determining uncertainty in a fitted curve. Once the uncertainty in a measurement has been calculated, it is usually evaluated from a risk perspective to ensure the measurement is suitable for making a particular conformance decision. SNL's software can perform all these calculations in a single application via an easy-to-use graphical interface, where the different functions are integrated so the results of one calculation can be used as inputs to another calculation.
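As a generic illustration of the Monte Carlo approach mentioned above (not the SNL package's API), propagating a Type A and a Type B component through a simple measurement model R = V/I might look like the following sketch; the distributions and values are invented.

```python
# Generic GUM-Supplement-1-style Monte Carlo uncertainty propagation for a
# simple measurement model R = V / I. Distributions are illustrative.
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

# Type A component: repeated voltage readings -> normal about the mean
V = rng.normal(loc=10.0, scale=0.002, size=N)                 # volts

# Type B component: current from a calibration cert with uniform bounds
I = rng.uniform(low=1.0 - 0.0005, high=1.0 + 0.0005, size=N)  # amps

R = V / I                                          # measurement model

mean = R.mean()
u = R.std(ddof=1)                                  # standard uncertainty
lo, hi = np.percentile(R, [2.5, 97.5])             # 95% coverage interval
print(f"R = {mean:.5f} ohm, u = {u:.5f}, 95% interval [{lo:.5f}, {hi:.5f}]")
```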


Physics-Based Optical Neuromorphic Classification

Leonard, Francois L.; Teeter, Corinne M.; Vineyard, Craig M.

Typical approaches to classify scenes from light convert the light field to electrons to perform the computation in the digital electronic domain. This conversion and downstream computational analysis require significant power and time. Diffractive neural networks have recently emerged as unique systems to classify optical fields at lower energy and higher speeds. Previous work has shown that a single layer of diffractive metamaterial can achieve high performance on classification tasks. In analogy with electronic neural networks, it is anticipated that multilayer diffractive systems would provide better performance, but the fundamental reasons for the potential improvement have not been established. In this work, we present extensive computational simulations of two-layer diffractive neural networks and show that they can achieve high performance with fewer diffractive features than single-layer systems.


FAIR DEAL Grand Challenge Overview

Allemang, Christopher R.; Anderson, Evan M.; Baczewski, Andrew D.; Bussmann, Ezra B.; Butera, Robert; Campbell, DeAnna M.; Campbell, Quinn C.; Carr, Stephen M.; Frederick, Esther; Gamache, Phillip G.; Gao, Xujiao G.; Grine, Albert D.; Gunter, Mathew M.; Halsey, Connor H.; Ivie, Jeffrey A.; Katzenmeyer, Aaron M.; Leenheer, Andrew J.; Lepkowski, William L.; Lu, Tzu-Ming L.; Mamaluy, Denis M.; Mendez Granado, Juan P.; Pena, Luis F.; Schmucker, Scott W.; Scrymgeour, David S.; Tracy, Lisa A.; Wang, George T.; Ward, Dan; Young, Steve M.

While it is likely a bad idea in practice to shrink a transistor to the size of an atom, there is no arguing that it would be fantastic to have atomic-scale control over every aspect of a transistor – a kind of crystal ball to understand and evaluate new ideas. This project showed that it was possible to take a niche technique used to place dopants in silicon with atomic precision and apply it broadly to study opportunities and limitations in microelectronics. In addition, it laid the foundation for attaining atomic-scale control in semiconductor manufacturing more broadly.


New experimental approach to understanding the chemical reactivity of oxide surfaces

Wong, Chun-Shang W.; Wang, Chen S.; Thurmer, Konrad T.; Whaley, Josh A.; Kolasinski, Robert K.

Metal oxides have been an attractive option for a range of applications, including hydrogen sensors, microelectronics, and catalysis, due to their reactivity and tunability. The properties of metal oxides can vary greatly with their precise surface structure; however, few surface science techniques can achieve atomistic-level determinations of surface structure, and fewer yet can do so for insulator surfaces. Low energy ion beam analysis offers a potential insulator-compatible solution to characterizing the surface structure of metal oxides. As a feasibility study, we apply low energy ion beam analysis to investigate the surface structure of a magnetite single crystal, Fe3O4(100). We obtain multi-angle maps using both forward-scattering low energy ion scattering (LEIS) and backscattering impact-collision ion scattering spectroscopy (ICISS). Both sets of experimental maps have intensity patterns that reflect the symmetries of the Fe3O4(100) surface structure. However, analytical interpretation of these intensity patterns to extract details of the surface structure is significantly more complex than in previous LEIS and ICISS structural studies of one-component metal crystals, which had far more symmetries to exploit. To gain further insight into the surface structure, we model our experimental measurements with ion-trajectory tracing simulations using molecular dynamics. Our simulations provide a qualitative indication that our experimental measurements agree better with a subsurface cation vacancy model than a distorted bulk model.


City-Wide Distributed Roof-Top Photovoltaic System Adoption Forecast, Grid Impact Simulation, & Neighborhood Microgrid Contribution Assessment

Jones, Christian B.; Vining, William F.; Haines, Thad

The adoption of distributed photovoltaic (PV) systems grew significantly in recent years. Market projections anticipate future growth for both residential and commercial installations. To understand grid impacts associated with distributed PV, useful hosting capacity studies require accurate representations of the spatial distribution of PV adoptions. Prediction of PV locations and numbers depends on median income data, building use zoning maps, and permit records to understand existing trends and predict future adoption rates and locations throughout an entire city. Using the PV adoption data, advanced and realistic simulations were performed to capture the distributed PV impacts on the grid. Also, using graph theory community detection, hundreds of neighborhood microgrids can be discovered for the entire city by identifying densely connected loads that are sparsely connected to other communities. Then, based on the PV adoption predictions, this work identified the contribution of PV within each of the newly discovered graph-theory-defined microgrid communities.
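The community-detection step can be sketched with networkx; the modularity-based algorithm and toy feeder below are illustrative stand-ins, since the paper's exact method and network data are not reproduced here.

```python
# Illustrative graph-theoretic discovery of "microgrid" communities:
# densely connected load clusters that are sparsely tied to the rest.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy feeder: edges weighted by electrical coupling (illustrative values)
G = nx.Graph()
G.add_weighted_edges_from([
    ("load1", "load2", 5.0), ("load2", "load3", 4.0), ("load1", "load3", 4.5),
    ("load4", "load5", 5.0), ("load5", "load6", 4.0), ("load4", "load6", 4.5),
    ("load3", "load4", 0.5),          # weak tie between candidate microgrids
])

communities = greedy_modularity_communities(G, weight="weight")
for i, c in enumerate(communities):
    print(f"candidate microgrid {i}: {sorted(c)}")
```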


The Kokkos EcoSystem: Comprehensive Performance Portability for High Performance Computing

Computing in Science and Engineering

Trott, Christian R.; Berger-Vergiat, Luc B.; Poliakoff, David Z.; Rajamanickam, Sivasankaran R.; Lebrun-Grandie, Damien; Madsen, Jonathan; Al Awar, Nader; Gligoric, Milos; Shipman, Galen; Womeldorff, Geoff

State-of-the-art engineering and science codes have grown in complexity dramatically over the last two decades. Application teams have adopted more sophisticated development strategies, leveraging third party libraries, deploying comprehensive testing, and using advanced debugging and profiling tools. In today's environment of diverse hardware platforms, these applications also desire performance portability, avoiding the need to duplicate work for various platforms. The Kokkos EcoSystem provides that portable software stack. Based on the Kokkos Core Programming Model, the EcoSystem provides math libraries, interoperability capabilities with Python and Fortran, and Tools for analyzing, debugging, and optimizing applications. In this article, we overview the components, discuss some specific use cases, and highlight how codesigning these components enables a more developer-friendly experience.


An Overview of Gemma FY2021 Verification Activities

Freno, Brian A.; Matula, Neil M.; Owen, Justin O.; Krueger, Aaron M.; Johnson, William Arthur.

Though the method-of-moments implementation of the electric-field integral equation plays an important role in computational electromagnetics, it provides many code-verification challenges due to the different sources of numerical error and their possible interactions. Matters are further complicated by singular integrals, which arise from the presence of a Green's function. In this report, we document our research to address these issues, as well as its implementation and testing in Gemma.
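For context, the textbook mixed-potential form of the EFIE for a perfectly conducting surface S (e^{jωt} convention) shows where the Green's function, and hence the singular integrals, enter; this is the standard form, not necessarily Gemma's exact formulation.

```latex
% Textbook mixed-potential EFIE for a PEC surface S (e^{j\omega t} convention).
% The Green's function G is singular as r' -> r, which is the source of the
% singular integrals discussed in the report.
\hat{\mathbf{t}}\cdot\mathbf{E}^{\mathrm{inc}}(\mathbf{r})
  = j\omega\mu\,\hat{\mathbf{t}}\cdot\int_{S}\mathbf{J}(\mathbf{r}')\,
      G(\mathbf{r},\mathbf{r}')\,dS'
  - \frac{1}{j\omega\varepsilon}\,\hat{\mathbf{t}}\cdot\nabla\!\int_{S}
      \big[\nabla_{s}'\cdot\mathbf{J}(\mathbf{r}')\big]\,
      G(\mathbf{r},\mathbf{r}')\,dS',
\qquad
G(\mathbf{r},\mathbf{r}')=\frac{e^{-jk|\mathbf{r}-\mathbf{r}'|}}{4\pi|\mathbf{r}-\mathbf{r}'|}
```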


Semi-supervised Bayesian Low-shot Learning

Adams, Jason R.; Goode, Katherine J.; Michalenko, Joshua J.; Lewis, Phillip J.; Ries, Daniel R.

Deep neural networks (NNs) typically outperform traditional machine learning (ML) approaches for complicated, non-linear tasks. It is expected that deep learning (DL) should offer superior performance for the important non-proliferation task of predicting explosive device configuration based upon observed optical signature, a task which human experts struggle with. However, supervised machine learning is difficult to apply in this mission space because most recorded signatures are not associated with the corresponding device description, or “truth labels.” This is challenging for NNs, which traditionally require many samples for strong performance. Semi-supervised learning (SSL), low-shot learning (LSL), and uncertainty quantification (UQ) for NNs are emerging approaches that could bridge the mission gaps of few labels and rare samples of importance. NN explainability techniques are important in gaining insight into the inferential feature importance of such a complex model. In this work, SSL, LSL, and UQ are merged into a single framework, a significant technical hurdle not previously demonstrated. Exponential Average Adversarial Training (EAAT) and Pairwise Neural Networks (PNNs) are chosen as the SSL and LSL methods of choice. Permutation feature importance (PFI) for functional data is used to provide explainability via the Variable importance Explainable Elastic Shape Analysis (VEESA) pipeline. A variety of uncertainty quantification approaches are explored: Bayesian Neural Networks (BNNs), ensemble methods, concrete dropout, and evidential deep learning. Two final approaches, one utilizing ensemble methods and one utilizing evidential learning, are constructed and compared using a well-quantified synthetic 2D dataset along with the DIRSIG Megascene.


Gamma spectrometry uranium isotopic analysis rodeo: Summary of GADRAS results

Enghauser, Michael E.

This report summarizes GADRAS methods and gamma spectrometry rodeo uranium isotopic analysis results for high energy resolution H3D M400 cadmium zinc telluride (CZT) and ORTEC Micro Detective high-purity germanium (HPGe) spectra of uranium isotopic standards collected at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL) over a two-year measurement campaign. During the campaign, measurements were performed with the detectors unshielded, side shielded, and collimated. In addition, measurements of the uranium isotopic standards themselves were performed both unshielded and shielded.


SCEPTRE 2.3 Quick Start Guide

Drumm, Clifton R.; Bruss, Donald E.; Fan, Wesley C.; Pautz, Shawn D.

This report provides a summary of notes for building and running the Sandia Computational Engine for Particle Transport for Radiation Effects (SCEPTRE) code. SCEPTRE is a general-purpose C++ code for solving the linear Boltzmann transport equation in serial or parallel using unstructured spatial finite elements, multigroup energy treatment, and a variety of angular treatments including discrete ordinates (Sn) and spherical harmonics (Pn). Either the first-order form of the Boltzmann equation or one of the second-order forms may be solved. SCEPTRE requires a small number of open-source third-party libraries (TPLs) to be available, and example scripts for building these TPLs are provided. The TPLs needed by SCEPTRE are Trilinos, Boost, and NetCDF. SCEPTRE uses an autotools build system, and a sample configure script is provided. Running the SCEPTRE code requires that the user provide a spatial finite-element mesh in Exodus format and a cross section library in a format that will be described. SCEPTRE uses an XML-based input, and several examples will be provided.


Large-scale Nonlinear Approaches for Inference of Reporting Dynamics and Unobserved SARS-CoV-2 Infections

Hart, William E.; Bynum, Michael L.; Laird, Carl; Siirola, John D.; Staid, Andrea S.

This work focuses on estimation of unknown states and parameters in a discrete-time, stochastic SEIR model using reported case counts and mortality data. An SEIR model is based on classifying individuals with respect to their status in regards to the progression of the disease, where S is the number of individuals who remain susceptible to the disease, E is the number of individuals who have been exposed to the disease but are not yet infectious, I is the number of individuals who are currently infectious, and R is the number of recovered individuals. For convenience, we include in our notation the number of infections or transmissions, T, which represents the number of individuals transitioning from compartment S to compartment E over a particular interval. Similarly, we use C to represent the number of reported cases.
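In the deterministic limit, those compartment balances take a familiar discrete-time form; the sketch below uses generic latent and recovery rates σ and γ (notation assumed here) and omits the stochastic and reporting-delay elements of the actual model.

```latex
% Deterministic skeleton of the discrete-time SEIR balances. T_t is the number
% of new transmissions in interval t and beta_t the transmission parameter;
% sigma and gamma are generic latent and recovery rates (notation assumed).
T_t = \beta_t\,\frac{S_t I_t}{N}, \qquad
S_{t+1} = S_t - T_t, \qquad
E_{t+1} = E_t + T_t - \sigma E_t, \qquad
I_{t+1} = I_t + \sigma E_t - \gamma I_t, \qquad
R_{t+1} = R_t + \gamma I_t
```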

More Details

Incentivizing Adoption of Software Quality Practices

Raybourn, Elaine M.; Milewicz, Reed M.; Mundt, Miranda R.

Although many software teams across the laboratories comply with yearly software quality engineering (SQE) assessments, the practice of introducing quality into each phase of the software lifecycle, or the team processes, may vary substantially. Even with the support of a quality engineer, many teams struggle to adapt and right-size software engineering best practices in quality to fit their context, and these activities are not framed in a way that motivates teams to take action. In short, software quality is often a “check the box for compliance” activity instead of a cultural practice that both values software quality and knows how to achieve it. In this report, we present the results of our 6600 VISTA Innovation Tournament project, "Incentivizing and Motivating High Confidence and Research Software Teams to Adopt the Practice of Quality." We present our findings and roadmap for future work based on 1) a rapid review of relevant literature, 2) lessons learned from an internal design thinking workshop, and 3) an external Collegeville 2021 workshop. These activities provided an opportunity for team ideation and community engagement/feedback. Based on our findings, we believe a coordinated effort (e.g., a strategic communication campaign) aimed at diffusing the innovation of the practice of quality across Sandia National Laboratories could, over time, effect meaningful organizational change. As such, our roadmap addresses strategies for motivating and incentivizing individuals ranging from early career to seasoned software developers/scientists.

More Details

Processing Aleatory and Epistemic Uncertainties in Experimental Data From Sparse Replicate Tests of Stochastic Systems for Real-Space Model Validation

Journal of Verification, Validation and Uncertainty Quantification

Romero, Vicente J.; Black, Amalia R.

This paper presents a practical methodology for propagating and processing uncertainties associated with random measurement and estimation errors (that vary from test-to-test) and systematic measurement and estimation errors (uncertain but similar from test-to-test) in inputs and outputs of replicate tests to characterize response variability of stochastically varying test units. Also treated are test condition control variability from test-to-test and sampling uncertainty due to limited numbers of replicate tests. These aleatory variabilities and epistemic uncertainties result in uncertainty on computed statistics of output response quantities. The methodology was developed in the context of processing experimental data for “real-space” (RS) model validation comparisons against model-predicted statistics and uncertainty thereof. The methodology is flexible and sufficient for many types of experimental and data uncertainty, offering the most extensive data uncertainty quantification (UQ) treatment of any model validation method the authors are aware of. It handles both interval and probabilistic uncertainty descriptions and can be performed with relatively little computational cost through use of simple and effective dimension- and order-adaptive polynomial response surfaces in a Monte Carlo (MC) uncertainty propagation approach. A key feature of the progressively upgraded response surfaces is that they enable estimation of propagation error contributed by the surrogate model. Sensitivity analysis of the relative contributions of the various uncertainty sources to the total uncertainty of statistical estimates is also presented. The methodologies are demonstrated on real experimental validation data involving all the mentioned sources and types of error and uncertainty in five replicate tests of pressure vessels heated and pressurized to failure. Simple spreadsheet procedures are used for all processing operations.
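The basic propagation pattern, fitting a cheap polynomial response surface to a few expensive model runs and pushing Monte Carlo samples through it, can be sketched in a few lines. The toy model, fixed polynomial degree, and input distribution below are assumptions; the paper's dimension- and order-adaptive surrogate construction is considerably more sophisticated.

```python
# Sketch of Monte Carlo uncertainty propagation through a polynomial response
# surface. The "expensive" model, degree, and input distribution are assumed.
import numpy as np

def expensive_model(x):                       # stand-in for a costly simulation
    return np.sin(x) + 0.1 * x**2

x_train = np.linspace(-2, 2, 9)               # a handful of expensive runs
coeffs = np.polyfit(x_train, expensive_model(x_train), deg=4)

rng = np.random.default_rng(2)
x_mc = rng.normal(0.0, 0.5, size=100_000)     # uncertain input samples
y_mc = np.polyval(coeffs, x_mc)               # cheap surrogate evaluations

print(f"output mean = {y_mc.mean():.4f}, std = {y_mc.std():.4f}")
```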

More Details

NgramPPM: Compression Analytics without Compression

Bauer, Travis L.

Arithmetic Coding (AC) using Prediction by Partial Matching (PPM) is a compression algorithm that can also be used as a machine learning algorithm. This paper describes a new algorithm, NGram PPM. NGram PPM has all the predictive power of AC/PPM, but at a fraction of the computational cost. Unlike compression-based analytics, it is also amenable to a vector space interpretation, which enables integration with other traditional machine learning algorithms. AC/PPM is reviewed, including its application to machine learning. NGram PPM is then described, and test results are presented comparing it to AC/PPM.
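The link between compression and prediction rests on code length equaling negative log probability: a model that predicts a string well would compress it well, so per-character cross-entropy can serve directly as a match score. The sketch below applies that idea with a simple character n-gram model and add-one smoothing; it is a simplified stand-in for illustration, not the NGram PPM algorithm itself.

```python
# Compression-style scoring with a character n-gram model: a lower "code
# length" (bits per character) means a better match to the training text.
# Simplified stand-in, not the NGram PPM algorithm of the report.
import math
from collections import defaultdict

def train(text, n=3):
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for i in range(len(text) - n):
        ctx, ch = text[i:i + n], text[i + n]
        counts[ctx][ch] += 1
        totals[ctx] += 1
    return counts, totals, n

def bits_per_char(text, model):
    counts, totals, n = model
    bits = 0.0
    for i in range(len(text) - n):
        ctx, ch = text[i:i + n], text[i + n]
        # add-one smoothing over a byte alphabet stands in for PPM escapes
        p = (counts[ctx][ch] + 1) / (totals[ctx] + 256)
        bits += -math.log2(p)
    return bits / max(1, len(text) - n)

model = train("the quick brown fox jumps over the lazy dog " * 50)
print(bits_per_char("the quick brown fox", model))  # low: matches the style
print(bits_per_char("zzzzqqqqxxxx", model))         # high: poor match
```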

More Details

Critical Infrastructure Decision-Making under Long-Term Climate Hazard Uncertainty: The Need for an Integrated, Multidisciplinary Approach

Staid, Andrea S.; Fleming Lindsley, Elizabeth S.; Gunda, Thushara G.; Jackson, Nicole D.

U.S. critical infrastructure assets are often designed to operate for decades, and yet long-term planning practices have historically ignored climate change. With the current pace of changing operational conditions and severe weather hazards, research is needed to improve our ability to translate complex, uncertain risk assessment data into actionable inputs that improve decision-making for infrastructure planning. Decisions made today need to explicitly account for climate change – the chronic stressors, the evolution of severe weather events, and the wide-ranging uncertainties. If done well, decision making with climate in mind will result in increased resilience and decreased impacts to our lives, economies, and national security. We present a three-tier approach to create the research products needed in this space: bringing together climate projection data, severe weather event modeling, asset-level impacts, and context-specific decision constraints and requirements. At each step, it is crucial to capture uncertainties and to communicate those uncertainties to decision-makers. While many components of the necessary research are mature (e.g., climate projection data), there has been little effort to develop proven tools for long-term planning in this space. The combination of chronic and acute stressors, spatial and temporal uncertainties, and interdependencies among infrastructure sectors coalesces into a complex decision space. By applying known methods from decision science and data analysis, we can work to demonstrate the value of an interdisciplinary approach to climate-hazard decision making for long-term infrastructure planning.

More Details

Integrated System and Application Continuous Performance Monitoring and Analysis Capability

Aaziz, Omar R.; Allan, Benjamin A.; Brandt, James M.; Cook, Jeanine C.; Devine, Karen D.; Elliott, James E.; Gentile, Ann C.; Hammond, Simon D.; Kelley, Brian M.; Lopatina, Lena; Moore, Stan G.; Olivier, Stephen L.; Laros, James H.; Poliakoff, David Z.; Pawlowski, Roger P.; Regier, Phillip A.; Schmitz, Mark E.; Schwaller, Benjamin S.; Surjadidjaja, Vanessa S.; Swan, Matthew S.; Tucker, Nick; Tucker, Thomas; Vaughan, Courtenay T.; Walton, Sara P.

Scientific applications run on high-performance computing (HPC) systems are critical for many national security missions within Sandia and the NNSA complex. However, these applications often face performance degradation and even failures that are challenging to diagnose. To provide unprecedented insight into these issues, the HPC Development, HPC Systems, Computational Science, and Plasma Theory & Simulation departments at Sandia crafted and completed their FY21 ASC Level 2 milestone entitled "Integrated System and Application Continuous Performance Monitoring and Analysis Capability." The milestone created a novel integrated HPC system and application monitoring and analysis capability by extending Sandia's Kokkos application portability framework, Lightweight Distributed Metric Service (LDMS) monitoring tool, and scalable storage, analysis, and visualization pipeline. The extensions to Kokkos and LDMS enable collection and storage of application data during run time, as it is generated, with negligible overhead. This data is combined with HPC system data within the extended analysis pipeline to present relevant visualizations of derived system and application metrics that can be viewed at run time or post-run. This new capability was evaluated using several week-long, 290-node runs of Sandia's ElectroMagnetic Plasma In Realistic Environments (EMPIRE) modeling and design tool and resulted in 1 TB of application data and 50 TB of system data. EMPIRE developers remarked that this capability was incredibly helpful for quickly assessing application health and performance alongside system state. In short, this milestone work built the foundation for an expansive HPC system and application data collection, storage, analysis, visualization, and feedback framework that will increase the total scientific output of Sandia's HPC users.

More Details

Predictive Data-driven Platform for Subsurface Energy Production

Yoon, Hongkyu Y.; Verzi, Stephen J.; Cauthen, Katherine R.; Musuvathy, Srideep M.; Melander, Darryl J.; Norland, Kyle; Morales, Adriana M.; Lee, Jonghyun; Sun, Alexander

Subsurface energy activities such as unconventional resource recovery, enhanced geothermal energy systems, and geologic carbon storage require fast and reliable methods to account for complex, multiphysical processes in heterogeneous fractured and porous media. Although reservoir simulation is considered the industry standard for simulating these subsurface systems with injection and/or extraction operations, it requires ingesting spatio-temporal “Big Data” into the simulation model, which is typically a major challenge during the model development and computational phases. In this work, we developed and applied various deep neural network-based approaches to (1) process multiscale image segmentation, (2) generate ensemble members of drainage networks, flow channels, and porous media using a deep convolutional generative adversarial network, (3) construct multiple hybrid neural networks, such as convolutional LSTM and convolutional neural network-LSTM, to develop fast and accurate reduced order models for shale gas extraction, and (4) apply physics-informed neural networks and deep Q-learning for flow and energy production. We hypothesized that physics-based machine learning/deep learning can overcome the shortcomings of traditional machine learning methods, where data-driven models have faltered beyond the data and physical conditions used for training and validation. We improved and developed novel approaches to demonstrate that physics-based ML allows us to incorporate physical constraints (e.g., scientific domain knowledge) into the ML framework. Outcomes of this project will be readily applicable to many energy and national security problems that are particularly defined by multiscale features and network systems.

More Details

Named Data Networking for DER Cybersecurity

Chavez, Adrian R.; Cordeiro, Patricia G.; Huang, Gary H.; Kitsos, Panayioti C.; La Pay, Trevor L.; Short, Austin S.; Summers, Adam

We present our research findings on the novel Named Data Networking (NDN) protocol. In this work, we defined key attack scenarios for possible exploitation and detail software security testing procedures used to evaluate the security of the NDN software. This work was done in the context of distributed energy resources (DER). The software security testing included execution of unit tests and static code analyses to better understand the software rigor and the security that has been implemented. The results from the penetration testing are presented. Recommendations are discussed to provide additional defense for secure end-to-end NDN communications.

More Details

Sphynx: A parallel multi-GPU graph partitioner for distributed-memory systems

Parallel Computing

Acer, Seher A.; Boman, Erik G.; Glusa, Christian A.; Rajamanickam, Sivasankaran R.

Graph partitioning has been an important tool for partitioning work among several processors to minimize communication cost and balance the workload. As accelerator-based supercomputers emerge as the standard, the use of graph partitioning becomes even more important because applications are rapidly moving to these architectures. However, there is no distributed-memory-parallel, multi-GPU graph partitioner available for applications. We developed a spectral graph partitioner, Sphynx, using the portable, accelerator-friendly stack of the Trilinos framework. In Sphynx, we allow using different preconditioners and exploit their unique advantages. We use Sphynx to systematically evaluate the various algorithmic choices in spectral partitioning with a focus on GPU performance. We perform those evaluations on two distinct classes of graphs: regular (such as meshes and matrices from finite element methods) and irregular (such as social networks and web graphs), and show that different settings and preconditioners are needed for these graph classes. The experimental results on the Summit supercomputer show that Sphynx is the fastest alternative on irregular graphs in an application-friendly setting and obtains partitioning quality close to ParMETIS on regular graphs. When compared to nvGRAPH on a single GPU, Sphynx is faster and obtains better balance and better quality partitions. Sphynx provides a good and robust partitioning method across a wide range of graphs for applications looking for a GPU-based partitioner.
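The spectral idea at the core of such a partitioner can be sketched compactly: form the graph Laplacian, take the eigenvector for the second-smallest eigenvalue (the Fiedler vector), and split vertices by its sign. The toy path graph and dense eigensolver below are assumptions for illustration; Sphynx itself relies on preconditioned iterative eigensolvers running on GPUs through Trilinos.

```python
# Spectral bisection sketch: partition a graph by the sign of its Fiedler
# vector. Toy path graph and dense eigensolver are illustrative assumptions.
import numpy as np

n = 10
A = np.zeros((n, n))
for i in range(n - 1):                 # path graph: 0-1-2-...-9
    A[i, i + 1] = A[i + 1, i] = 1.0

L = np.diag(A.sum(axis=1)) - A         # graph Laplacian
vals, vecs = np.linalg.eigh(L)         # eigenvalues in ascending order
fiedler = vecs[:, 1]                   # second-smallest eigenpair

part = (fiedler > 0).astype(int)       # sign split -> two-way partition
print("partition:", part)              # expect one half of the path per part
```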

More Details

Development and Use of an Ultra-High Resolution Electron Scattering Apparatus

Frank, Jonathan H.; Laros, James H.; Jana, Irina J.; Huang, Erxiong H.; Chandler, D.W.

In this LDRD project, we developed a versatile capability for high-resolution measurements of electron scattering processes in gas-phase molecules, such as ionization, dissociation, and electron attachment/detachment. This apparatus is designed to advance fundamental understanding of these processes and to inform predictions of plasmas associated with applications such as plasma-assisted combustion, neutron generation, re-entry vehicles, and arcing that are critical to national security. We use innovative coupling of electron-generation and electron-imaging techniques that leverages Sandia’s expertise in ion/electron imaging methods. Velocity map imaging provides a measure of the kinetic energies of electrons or ion products from electron scattering in an atomic or molecular beam. We designed, constructed, and tested the apparatus. Tests include dissociative electron attachment to O2 and SO2, as well as a new method for studying laser-initiated plasmas. This capability sets the stage for new studies in dynamics of electron scattering processes, including scattering from excited-state atoms and molecules.

More Details

Adaptation of the NWM Cloud Environment for an ISF Project

Meacham, Janette E.; Meacham, Paul G.; Huber, Cynthia M.; Grong, Erica L.

The DOE-NE NWM Cloud was designed to be a generic set of tools and applications for any nuclear waste management program. As policymakers continue to consider approaches that emphasize consolidated interim storage and transportation of spent nuclear fuel, a gap analysis comparing the tools and applications provided for spent nuclear fuel and high-level radioactive waste disposal to those needed for siting, licensing, and developing a consolidated interim storage facility and/or for a transportation campaign will help prepare DOE to implement such potential policy direction. This report evaluates the points of alignment and potential gaps between the applications on the NWM Cloud that supported the SNF disposal project and the applications needed to address QA requirements and other project support needs of an SNF storage project.

More Details

Physiological Characterization of Language Comprehension

Matzen, Laura E.; Stites, Mallory C.; Ting, Christina T.; Howell, Breannan C.; Wisniewski, Kyra L.

In this project, our goal was to develop methods that would allow us to make accurate predictions about individual differences in human cognition. Understanding such differences is important for maximizing human and human-system performance. There is a large body of research on individual differences in the academic literature. Unfortunately, it is often difficult to connect this literature to applied problems, where we must predict how specific people will perform or process information. In an effort to bridge this gap, we set out to answer the question: can we train a model to make predictions about which people understand which languages? We chose language processing as our domain of interest because of the well-characterized differences in neural processing that occur when people are presented with linguistic stimuli that they do or do not understand. Although our original plan to conduct several electroencephalography (EEG) studies was disrupted by the COVID-19 pandemic, we were able to collect data from one EEG study and a series of behavioral experiments in which data were collected online. The results of this project indicate that machine learning tools can make reasonably accurate predictions about an individual's proficiency in different languages, using EEG data or behavioral data alone.

More Details

Science & Engineering of Cyber Security by Uncertainty Quantification and Rigorous Experimentation (SECURE) HANDBOOK

Pinar, Ali P.; Tarman, Thomas D.; Swiler, Laura P.; Gearhart, Jared L.; Hart, Derek H.; Vugrin, Eric D.; Cruz, Gerardo C.; Arguello, Bryan A.; Geraci, Gianluca G.; Debusschere, Bert D.; Hanson, Seth T.; Outkin, Alexander V.; Thorpe, Jamie T.; Hart, William E.; Sahakian, Meghan A.; Gabert, Kasimir G.; Glatter, Casey J.; Johnson, Emma S.; Punla-Green, She'ifa S.

Abstract not provided.

Emergent Recursive Multiscale Interaction in Complex Systems

Naugle, Asmeret B.; Doyle, Casey L.; Sweitzer, Matthew; Rothganger, Fredrick R.; Verzi, Stephen J.; Lakkaraju, Kiran L.; Kittinger, Robert; Bernard, Michael L.; Chen, Yuguo; Loyal, Joshua; Mueen, Abdullah

This project studied the potential for multiscale group dynamics in complex social systems, including emergent recursive interaction. Current social theory on group formation and interaction focuses on a single scale (individuals forming groups) and is largely qualitative in its explanation of mechanisms. We combined theory, modeling, and data analysis to find evidence that these multiscale phenomena exist, and to investigate their potential consequences and develop predictive capabilities. In this report, we discuss the results of data analysis showing that some group dynamics theory holds at multiple scales. We introduce a new theory on communicative vibration that uses social network dynamics to predict group life cycle events. We discuss a model of behavioral responses to the COVID-19 pandemic that incorporates influence and social pressures. Finally, we discuss a set of modeling techniques that can be used to simulate multiscale group phenomena.

More Details

Platform for Single-Cell Dual RNA Sequencing of Host-Pathogen Interactions

Harouaka, Ramdane H.

The aim of this project was to advance single-cell RNA-Seq methods toward the establishment of a platform that may be used to simultaneously interrogate the gene expression profiles of mammalian host cells and bacterial pathogens. Existing genetic sequencing methods that measure bulk groups of cells do not account for the heterogeneity of cell-microbe interactions that occur within a complex environment, have limited efficiency, and cannot simultaneously interrogate bacterial sequences. In order to overcome these challenges, separate biochemistry workflows were developed based on a Not-So-Random hexamer priming strategy or libraries of targeted molecular probes. Computational tools were developed to facilitate these methods, and feasibility was demonstrated for single-cell RNA-Seq for both bacterial and mammalian transcriptomes. This work supports cross-agency national priorities on addressing the threat of biological pathogens and understanding the role of the microbiome in modulating immunity and susceptibility to infection.

More Details

Propagation of a Stress Pulse in a Heterogeneous Elastic Bar

Journal of Peridynamics and Nonlocal Modeling

Silling, Stewart A.

The propagation of a wave pulse due to low-speed impact on a one-dimensional, heterogeneous bar is studied. Due to the dispersive character of the medium, the pulse attenuates as it propagates. This attenuation is studied over propagation distances that are much longer than the size of the microstructure. A homogenized peridynamic material model can be calibrated to reproduce the attenuation and spreading of the wave. The calibration consists of matching the dispersion curve for the heterogeneous material near the limit of long wavelengths. It is demonstrated that the peridynamic method reproduces the attenuation of wave pulses predicted by an exact microstructural model over large propagation distances.
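For context, in a linearized bond-based peridynamic bar (with micromodulus C(ξ), horizon δ, and density ρ; this notation is assumed here rather than taken from the paper), substituting a plane wave into the equation of motion yields the dispersion relation below; a calibration of the kind described matches ω(k) against the heterogeneous medium at small k, where the local wave speed c₀ is recovered.

```latex
% Dispersion relation of a linearized bond-based peridynamic bar (assumed
% notation: micromodulus C(\xi), horizon \delta, density \rho).
\rho\,\omega^{2}(k) = \int_{-\delta}^{\delta} C(\xi)\,\bigl(1-\cos(k\xi)\bigr)\,d\xi,
\qquad
c_{0}^{2} = \lim_{k\to 0}\frac{\omega^{2}(k)}{k^{2}}
          = \frac{1}{2\rho}\int_{-\delta}^{\delta} C(\xi)\,\xi^{2}\,d\xi .
```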

More Details

Update on the Simulation of Commercial Drying of Spent Nuclear Fuel

Durbin, S.G.; Lindgren, Eric R.; Pulido, Ramon P.; Laros, James H.; Fasano, Raymond E.

The purpose of this report is to document improvements in the simulation of commercial vacuum drying procedures at the Nuclear Energy Work Complex at Sandia National Laboratories. Validation of the extent of water removal in a dry spent nuclear fuel storage system based on drying procedures used at nuclear power plants is needed to close existing technical gaps. Operational conditions leading to incomplete drying may have potential impacts on the fuel, cladding, and other components in the system. A general lack of data suitable for model validation of commercial nuclear canister drying processes necessitates additional, well-designed investigations of drying process efficacy and water retention. Scaled tests that incorporate relevant physics and well-controlled boundary conditions are essential to provide insight and guidance to the simulation of prototypic systems undergoing drying processes.

More Details

ASCEND: Asymptotically compatible strong form foundations for nonlocal discretization

Trask, Nathaniel A.; D'Elia, Marta D.; Littlewood, David J.; Silling, Stewart A.; Trageser, Jeremy T.; Tupek, Michael R.

Nonlocal models naturally handle a range of physics of interest to SNL, but discretization of their underlying integral operators poses mathematical challenges to realize the accuracy and robustness commonplace in discretization of local counterparts. This project focuses on the concept of asymptotic compatibility, namely preservation of the limit of the discrete nonlocal model to a corresponding well-understood local solution. We address challenges, primarily related to consistency guarantees and boundary conditions, that have traditionally troubled nonlocal mechanics models. For simple problems such as diffusion and linear elasticity, we have developed complete error analysis theory providing consistency guarantees. We then take these foundational tools to develop new state-of-the-art capabilities for lithiation-induced failure in batteries, ductile failure of problems driven by contact, blast-on-structure induced failure, and brittle/ductile failure of thin structures. We also summarize ongoing efforts using these frameworks in data-driven modeling contexts. This report provides a high-level summary of all publications that followed from these efforts.

More Details

Data driven learning of robust nonlocal models

D'Elia, Marta D.; Silling, Stewart A.; You, Huaiqian; Yu, Yue

Nonlocal models use integral operators that embed length-scales in their definition. However, the integrands in these operators are difficult to define from the data that are typically available for a given physical system, such as laboratory mechanical property tests. In contrast, molecular dynamics (MD) does not require these integrands, but it suffers from computational limitations in the length and time scales it can address. To combine the strengths of both methods and to obtain a coarse-grained, homogenized continuum model that efficiently and accurately captures materials' behavior, we propose a learning framework to extract, from MD data, an optimal nonlocal model as a surrogate for MD displacements. Our framework guarantees that the resulting model is mathematically well-posed, physically consistent, and that it generalizes well to settings that are different from the ones used during training. The efficacy of this approach is demonstrated with several numerical tests for single layer graphene both in the case of perfect crystal and in the presence of thermal noise.

More Details

Multiscale assessment of caprock integrity for geologic carbon storage in the Pennsylvanian Farnsworth Unit, Texas, USA

Energies

Trujillo, Natasha; Rose-Coss, Dylan; Heath, Jason; Dewers, Thomas D.; Ampomah, William; Mozley, Peter S.; Cather, Martha

Leakage pathways through caprock lithologies for underground storage of CO2 and/or enhanced oil recovery (EOR) include intrusion into nano-pore mudstones, flow within fractures and faults, and larger-scale sedimentary heterogeneity (e.g., stacked channel deposits). To assess the multiscale sealing integrity of the caprock system that overlies the Morrow B sandstone reservoir, Farnsworth Unit (FWU), Texas, USA, we combine pore-to-core observations, laboratory testing, well logging results, and noble gas analysis. A cluster analysis of gamma ray, compressional slowness, and other logs was combined with caliper responses and triaxial rock mechanics testing to define eleven lithologic classes across the upper Morrow shale and Thirteen Finger limestone caprock units, with estimations of dynamic elastic moduli and fracture breakdown pressures (minimum horizontal stress gradients) for each class. Mercury porosimetry determinations of CO2 column heights in sealing formations yield values exceeding reservoir height. Noble gas profiles provide a “geologic time-integrated” assessment of fluid flow across the reservoir-caprock system, with Morrow B reservoir measurements consistent with decades-long EOR water-flooding, and upper Morrow shale and lower Thirteen Finger limestone values consistent with long-term geohydrologic isolation. Together, these data suggest an excellent sealing capacity for the FWU and provide limits for injection pressure increases accompanying carbon storage activities.

More Details

Impact of Integration Scheme on Performance of Anisotropic Plasticity Models

Lester, Brian T.; Scherzinger, William M.

Given the prevalent role of metals in a variety of industries, schemes to integrate the corresponding constitutive models in finite element applications have long been studied. A number of formulations have been developed to accomplish this task, each with its own advantages and costs. Often the focus has been on ensuring the accuracy and numerical stability of these algorithms to enable robust integration. While important, emphasis on these performance metrics may come at the cost of computational expense, potentially neglecting the needs of individual problems. In the current work, the performance of two of the most common integration methods for anisotropic plasticity -- the convex cutting plane (CCP) and closest point projection (CPP) -- is assessed across a variety of metrics, including accuracy and cost. A variety of problems are considered, ranging from single elements to large representative simulations, including both implicit quasistatic and explicit transient dynamic responses. The relative performance of each scheme in the different instances is presented with an eye towards guidance on when the different algorithms may be beneficial.
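For readers unfamiliar with the CPP family, its best-known special case is the radial return map for von Mises plasticity, where the closest-point projection has a closed form. The sketch below shows that simplified isotropic case with linear hardening; it illustrates the projection idea only, not the anisotropic models assessed in this work, and all material values are assumptions.

```python
# Closest point projection (radial return) sketch for von Mises plasticity
# with linear isotropic hardening -- an isotropic illustration of the CPP
# idea, not the anisotropic models of the report. Values are assumptions.
import numpy as np

G, H, sig_y = 80e3, 1e3, 250.0   # shear modulus, hardening modulus, yield (MPa)

def radial_return(s_trial, eps_p):
    """Map a trial deviatoric stress back to the yield surface."""
    seq = np.sqrt(1.5 * np.tensordot(s_trial, s_trial))  # von Mises stress
    f = seq - (sig_y + H * eps_p)                        # yield function
    if f <= 0.0:
        return s_trial, eps_p                            # elastic step
    dgamma = f / (3.0 * G + H)                           # plastic multiplier
    return s_trial * (1.0 - 3.0 * G * dgamma / seq), eps_p + dgamma

s_trial = np.diag([200.0, -100.0, -100.0])    # deviatoric trial stress
s, ep = radial_return(s_trial, 0.0)
print(np.sqrt(1.5 * np.tensordot(s, s)), ep)  # stress returned to the surface
```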

More Details

Sierra/SolidMechanics 5.2 Capabilities in Development

Bergel, Guy L.; Beckwith, Frank B.; Belcourt, Kenneth N.; de Frias, Gabriel J.; Manktelow, Kevin M.; Merewether, Mark T.; Miller, Scott T.; Mosby, Matthew D.; Plews, Julia A.; Shelton, Timothy S.; Thomas, Jesse T.; Treweek, Benjamin T.; Tupek, Michael R.; Veilleux, Michael V.; Wagman, Ellen B.

This user’s guide documents capabilities in Sierra/SolidMechanics which remain “in development” and thus are not tested and hardened to the standards of capabilities listed in the Sierra/SM 5.2 User’s Guide. Capabilities documented herein are available in Sierra/SM for experimental use only until their official release. These capabilities include, but are not limited to, novel discretization approaches such as the conforming reproducing kernel (CRK) method, numerical fracture and failure modeling aids such as the extended finite element method (XFEM) and the J-integral, explicit time step control techniques, dynamic mesh rebalancing, and a variety of new material models and finite element formulations.

More Details

A Fast-Cycle Charge Noise Measurement for Better Qubits

Lewis, Rupert; Kindel, William K.; Harris, Charles T.; Skinner Ramos, Sueli D.

Defects in materials are an ongoing challenge for quantum bits, so-called qubits. Solid state qubits—both spins in semiconductors and superconducting qubits—suffer from losses and noise caused by two-level-system (TLS) defects thought to reside on surfaces and in amorphous materials. Understanding and reducing the number of such defects is an ongoing challenge for the field. Superconducting resonators couple to TLS defects and provide a handle that can be used to better understand TLS. We develop noise measurements of superconducting resonators at temperatures (20 mK) very low compared to the resonant frequency and at low powers, down to single-photon occupation.

More Details

Assessing and mapping extreme wave height along the Gulf of Mexico coast

Ahn, Seongho; Neary, Vincent S.; Chartrand, Chris; Pluemer, Sean

The effects of extreme waves on coastal communities include inundation, loss of habitats, increased shoreline erosion, and increased risks to coastal infrastructure (e.g., ports, breakwaters, oil and gas platforms) that is important for supporting coastal resilience. The coastal communities along the US Gulf of Mexico are very low-lying, which makes the region particularly vulnerable to impacts of extreme waves generated by storm events. We propose assessing and mapping the risks from extreme waves for the Gulf of Mexico coast to support coastal resiliency planning. The risks will be assessed by computing n-year recurring wave heights (e.g., 1-, 5-, 50-, and 100-year) using 32-year wave hindcast data and various extreme value analysis techniques, including the Peak-Over-Threshold and Annual Maxima methods. The characteristics of the extreme waves (e.g., relations between the mean and extreme wave climates, and directions associated with extreme waves) will be investigated. Hazard maps of extreme wave heights at different return periods will be generated to help planners identify potential risks and envision places that are less susceptible to future storm damage.
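As a concrete illustration of the annual-maxima route to n-year return levels, the sketch below fits a generalized extreme value (GEV) distribution to 32 yearly maxima and reads off return levels as upper quantiles; the synthetic wave heights stand in for the hindcast data and are assumptions.

```python
# Sketch of n-year return levels from annual maxima via a GEV fit. Synthetic
# maxima stand in for the 32-year hindcast; data values are assumptions.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)
annual_max_hs = 4.0 + rng.gumbel(1.0, 0.8, size=32)  # fake yearly Hs maxima (m)

shape, loc, scale = genextreme.fit(annual_max_hs)
for T in (5, 50, 100):
    # T-year return level = (1 - 1/T) quantile of the annual-max distribution
    level = genextreme.isf(1.0 / T, shape, loc, scale)
    print(f"{T:>3}-year Hs: {level:.2f} m")
```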

More Details

Spatially and Temporally Resolved Velocimetry for Hypersonic Flows

Zhang, Yibin Z.; Richardson, Daniel R.; Marshall, Garrett J.; Beresh, Steven J.; Casper, Katya M.

The development of new hypersonic flight vehicles is limited by the physical understanding that may be obtained from ground test facilities. This has motivated the present development of a temporally and spatially resolved velocimetry measurement for Sandia National Laboratories (SNL) Hypersonic Wind Tunnel (HWT) using Femtosecond Laser Electronic Excitation Tagging (FLEET). First, a multi-line FLEET technique has been created for the first time and tested in a supersonic jet, allowing simultaneous measurements of velocities along multiple profiles in a flow. Secondly, two different approaches have been demonstrated for generating dotted FLEET lines. One employs a slit mask pattern focused into points to yield a dotted line, allowing for two- or three-component velocity measurements free of contamination between components. The other dotted-line approach is based upon an optical wedge array and yields a grid of points rather than a dotted line. Two successful FLEET measurement campaigns have been conducted in SNL’s HWT. The first effort established optimal diagnostic configurations in the hypersonic environment based on earlier benchtop reproductions, including validation of the use of a 267 nm beam to boost the measurement signal-to-noise ratio (SNR) with minimal risk of perturbing the flow and greater simplicity than a comparable resonant technique at 202 nm. The same FLEET system subsequently was reconstituted to demonstrate the ability to make velocimetry measurements of hypersonic turbulence in a realistic flow field. Mean velocity profiles and turbulence intensity profiles of the shear layer in the wake of a hypersonic cone model were measured at several different downstream stations, proving the viability of FLEET as a hypersonic diagnostic.

More Details

CIS Project 22359, Final Technical Report. Discretized Posterior Approximation in High Dimensions

Duersch, Jed A.; Catanach, Thomas A.

Our primary aim in this work is to understand how to efficiently obtain reliable uncertainty quantification in automatic learning algorithms with limited training datasets. Standard approaches rely on cross-validation to tune hyperparameters. Unfortunately, when our datasets are too small, holdout datasets become unreliable—albeit unbiased—measures of prediction quality due to the lack of adequate sample size. We should not place confidence in holdout estimators under conditions wherein the sample variance is both large and unknown. More poignantly, our training experiments on limited data (Duersch and Catanach, 2021) show that even if we could improve estimator quality under these conditions, the typical training trajectory may never even encounter generalizable models.

More Details

Gamma Irradiation Facility

Dodge, Haley D.

Gamma irradiation is a process that uses the Cobalt-60 radionuclide, produced artificially in nuclear reactors, to irradiate a variety of items with gamma radiation. Key characteristics of gamma irradiation are its high penetration capability and the fact that it can modify physical, chemical, and biological properties of the irradiated materials.

More Details

High-resolution magnetic microscopy applications using nitrogen-vacancy centers in diamond

Kehayias, Pauli M.

Magnetic microscopy with high spatial resolution helps to solve a variety of technical problems in condensed-matter physics, electrical engineering, biomagnetism, and geomagnetism. In this work we used quantum diamond magnetic microscope (QDMM) setups, which use a dense uniform layer of magnetically-sensitive nitrogen-vacancy (NV) centers in diamond to image an external magnetic field using a fluorescence microscope. We used this technique for imaging few-micron ferromagnetic needles used as a physically unclonable function (PUF) and to passively interrogate electric current paths in a commercial 555 timer integrated circuit (IC). As part of the QDMM development, we also found a way to calculate ion implantation recipes to create diamond samples with dense uniform NV layers at the surface. This work opens the possibility for follow-up experiments with 2D magnetic materials, ion implantation, and electronics characterization and troubleshooting.

More Details

Noise and error analysis and optimization in particle-based kinetic plasma simulations

Journal of Computational Physics

Evstatiev, E.G.; Finn, J.M.; Shadwick, B.A.; Hengartner, N.

In this paper we analyze the noise in macro-particle methods used in plasma physics and fluid dynamics, leading to approaches for minimizing the total error, focusing on electrostatic models in one dimension. We begin by describing kernel density estimation for continuous values of the spatial variable x, expressing the kernel in a form in which its shape and width are represented separately. The covariance matrix C(x,y) of the noise in the density is computed, first for uniform true density. The bandwidth of the covariance matrix is related to the width of the kernel. A feature that stands out is the presence of constant negative terms in the elements of the covariance matrix both on and off-diagonal. These negative correlations are related to the fact that the total number of particles is fixed at each time step; they also lead to the property ∫C(x,y)dy=0. We investigate the effect of these negative correlations on the electric field computed by Gauss's law, finding that the noise in the electric field is related to a process called the Ornstein-Uhlenbeck bridge, leading to a covariance matrix of the electric field with variance significantly reduced relative to that of a Brownian process. For non-constant density, ρ(x), still with continuous x, we analyze the total error in the density estimation and discuss it in terms of bias-variance optimization (BVO). For some characteristic length l, determined by the density and its second derivative, and kernel width h, having too few particles within h leads to too much variance; for h that is large relative to l, there is too much smoothing of the density. The optimum between these two limits is found by BVO. For kernels of the same width, it is shown that this optimum (minimum) is weakly sensitive to the kernel shape. We repeat the analysis for x discretized on a grid. In this case the charge deposition rule is determined by a particle shape. An important property to be respected in the discrete system is the exact preservation of total charge on the grid; this property is necessary to ensure that the electric field is equal at both ends, consistent with periodic boundary conditions. We find that if the particle shapes satisfy a partition of unity property, the particle charge deposited on the grid is conserved exactly. Further, if the particle shape is expressed as the convolution of a kernel with another kernel that satisfies the partition of unity, then the particle shape obeys the partition of unity. This property holds for kernels of arbitrary width, including widths that are not integer multiples of the grid spacing. We show results relaxing the approximations used to do BVO optimization analytically, by doing numerical computations of the total error as a function of the kernel width, on a grid in x. The comparison between numerical and analytical results shows good agreement over a range of particle shapes. We discuss the practical implications of our results, including the criteria for design and implementation of computationally efficient particle shapes that take advantage of the developed theory.
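The partition-of-unity property highlighted above is easy to verify numerically for a concrete shape: for the standard linear (cloud-in-cell) particle shape, the weights deposited on the grid from any particle position sum to exactly one, so total charge is conserved. The short check below is illustrative and uses assumed grid parameters.

```python
# Numerical check of the partition-of-unity property for a particle shape on
# a grid: deposited weights sum to 1 for any particle position, so charge is
# conserved. Uses the standard linear (cloud-in-cell) hat-function shape.
import numpy as np

def linear_shape(u):
    """Linear particle shape (hat function), two grid cells wide."""
    return np.maximum(0.0, 1.0 - np.abs(u))

dx = 1.0
grid = np.arange(-5, 6) * dx
rng = np.random.default_rng(4)

for x_p in rng.uniform(-1.0, 1.0, size=5):     # random particle positions
    weights = linear_shape((x_p - grid) / dx)  # deposit onto nearby nodes
    print(f"x = {x_p:+.3f}   sum of weights = {weights.sum():.15f}")  # 1.0
```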

More Details

Seismic Source Modeling Software Enhancements (FY21)

Preston, Leiph A.; Poppeliers, Christian P.; Eliassi, Mehdi E.

Seismic source modeling allows researchers both to simulate how a source that induces seismic waves interacts with the Earth to produce observed seismograms and, inversely, to infer the time histories, sizes, and force distributions of a seismic source given observed seismograms. In this report, we discuss improvements made in FY21 to our software as it applies to both the forward and inverse seismic source modeling problems. For the forward portion of the problem, we have added the ability to use full 3-D nonlinear simulations by implementing 3-D time-varying boundary conditions within Sandia’s linear seismic code Parelasti. Second, on the inverse source modeling side, we have developed software that allows us to invert seismic gradiometer-derived observations in conjunction with standard translational-motion seismic data to infer properties of the source that may improve characterization in certain circumstances. We first describe the basic theory behind each software enhancement and then demonstrate the software in action with some simple examples.

More Details

Code Development Supporting a Non-Thermal Source of High Fluence Warm X-Ray

Bennett, Nichelle L.; Welch, Dale R.

A six-month research effort has advanced the hybrid kinetic-fluid modeling capability required for developing non-thermal warm x-ray sources on Z. The three particle treatments of quasi-neutral, multi-fluid, and kinetic are demonstrated in 1D simulations of an Ar gas puff. The simulations determine the required resolutions for the advanced implicit solution techniques and debug hybrid particle treatments with equation-of-state and radiation transport. The kinetic treatment is used in a preliminary analysis of the non-Maxwellian nature of a gas target. It also demonstrates the sensitivity of the transition from thermal to non-thermal particle populations to the cyclotron and collision frequencies. Finally, a 2D Ar gas puff simulation of a Z shot demonstrates the readiness to proceed with realistic target configurations. The results put us on a very firm footing to proceed to a full LDRD, which includes continued development of transition criteria and x-ray yield calculations.

More Details

A simple levelset contact algorithm for large overlap removal and robust preloads

Mosby, Matthew D.; Tupek, Michael R.; Vo, Johnathan V.

A simple approach to simulate contact between deformable objects is presented which relies on levelset descriptions of the Lagrangian geometry and an optimization-based solver. Modeling contact between objects remains a significant challenge for computational mechanics simulations. Common approaches are either plagued by lack of robustness or are exceedingly complex and require a significant number of heuristics. In contrast, the levelset contact approach presented herein is essentially heuristic free. Furthermore, the presented algorithm enables resolving and enforcing contact between objects with a significant amount of initial overlap. Examples demonstrating the feasibility of this approach are shown, including the standard Hertz contact problem, the robust removal of overlap between two overlapping blocks, and overlap-removal and pre-load for a bolted configuration.
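The geometric core of the approach, detecting interpenetration directly from signed distance fields, can be sketched with two spheres: overlap is wherever both levelsets are negative. This is an illustration of the levelset idea only, with assumed geometry and resolution, not the optimization-based solver of the paper.

```python
# Levelset overlap sketch: each body is a signed distance field (negative
# inside); interpenetration is where both fields are negative. Geometry and
# grid resolution are illustrative assumptions.
import numpy as np

def sphere_sdf(pts, center, radius):
    """Signed distance to a sphere: negative inside, positive outside."""
    return np.linalg.norm(pts - center, axis=-1) - radius

x = np.linspace(-2.0, 2.0, 81)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
pts = np.stack([X, Y, Z], axis=-1)

phi_a = sphere_sdf(pts, np.array([-0.4, 0.0, 0.0]), 1.0)
phi_b = sphere_sdf(pts, np.array([+0.4, 0.0, 0.0]), 1.0)

overlap = (phi_a < 0) & (phi_b < 0)          # interpenetration region
depth = -np.maximum(phi_a, phi_b)[overlap]   # local penetration measure
print(f"overlap fraction: {overlap.mean():.4f}, max depth: {depth.max():.3f}")
```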

More Details

sCO2 Brayton Energy Conversion Customer Discovery

Mendez Cruz, Carmen M.; Wilson, Mollye C.

All energy production systems need efficient energy conversion systems. Current Rankine cycles use water to generate steam at temperatures where efficiency is limited to around 40%. As existing fossil and nuclear power plants are decommissioned due to end of effective life and/or society’s desire for cleaner generation options, more efficient energy conversion is needed to keep up with increasing electricity demands. Modern energy generation technologies, such as advanced nuclear reactors and concentrated solar, coupled to high-efficiency sCO2 conversion systems provide a solution for efficient, clean energy systems. Leading R&D communities worldwide agree that the successful development of sCO2 Brayton power cycle technology will eventually bring about large-scale changes to existing multi-billion-dollar global markets and enable power applications not currently possible or economically justifiable. However, all new technologies face challenges on the path to commercialization, and the electricity sector is distinctively risk averse. The Sandia sCO2 Brayton team needs to better understand what the electricity sector needs in terms of new technology risk mitigation, generation efficiency, reliability improvements over current technology, and the cost requirements that would make new technology adoption worthwhile. Relying on the R&D community consensus that a sCO2 power cycle will increase the revenue of the electrical industry, without addressing the electrical industry’s concerns, significantly decreases the potential for adoption at commercial scale. With a clear understanding of market perspectives on technology adoption, including military, private sector, and utility customers, the Sandia Brayton Team can resolve industry concerns for smoother development and a faster transition to commercialization. An extensive customer discovery process, similar to that executed through the NSF’s I-Corps program, is necessary to understand the pain points of the market and articulate the value proposition of Brayton systems in terms that engage decision makers and facilitate commercialization of the technology.

More Details

Concurrent Shape and Topology Optimization

Robbins, Joshua R.; Alberdi, Ryan A.; Clark, Brett W.

The typical topology optimization workflow uses a design domain that does not change during the optimization process. Consequently, features of the design domain, such as the location of loads and constraints, must be determined in advance and are not optimizable. A method is proposed herein that allows the design domain to be optimized along with the topology. This approach uses topology and shape derivatives to guide nested optimizers to the optimal topology and design domain. The details of the method are discussed, and examples are provided that demonstrate the utility of this approach.

More Details

Integration of energy storage with diesel generation in remote communities

MRS Energy and Sustainability

Trevizan, Rodrigo D.; Headley, Alexander J.; Geer, Robert; Atcitty, Stanley A.; Gyuk, Imre

Highlights: Battery energy storage may improve the energy efficiency and reliability of hybrid energy systems composed of diesel and solar photovoltaic power generators serving isolated communities. In projects aiming to update power plants serving electrically isolated communities with redundant diesel generation, battery energy storage can improve the overall economic performance of the power supply system by reducing fuel usage, decreasing capital costs by replacing redundant diesel generation units, and increasing generator system life by shortening yearly runtime. Fast-acting battery energy storage systems with grid-forming inverters may have the potential to drastically improve the reliability indices of isolated communities currently supplied by diesel generation. Abstract: This paper will highlight unique challenges and opportunities with regard to energy storage utilization in remote, self-sustaining communities. The energy management of such areas has unique concerns. Diesel generation is often the go-to power source in these scenarios, but these systems are not devoid of issues. Without dedicated maintenance crews as in large, interconnected network areas, minor interruptions can be frequent and invasive, not only for those who lose power but also for those in the community who must then correct any faults. Although the immediate financial benefits are perhaps not readily apparent, energy storage could be used to address concerns related to reliability, automation, fuel supply, generator degradation, solar utilization, and, yes, fuel costs, to name a few. These ideas are shown through a case study of the Levelock Village of Alaska. Currently, the community is faced with high diesel prices and a difficult supply chain, which makes temporary loss of power very common and reductions in fuel consumption very impactful. This study will investigate the benefits that an energy storage system could bring to the overall system life, fuel costs, and reliability of the power supply. The variable efficiency of the generators, impact of the startup/shutdown process, and low-load operation concerns are considered. The technological benefits of the combined system will be explored for various scenarios of future diesel prices and technology maintenance/replacement costs, as well as for the avoidance of power interruptions that are currently so common in the community. Discussion: In several cases, energy storage can provide a means to promote energy equity by improving remote communities’ power supply reliability to levels closer to what the average urban consumer experiences, at a reduced cost compared to transmission buildout. Furthermore, energy equity represents a hard-to-quantify benefit achieved by the integration of energy storage into the isolated power systems of under-served communities, which suggests that the financial aspects of such projects should be questioned as the main performance criterion. To improve battery energy storage system valuation for diesel-based power systems, integration analysis must be holistic and go beyond fuel savings to capture every value stream possible.

More Details

Manipulation of Hole Spin Transport in Germanium

Lu, Tzu-Ming L.; Hutchins-Delgado, Troy A.; Lidsky, David A.

Downscaling of the silicon metal-oxide-semiconductor field-effect transistor technology is expected to reach a fundamental limit soon. A paradigm shift in computing is occurring. Spin field-effect transistors are considered a candidate architecture for next-generation microelectronics. Being able to leverage the existing infrastructure for silicon, a spin field-effect transistor technology based on group IV heterostructures will have unparalleled technical and economic advantages. For the same material-platform reason, germanium hole quantum dots are also considered a competitive architecture for semiconductor-based quantum technology. In this project, we investigated several approaches to creating hole devices in germanium-based materials as well as injecting hole spins into such structures. We also explored the roles of hole injection in wet chemical etching of germanium. Our main results include the demonstration of germanium metal-oxide-semiconductor field-effect transistors operated at cryogenic temperatures, ohmic current-voltage characteristics in germanium/silicon-germanium heterostructures with ferromagnetic contacts at deep cryogenic temperatures and high magnetic fields, evaluation of the effects of surface preparation on carrier mobility in germanium/silicon-germanium heterostructures, and hole spin polarization through integrated permanent magnets. These results serve as essential components for fabricating next-generation germanium-based devices for microelectronics and quantum systems.

More Details

Nonlinear Interface Reduction for Time-Domain Analysis of Hurty/Craig-Bampton Superelements with Frictional Contact

Journal of Sound and Vibration

Hughes, Patrick J.; Kuether, Robert J.

Virtual prototyping in engineering design relies on modern numerical models of contacting structures with accurate resolution of interface mechanics, which strongly affects the system-level stiffness and energy dissipation due to frictional losses. High-fidelity modeling within the localized interfaces is required to resolve local quantities of interest that may drive design decisions. The high-resolution finite element meshes necessary to resolve inter-component stresses tend to be computationally expensive, particularly when the analyst is interested in response time histories. The Hurty/Craig-Bampton (HCB) transformation is a widely used method in structural dynamics for reducing the interior portion of a finite element model while retaining all nonlinear contact degrees of freedom (DOF) in physical coordinates. These models may still require many DOF to adequately resolve the kinematics of the interface, leading to inadequate reduction and computational savings. This study proposes a novel interface reduction method to overcome these challenges by means of system-level characteristic constraint (SCC) modes and proper orthogonal interface modal derivatives (POIMDs) for transient dynamic analyses. Both SCC modes and POIMDs are computed using the reduced HCB mass and stiffness matrices, which can be computed directly with many commercial finite element analysis codes. Comparison of time history responses to an impulse-type load in a mechanical beam assembly indicates that the interface-reduced model correlates well with the HCB truth model. Localized features like slip and contact area are well-represented in the time domain when the beam assembly is loaded with a broadband excitation. The proposed method also yields reduced-order models with greater critical timestep lengths for explicit integration schemes.
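For readers new to the HCB transformation, the reduction basis pairs static constraint modes (the interior response to unit boundary displacements) with a truncated set of fixed-interface normal modes, while boundary DOF stay physical. The sketch below carries this out on an assumed 4-DOF spring-mass chain, purely as an illustration of the transformation, not of the proposed interface reduction method.

```python
# Hurty/Craig-Bampton reduction sketch on a toy 4-DOF chain: fixed-interface
# modes plus static constraint modes; boundary DOF remain physical. The
# system and retained mode count are illustrative assumptions.
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])
M = np.eye(4)
ii = np.ix_([0, 1, 2], [0, 1, 2])   # interior DOF
ib = np.ix_([0, 1, 2], [3])         # interior-boundary coupling

# Static constraint modes: interior response to a unit boundary displacement
Psi = -np.linalg.solve(K[ii], K[ib])
# Fixed-interface normal modes of the interior partition (keep n_k of them)
vals, Phi = eigh(K[ii], M[ii])
n_k = 2
T = np.block([[Phi[:, :n_k],        Psi],
              [np.zeros((1, n_k)),  np.eye(1)]])   # HCB transformation

K_hcb, M_hcb = T.T @ K @ T, T.T @ M @ T
print("reduced stiffness:\n", np.round(K_hcb, 4))
```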

More Details

Evaluating Scalograms for Seismic Event Denoising

Lewis, Phillip J.; Gonzales, Antonio G.; Hammond, Patrick H.

Denoising contaminated seismic signals for later processing is a fundamental problem in seismic signal analysis. The most straightforward denoising approach, spectral filtering, is not effective when noise and seismic signal occupy the same frequency range. Neural network approaches have shown success denoising local signals when trained on short-time Fourier transform spectrograms (Zhu et al. 2018; Tibi et al. 2021). Scalograms, a wavelet-based transform, achieved ~15% better reconstruction than spectrograms on a seismic waveform test set, as measured by dynamic time warping, suggesting their use as an alternative for denoising. We train a deep neural network on a scalogram dataset derived from waveforms recorded by the University of Utah Seismograph Stations network. We find that initial results are no better than the spectrogram approach, with additional overhead imposed by the significantly larger size of scalograms. A robust exploration of neural network hyperparameters and network architecture was not performed; this could be done in follow-on work.
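For reference, a scalogram is the magnitude of a continuous wavelet transform, giving a scale-by-time image analogous to a spectrogram. The sketch below computes one with PyWavelets and a Morlet wavelet on a synthetic arrival; the wavelet choice, sample rate, and toy signal are assumptions, not the report's processing chain.

```python
# Scalogram sketch: |continuous wavelet transform| of a toy waveform, using
# PyWavelets with a Morlet wavelet. All signal parameters are assumptions.
import numpy as np
import pywt

fs = 100.0                               # sample rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
signal = np.exp(-((t - 5) ** 2) / 0.5) * np.sin(2 * np.pi * 5 * t)  # "event"
signal += 0.3 * np.random.default_rng(5).normal(size=t.size)        # noise

scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)
scalogram = np.abs(coeffs)               # scale x time image for the network
print(scalogram.shape, f"frequencies: {freqs.min():.2f}-{freqs.max():.2f} Hz")
```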

More Details

Direct Subsurface Measurements through Precise Micro Drilling

Su, Jiann-Cherng S.; Bettin, Giorgia B.; Buerger, Stephen B.; Rittikaidachar, Michal; Hobart, Clinton G.; Slightam, Jonathon S.; McBrayer, Kepra M.; Gonzalez, Levi M.; Pope, Joseph S.; Foris, Adam J.; Bruss, Kathryn; Kim, Raymond; Mazumdar, Anirban

Wellbore integrity is a significant problem in the U.S. and worldwide, which has serious adverse environmental and energy security consequences. Wells are constructed with a cement barrier designed to last about 50 years. Indirect measurements and models are commonly used to identify wellbore damage and leakage, often producing subjective and even erroneous results. The research presented herein focuses on new technologies to improve monitoring and detection of wellbore failures (leaks) by developing a multi-step machine learning approach to localize two types of thermal defects within a wellbore model, a prototype mechatronic system for automatically drilling small diameter holes of arbitrary depth to monitor the integrity of oil and gas wells in situ, and benchtop testing and analyses to support the development of an autonomous real-time diagnostic tool to enable sensor emplacement for monitoring wellbore integrity. Each technology was supported by experimental results. This research has provided tools to aid in the detection of wellbore leaks and significantly enhanced our understanding of the interaction between small-hole drilling and wellbore materials.

More Details

Multi-fidelity thermal modeling of laser powder bed additive manufacturing

Moser, Daniel M.

Laser powder bed fusion (LPBF) additive manufacturing (AM) has attracted interest as an agile method of building production metal parts to reduce design-build-test cycle times for systems. However, predicting part performance is difficult due to inherent process variabilities, which makes qualification challenging. Computational process models have attempted to address some of these challenges, including mesoscale, full-physics models and reduced-fidelity conduction models. The goal of this work is credible multi-fidelity modeling of the LPBF process by investigating methods for estimating the error between models of two different fidelities. Two methods of error estimation are investigated: adjoint-based error estimation and Bayesian calibration. Adjoint-based error estimation is found to effectively bound the error between the two models, but with very conservative bounds, making predictions highly uncertain. Bayesian parameter calibration applied to conduction model heat source parameters is found to effectively bound the observed error between the models for melt pool morphology quantities of interest. However, the calibrations do not effectively bound the error in heat distribution.

More Details
Results 8801–9000 of 96,771