Publications

High-resolution magnetic microscopy applications using nitrogen-vacancy centers in diamond

Kehayias, Pauli

Magnetic microscopy with high spatial resolution helps to solve a variety of technical problems in condensed-matter physics, electrical engineering, biomagnetism, and geomagnetism. In this work we used quantum diamond magnetic microscope (QDMM) setups, which use a dense uniform layer of magnetically sensitive nitrogen-vacancy (NV) centers in diamond to image an external magnetic field with a fluorescence microscope. We used this technique to image few-micron ferromagnetic needles serving as a physically unclonable function (PUF) and to passively interrogate electric current paths in a commercial 555 timer integrated circuit (IC). As part of the QDMM development, we also found a way to calculate ion implantation recipes for creating diamond samples with dense uniform NV layers at the surface. This work opens the possibility for follow-up experiments with 2D magnetic materials, ion implantation, and electronics characterization and troubleshooting.
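For context, the field-to-frequency relation underlying NV magnetometry can be sketched as follows (standard NV constants, not values specific to this work): each NV center exhibits two spin resonances whose frequencies shift with the magnetic field projection B_z along the NV axis,

$$ f_\pm \approx D \pm \frac{\gamma_e}{2\pi} B_z, \qquad D \approx 2.87\ \text{GHz}, \qquad \frac{\gamma_e}{2\pi} \approx 28\ \text{GHz/T}, $$

so mapping the resonance frequencies of the NV layer across the field of view with the fluorescence microscope yields a quantitative magnetic field image.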

More Details

Large-scale Nonlinear Approaches for Inference of Reporting Dynamics and Unobserved SARS-CoV-2 Infections

Hart, William E.; Bynum, Michael L.; Laird, Carl; Siirola, John D.; Staid, Andrea

This work focuses on estimation of unknown states and parameters in a discrete-time, stochastic, SEIR model using reported case counts and mortality data. An SEIR model classifies individuals with respect to their status in the progression of the disease: S is the number of individuals who remain susceptible to the disease, E is the number of individuals who have been exposed to the disease but are not yet infectious, I is the number of individuals who are currently infectious, and R is the number of recovered individuals. For convenience, we include in our notation the number of infections or transmissions, T, which represents the number of individuals transitioning from compartment S to compartment E over a particular interval. Similarly, we use C to represent the number of reported cases.
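A minimal sketch of a discrete-time update consistent with this notation (the specific stochastic formulation and case-reporting model used in the work may differ):

$$
\begin{aligned}
S_{t+1} &= S_t - T_t,\\
E_{t+1} &= E_t + T_t - E^{\mathrm{out}}_t,\\
I_{t+1} &= I_t + E^{\mathrm{out}}_t - I^{\mathrm{out}}_t,\\
R_{t+1} &= R_t + I^{\mathrm{out}}_t,
\end{aligned}
$$

where T_t is the number of new transmissions (S to E) during interval t, E_t^out and I_t^out are the numbers leaving the exposed and infectious compartments, and the reported cases C_t are modeled as a possibly delayed and incomplete observation of T_t.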

More Details

Viral Fate and Transport for COVID-19 - NVBL

Negrete, Oscar N.; Domino, Stefan P.; Ho, Clifford K.

The NVBL Viral Fate and Transport Team includes researchers from eleven DOE national laboratories and utilizes unique experimental facilities combined with physics-based and data-driven modeling and simulation to study the transmission, transport, and fate of SARS-CoV-2. The team is focused on understanding and ultimately predicting SARS-CoV-2 viability in varied environments with the goal of rapidly informing strategies that guide the nation's resumption of normal activities. The primary goals of this project include prioritizing administrative and engineering controls that reduce the risk of SARS-CoV-2 transmission within an enclosed environment; identifying the chemical and physical properties that influence binding of SARS-CoV-2 to common surfaces; and understanding the contribution of environmental reservoirs and conditions to transmission and resurgence of SARS-CoV-2.

More Details

Gamma Irradiation Facility

Foulk, James W.

Gamma irradiation is a process that uses the Cobalt-60 radionuclide, produced artificially in nuclear reactors, to irradiate a variety of items with gamma radiation. Key characteristics of gamma irradiation are its high penetration capability and its ability to modify the physical, chemical, and biological properties of the irradiated materials.

More Details

Ultrafast Electron Microscopy for Spatial-Temporal Mapping of Charge Carriers

Ellis, Scott R.; Chandler, David; Michael, Joseph R.; Nakakura, Craig Y.

This LDRD supported efforts to significantly advance the scanning ultrafast electron microscope (SUEM) for spatial-temporal mapping of charge carrier dynamics in semiconductor materials and microelectronic devices. Sandia's SUEM capability in Livermore, CA, was built and demonstrated with previous LDRD funding; however, the stability and usability of the tool limited the throughput for analyzing samples. A new laser alignment strategy improved the stability of the SUEM, and the design and characterization of a new micro-channel plate (MCP)-based detector improved the signal-to-noise ratio of the SUEM signal detection. These enhancements to the SUEM system improved throughput by over two orders of magnitude (before, a single time series of SUEM measurements would take several days to several weeks to acquire; now, the same measurements can be completed in ~90 minutes in an automated fashion). The SUEM system can now be routinely used as an analytical instrument and will be a central part of several multi-year projects starting in FY22.

More Details

FAIR DEAL Grand Challenge Overview

Allemang, Christopher R.; Anderson, Evan M.; Baczewski, Andrew D.; Bussmann, Ezra; Butera, Robert; Campbell, Deanna M.; Campbell, Quinn; Carr, Stephen M.; Frederick, Esther; Gamache, Phillip; Gao, Xujiao; Grine, Albert; Gunter, Mathew; Halsey, Connor; Ivie, Jeffrey A.; Katzenmeyer, Aaron M.; Leenheer, Andrew J.; Lepkowski, William; Lu, Tzu M.; Mamaluy, Denis; Mendez Granado, Juan P.; Pena, Luis F.; Schmucker, Scott W.; Scrymgeour, David; Tracy, Lisa A.; Wang, George T.; Ward, Dan; Young, Steve M.

While it is likely impractical to shrink a transistor to the size of an atom, there is no arguing that it would be fantastic to have atomic-scale control over every aspect of a transistor – a kind of crystal ball for understanding and evaluating new ideas. This project showed that it was possible to take a niche technique used to place dopants in silicon with atomic precision and apply it broadly to study opportunities and limitations in microelectronics. In addition, it laid the foundation for attaining atomic-scale control in semiconductor manufacturing more broadly.

More Details

Safety and Security Defense-in-Depth for Nuclear Power Plants

Clark, Andrew J.; Rowland, Mike

This report describes the risk-informed technical elements that will contribute to a defense-in-depth assessment for cybersecurity. Risk-informed cybersecurity must leverage the technical elements of a risk-informed approach appropriately in order to evaluate cybersecurity risk insights. HAZCADS and HAZOP+ are suitable methodologies to model the connection between digital harm and process hazards. Risk assessment modeling needs to be expanded beyond HAZCADS and HAZOP+ to consider the sequences of events that lead to plant consequences. Leveraging current practices in PRA can lead to categorization of digital assets and prioritization of those assets commensurate with the risk. Ultimately, the culmination of cyber hazard methodologies, event sequence modeling, and digital asset categorization will facilitate a defense-in-depth assessment of cybersecurity.

More Details

Physiological Characterization of Language Comprehension

Matzen, Laura E.; Stites, Mallory C.; Ting, Christina; Howell, Breannan C.; Wisniewski, Kyra L.

In this project, our goal was to develop methods that would allow us to make accurate predictions about individual differences in human cognition. Understanding such differences is important for maximizing human and human-system performance. There is a large body of research on individual differences in the academic literature. Unfortunately, it is often difficult to connect this literature to applied problems, where we must predict how specific people will perform or process information. In an effort to bridge this gap, we set out to answer the question: can we train a model to make predictions about which people understand which languages? We chose language processing as our domain of interest because of the well-characterized differences in neural processing that occur when people are presented with linguistic stimuli that they do or do not understand. Although our original plan to conduct several electroencephalography (EEG) studies was disrupted by the COVID-19 pandemic, we were able to collect data from one EEG study and a series of behavioral experiments in which data were collected online. The results of this project indicate that machine learning tools can make reasonably accurate predictions about an individual's proficiency in different languages, using EEG data or behavioral data alone.

More Details

Spatially and Temporally Resolved Velocimetry for Hypersonic Flows

Zhang, Yibin; Richardson, Daniel; Marshall, G.J.; Beresh, Steven J.; Casper, Katya M.

The development of new hypersonic flight vehicles is limited by the physical understanding that may be obtained from ground test facilities. This has motivated the present development of a temporally and spatially resolved velocimetry measurement for Sandia National Laboratories (SNL) Hypersonic Wind Tunnel (HWT) using Femtosecond Laser Electronic Excitation Tagging (FLEET). First, a multi-line FLEET technique has been created for the first time and tested in a supersonic jet, allowing simultaneous measurements of velocities along multiple profiles in a flow. Second, two different approaches have been demonstrated for generating dotted FLEET lines. One employs a slit mask pattern focused into points to yield a dotted line, allowing for two- or three-component velocity measurements free of contamination between components. The other dotted-line approach is based upon an optical wedge array and yields a grid of points rather than a dotted line. Two successful FLEET measurement campaigns have been conducted in SNL's HWT. The first effort established optimal diagnostic configurations in the hypersonic environment based on earlier benchtop reproductions, including validation of the use of a 267 nm beam to boost the measurement signal-to-noise ratio (SNR) with minimal risk of perturbing the flow and greater simplicity than a comparable resonant technique at 202 nm. The same FLEET system was subsequently reconstituted to demonstrate the ability to make velocimetry measurements of hypersonic turbulence in a realistic flow field. Mean velocity profiles and turbulence intensity profiles of the shear layer in the wake of a hypersonic cone model were measured at several different downstream stations, proving the viability of FLEET as a hypersonic diagnostic.

More Details

Exploratory Efforts to Constrain Geologic Material Properties from Remote Sensing Data: A Joint Study

Swanson, Erika M.; Sussman, Aviva J.

Identification and characterization of underground events from surface or remote data requires a thorough understanding of the rock material properties. However, material properties usually come from borehole data, which is expensive and not always available. A potential alternative is to use topographic characteristics to approximate the strength, but this has never been done before quantitatively. Here we present the results from the first steps towards this goal. We have found that there are strong correlations between compressive and tensile strengths and slopes, but these correlations vary depending on data analysis details. Rugosity may be better correlated to strength than slope values. More comprehensive analyses are needed to fully understand the best method of predicting strength from topography for this area. We also found that misalignment of multiple GIS datasets can have a large influence on the ability to make interpretations. Lastly, these results will require further study in a variety of climatic conditions before being applicable to other sites.

More Details

Multiscale assessment of caprock integrity for geologic carbon storage in the Pennsylvanian Farnsworth Unit, Texas, USA

Energies

Trujillo, Natasha; Rose-Coss, Dylan; Heath, Jason E.; Dewers, Thomas; Ampomah, William; Mozley, Peter S.; Cather, Martha

Leakage pathways through caprock lithologies for underground storage of CO2 and/or enhanced oil recovery (EOR) include intrusion into nano-pore mudstones, flow within fractures and faults, and larger-scale sedimentary heterogeneity (e.g., stacked channel deposits). To assess multiscale sealing integrity of the caprock system that overlies the Morrow B sandstone reservoir, Farnsworth Unit (FWU), Texas, USA, we combine pore-to-core observations, laboratory testing, well logging results, and noble gas analysis. A cluster analysis of gamma ray, compressional slowness, and other logs was combined with caliper responses and triaxial rock mechanics testing to define eleven lithologic classes across the upper Morrow shale and Thirteen Finger limestone caprock units, with estimations of dynamic elastic moduli and fracture breakdown pressures (minimum horizontal stress gradients) for each class. Mercury porosimetry determinations of CO2 column heights in sealing formations yield values exceeding reservoir height. Noble gas profiles provide a “geologic time-integrated” assessment of fluid flow across the reservoir-caprock system, with Morrow B reservoir measurements consistent with decades-long EOR water-flooding, and upper Morrow shale and lower Thirteen Finger limestone values being consistent with long-term geohydrologic isolation. Together, these data suggest an excellent sealing capacity for the FWU and provide limits for injection pressure increases accompanying carbon storage activities.

More Details

Seismic Shake Table Test Plan

Kalinina, Elena A.; Ammerman, Douglas; Lujan, Lucas A.

This report is a preliminary test plan of the seismic shake table test. The final report will be developed when all decisions regarding the test hardware, instrumentation, and shake table inputs are made. A new revision of this report will be issued in spring of 2022. The preliminary test plan documents the free-field ground motions that will be used as inputs to the shake table, the test hardware, and instrumentation. It also describes the facility at which the test will take place in late summer of 2022.

More Details

Relationship Extraction: Automatic Information Extraction and Organization for Supporting Analysts in Threat Assessment

Ward, Katrina J.; Bisila, Jonathan; Sahu, Jamini A.

To do their work, analysts sift through hundreds, thousands, or even millions of documents to make connections between entities of interest. This process is time consuming, tedious, and prone to error from missed connections or connections made that should not have been. Many tools exist in natural language processing (NLP) to extract information from documents. However, when it comes to relationship extraction, there has been varied success. This project began with the goal of solving the relationship extraction problem and developed into a deeper understanding of the problem and the challenges associated with solving it at a general scale. In this report, we explain our research and approach to relationship extraction, identify auxiliary problems in NLP that pose additional challenges to solving relationship extraction generally, explain our analysis of the current state of relationship extraction, and postulate future work to address these problems.

More Details

Processing Aleatory and Epistemic Uncertainties in Experimental Data From Sparse Replicate Tests of Stochastic Systems for Real-Space Model Validation

Journal of Verification, Validation and Uncertainty Quantification

Romero, Vicente J.; Black, Amalia R.

This paper presents a practical methodology for propagating and processing uncertainties associated with random measurement and estimation errors (that vary from test-to-test) and systematic measurement and estimation errors (uncertain but similar from test-to-test) in inputs and outputs of replicate tests to characterize response variability of stochastically varying test units. Also treated are test condition control variability from test-to-test and sampling uncertainty due to limited numbers of replicate tests. These aleatory variabilities and epistemic uncertainties result in uncertainty on computed statistics of output response quantities. The methodology was developed in the context of processing experimental data for “real-space” (RS) model validation comparisons against model-predicted statistics and uncertainty thereof. The methodology is flexible and sufficient for many types of experimental and data uncertainty, offering the most extensive data uncertainty quantification (UQ) treatment of any model validation method the authors are aware of. It handles both interval and probabilistic uncertainty descriptions and can be performed with relatively little computational cost through use of simple and effective dimension- and order-adaptive polynomial response surfaces in a Monte Carlo (MC) uncertainty propagation approach. A key feature of the progressively upgraded response surfaces is that they enable estimation of propagation error contributed by the surrogate model. Sensitivity analysis of the relative contributions of the various uncertainty sources to the total uncertainty of statistical estimates is also presented. The methodologies are demonstrated on real experimental validation data involving all the mentioned sources and types of error and uncertainty in five replicate tests of pressure vessels heated and pressurized to failure. Simple spreadsheet procedures are used for all processing operations.
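A minimal sketch of the core propagation step described above: Monte Carlo sampling of aleatory and epistemic input uncertainties pushed through an inexpensive polynomial response surface (the quadratic surrogate, input names, and uncertainty magnitudes are hypothetical, and the paper's adaptive surrogate construction and statistics processing are considerably more involved):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical polynomial response surface fit from a few replicate tests:
# predicted failure pressure as a quadratic function of two inputs.
def surrogate(temperature, wall_thickness):
    return 120.0 + 0.08 * temperature + 450.0 * wall_thickness \
           - 1.5e-4 * temperature**2

n = 100_000
# Aleatory unit-to-unit variability (changes from test to test)
wall = rng.normal(loc=0.25, scale=0.01, size=n)
# Random measurement error on the test temperature
temp = rng.normal(loc=600.0, scale=5.0, size=n)
# Systematic (epistemic) measurement bias, interval-valued and common to all tests
bias = rng.uniform(low=-10.0, high=10.0, size=n)

pressure = surrogate(temp + bias, wall)
print(f"mean = {pressure.mean():.1f}, std = {pressure.std():.1f}")
print("95% interval:", np.percentile(pressure, [2.5, 97.5]))
```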

More Details

Sierra/SolidMechanics 5.2 Capabilities in Development

Bergel, Guy L.; Beckwith, Frank; Belcourt, Kenneth; De Frias, Gabriel J.; Manktelow, Kevin; Merewether, Mark T.; Miller, Scott T.; Mosby, Matthew D.; Plews, Julia A.; Shelton, Timothy R.; Thomas, Jesse E.; Treweek, Benjamin; Tupek, Michael R.; Veilleux, Michael G.; Wagman, Ellen B.

This user’s guide documents capabilities in Sierra/SolidMechanics which remain “in-development” and thus are not tested and hardened to the standards of capabilities listed in Sierra/SM 5.2 User’s Guide. Capabilities documented herein are available in Sierra/SM for experimental use only until their official release. These capabilities include, but are not limited to, novel discretization approaches such as the conforming reproducing kernel (CRK) method, numerical fracture and failure modeling aids such as the extended finite element method (XFEM) and J-integral, explicit time step control techniques, dynamic mesh rebalancing, as well as a variety of new material models and finite element formulations.

More Details

Design of a 1 MWth Supercritical Carbon Dioxide Primary Heat Exchanger Test System

Journal of Energy Resources Technology, Transactions of the ASME

Carlson, Matthew; Alvarez, Francisco

A new generation of concentrating solar power (CSP) technologies is under development to provide dispatchable renewable power generation and reduce the levelized cost of electricity (LCOE) to 6 cents/kWh by leveraging heat transfer fluids (HTFs) capable of operation at higher temperatures and coupling with higher efficiency power conversion cycles. The U.S. Department of Energy (DOE) has funded three pathways for Generation 3 CSP (Gen3CSP) technology development to leverage solid, liquid, and gaseous HTFs to transfer heat to a supercritical carbon dioxide (sCO2) Brayton cycle. This paper presents the design and off-design capabilities of a 1 MWth sCO2 test system that can provide sCO2 coolant to the primary heat exchangers (PHX) coupling the high-temperature HTFs to the sCO2 working fluid of the power cycle. This system will demonstrate design, performance, lifetime, and operability at a scale relevant to commercial CSP. A dense-phase high-pressure canned motor pump is used to supply up to 5.3 kg/s of sCO2 flow to the primary heat exchanger at pressures up to 250 bar and temperatures up to 715 °C with ambient air as the ultimate heat sink. Key component requirements for this system are presented in this paper.
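As a rough consistency check on these figures (assuming a representative sCO2 specific heat of roughly 1.25 kJ/(kg·K) near these pressures and temperatures; the actual design-point properties may differ), the CO2 temperature rise across a 1 MWth primary heat exchanger at the full 5.3 kg/s flow is approximately

$$ \Delta T \approx \frac{\dot{Q}}{\dot{m}\,c_p} = \frac{1000\ \text{kW}}{5.3\ \text{kg/s} \times 1.25\ \text{kJ/(kg·K)}} \approx 150\ \text{K}. $$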

More Details

NMR spectroscopy of coin cell batteries with metal casings

Science Advances

Walder, Brennan J.; Conradi, Mark S.; Borchardt, John; Merrill, Laura C.; Sorte, Eric; Deichmann, Eric J.; Anderson, Travis M.; Alam, Todd M.; Harrison, Katharine L.

Battery cells with metal casings are commonly considered incompatible with nuclear magnetic resonance (NMR) spectroscopy because the oscillating radio-frequency magnetic fields ("rf fields") responsible for excitation and detection of NMR active nuclei do not penetrate metals. Here, we show that rf fields can still efficiently penetrate nonmetallic layers of coin cells with metal casings provided "B1 damming" configurations are avoided. With this understanding, we demonstrate noninvasive high-field in situ 7Li and 19F NMR of coin cells with metal casings using a traditional external NMR coil. This includes the first NMR measurements of an unmodified commercial off-the-shelf rechargeable battery in operando, from which we detect, resolve, and separate 7Li NMR signals from elemental Li, anodic β-LiAl, and cathodic LixMnO2 compounds. Real-time changes of β-LiAl lithium diffusion rates and variable β-LiAl 7Li NMR Knight shifts are observed and tied to electrochemically driven changes of the β-LiAl defect structure.

More Details

An Overview of Gemma FY2021 Verification Activities

Freno, Brian A.; Matula, Neil; Owen, Justin; Krueger, Aaron M.; Johnson, William A.

Though the method-of-moments implementation of the electric-field integral equation plays an important role in computational electromagnetics, it provides many code-verification challenges due to the different sources of numerical error and their possible interactions. Matters are further complicated by singular integrals, which arise from the presence of a Green's function. In this report, we document our research to address these issues, as well as its implementation and testing in Gemma.
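For reference, the equation at issue can be written in its standard mixed-potential form for a perfectly conducting surface S (generic notation, not specific to Gemma):

$$ \mathbf{E}^{\mathrm{inc}}_{\mathrm{tan}}(\mathbf{r}) = \left[\, j\omega\mu \int_S G(\mathbf{r},\mathbf{r}')\,\mathbf{J}(\mathbf{r}')\,dS' - \frac{1}{j\omega\varepsilon}\,\nabla \int_S G(\mathbf{r},\mathbf{r}')\,\nabla'\!\cdot\mathbf{J}(\mathbf{r}')\,dS' \right]_{\mathrm{tan}}, \qquad G(\mathbf{r},\mathbf{r}') = \frac{e^{-jk|\mathbf{r}-\mathbf{r}'|}}{4\pi|\mathbf{r}-\mathbf{r}'|}, $$

and the 1/|r - r'| singularity of the Green's function G is what complicates both the method-of-moments integration and the code-verification error analysis.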

More Details

Influence of functional groups on low-temperature combustion chemistry of biofuels

Progress in Energy and Combustion Science

Rotavera, Brandon; Taatjes, Craig A.

Ongoing progress in synthetic biology, metabolic engineering, and catalysis continues to produce a diverse array of advanced biofuels with complex molecular structure and functional groups. In order to integrate biofuels into existing combustion systems, and to optimize the design of next-generation combustion systems, understanding connections between molecular structure and ignition at low-temperature conditions (< 1000 K) remains a priority that is addressed in part using chemical kinetics modeling. The development of predictive models relies on detailed information, derived from experimental and theoretical studies, on molecular structure and chemical reactivity, both of which influence the balance of chain reactions that occur during combustion – propagation, termination, and branching. In broad context, three main categories of reactions affect ignition behavior: (i) initiation reactions that generate a distribution of organic radicals, R˙; (ii) competing unimolecular decomposition of R˙ and bimolecular reaction of R˙ with O2; (iii) decomposition mechanisms of peroxy radical adducts (ROO˙), including isomerization via ROO˙ ⇌ Q˙OOH. All three categories are influenced by functional groups in different ways, which causes a shift in the balance of chain reactions that unfold over complex temperature- and pressure-dependent mechanisms. The objective of the present review is three-fold: (1) to provide a historical account of research on low-temperature oxidation of biofuels, including initiation reactions, peroxy radical reactions, Q˙OOH-mediated reaction mechanisms, and chain-branching chemistry; (2) to summarize the influence of functional groups on chemical kinetics relevant to chain-branching reactions, which are responsible for the accelerated production of radicals that leads to ignition; (3) to identify areas of research that are needed – experimentally and computationally – to address fundamental questions that remain. Results from experimental, quantum chemical, and chemical kinetics modeling studies are reviewed for several classes of biofuels – alcohols, esters, ketones, acyclic ethers and cyclic ethers – and are compared against analogous results in alkane oxidation. The review is organized into separate sections for each biofuel class, which include studies on thermochemistry and bond dissociation energies, rate coefficients for initiation reactions via H-abstraction and related branching fractions, reaction mechanisms and product formation from reactive intermediates, ignition delay times, and chemical kinetics modeling. Each section is then summarized in order to identify areas for which additional functional group-specific work is required. The review concludes with an outline for research directions for improving the fundamental understanding of biofuel ignition chemistry and related chemical kinetics modeling.
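For orientation, the canonical low-temperature chain-branching sequence that reaction categories (ii) and (iii) feed can be sketched in generic notation (the functional-group effects reviewed here alter the rates and branching fractions of these steps rather than the overall skeleton):

$$ \dot{R} + \mathrm{O_2} \rightleftharpoons \mathrm{RO\dot{O}} \rightleftharpoons \dot{Q}\mathrm{OOH} \xrightarrow{+\,\mathrm{O_2}} \dot{\mathrm{O}}\mathrm{OQOOH} \rightarrow \mathrm{KHP} + \dot{\mathrm{O}}\mathrm{H}, \qquad \mathrm{KHP} \rightarrow \text{carbonyl radical} + \dot{\mathrm{O}}\mathrm{H}, $$

where KHP denotes a ketohydroperoxide; the net production of two hydroxyl radicals plus a carbonyl radical from a single initial radical constitutes the chain branching that accelerates radical production and drives low-temperature ignition.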

More Details

Visible emission spectra of thermographic phosphors under x-ray excitation

Measurement Science and Technology

Westphal, Eric R.; Brown, Alex D.; Quintana, Enrico C.; Kastengren, Alan L.; Son, Steven F.; Meyer, Terrence R.; Hoffmeister, K.N.G.

Thermographic phosphors have been employed for temperature sensing in challenging environments, such as on surfaces or within solid samples exposed to dynamic heating, because of the high temporal and spatial resolution that can be achieved using this approach. Typically, UV light sources are employed to induce temperature-sensitive spectral responses from the phosphors. However, it would be beneficial to explore x-rays as an alternate excitation source to facilitate simultaneous x-ray imaging of material deformation and temperature of heated samples and to reduce UV absorption within solid samples being investigated. The phosphors BaMgAl10O17:Eu (BAM), Y2SiO5:Ce, YAG:Dy, La2O2S:Eu, ZnGa2O4:Mn, Mg3F2GeO4:Mn, Gd2O2S:Tb, and ZnO were excited in this study using incident synchrotron x-ray radiation. These materials were chosen to include conventional thermographic phosphors as well as x-ray scintillators (with crossover between these two categories). X-ray-induced thermographic behavior was explored through the measurement of visible spectral response with varying temperature. The incident x-rays were observed to excite the same electronic energy level transitions in these phosphors as UV excitation. Similar shifts in the spectral response of BAM, Y2SiO5:Ce, YAG:Dy, La2O2S:Eu, ZnGa2O4:Mn, Mg3F2GeO4:Mn, and Gd2O2S:Tb were observed when compared to their response to UV excitation found in literature. Some phosphors were observed to thermally quench in the temperature ranges tested here, while the response from others did not rise above background noise levels. This may be attributed to the increased probability of non-radiative energy release from these phosphors due to the high energy of the incident x-rays. These results indicate that x-rays can serve as a viable excitation source for phosphor thermometry.

More Details

Preliminary Radioisotope Screening for Off-site Consequence Assessment of Advanced Non-LWR Systems

Andrews, Nathan C.; Foulk, James W.; Taconi, Anna M.; Leute, Jennifer E.

Currently, a set of 71 radionuclides is accounted for in off-site consequence analysis for LWRs. Radionuclides of dose consequence are expected to change for non-LWRs, with radionuclides of interest being type-specific. This document identifies an expanded set of radionuclides that may need to be accounted for in multiple non-LWR systems: high temperature gas reactors (HTGRs); fluoride-salt-cooled high-temperature reactors (FHRs); thermal-spectrum fluoride-based molten salt reactors (MSRs); fast-spectrum chloride-based MSRs; and liquid metal fast reactors with metallic fuel (LMRs). Specific considerations are provided for each reactor type in Chapter 2 through Chapter 5, and a summary of all recommendations is provided in Chapter 6. All identified radionuclides are already incorporated within the MACCS software, yet the development of tritium-specific and carbon-specific chemistry models is recommended.

More Details

Efficient and Safe Hydrogen Refueling of Fuel Cell Vehicles from an Emergency Chemical Hydride Storage Source

Bran Anleu, Gabriela A.; Kimble, Michael; Carr, Daniel

Zero-emissions hydrogen fuel cell electric vehicles (FCEVs) have become more popular in recent years. However, the limited availability of hydrogen fueling stations is considered a critical barrier to sustainable adoption of hydrogen FCEVs. To enable the widespread deployment and commercialization of hydrogen FCEVs, the availability of hydrogen refueling stations needs to improve. One consequence of the lack of hydrogen refueling infrastructure is that consumers can suffer from “range anxiety”, meaning they become anxious about running out of fuel during a long-distance trip [4]. A practical solution is to provide a compact emergency hydrogen refueler that can be used if the consumer runs out of hydrogen before reaching the nearest hydrogen refueling station. A safe, compact, and user-friendly hydrogen refueler would give consumers the flexibility they need to feel comfortable using their hydrogen FCEV when planning a long-distance trip. Offering this product would alleviate range anxiety, and it would make hydrogen FCEVs a more attractive alternative to gasoline vehicles. The emergency hydrogen refueler consists of a lithium hydride bed that reacts with liquid water to produce hydrogen gas and lithium hydroxide.

More Details

NgramPPM: Compression Analytics without Compression

Bauer, Travis L.

Arithmetic Coding (AC) using Prediction by Partial Matching (PPM) is a compression algorithm that can be used as a machine learning algorithm. This paper describes a new algorithm, NGram PPM. NGram PPM has all the predictive power of AC/PPM, but at a fraction of the computational cost. Unlike compression-based analytics, it is also amenable to a vector space interpretation, which creates the ability for integration with other traditional machine learning algorithms. AC/PPM is reviewed, including its application to machine learning. Then NGram PPM is described and test results are presented, comparing them to AC/PPM.
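A minimal sketch of the order-n context modeling that underlies this family of methods: count n-gram contexts during training and score new text by the probability the model assigns to each next symbol (illustrative only; the NGram PPM algorithm described in the paper differs in its details and in its vector space interpretation):

```python
import math
from collections import defaultdict, Counter

class NgramModel:
    """Order-n context model: P(next symbol | previous n-1 symbols)."""
    def __init__(self, order=3):
        self.order = order
        self.counts = defaultdict(Counter)

    def train(self, text):
        text = " " * (self.order - 1) + text   # pad so every symbol has a context
        for i in range(len(text) - self.order + 1):
            context = text[i:i + self.order - 1]
            symbol = text[i + self.order - 1]
            self.counts[context][symbol] += 1

    def log_likelihood(self, text):
        """Higher score = text better matches the training data, analogous to
        a compression-based similarity without actually compressing."""
        text = " " * (self.order - 1) + text
        score = 0.0
        for i in range(len(text) - self.order + 1):
            context = text[i:i + self.order - 1]
            symbol = text[i + self.order - 1]
            ctx = self.counts.get(context, Counter())
            # add-one smoothing over a nominal 256-symbol alphabet
            p = (ctx[symbol] + 1) / (sum(ctx.values()) + 256)
            score += math.log2(p)
        return score
```

Trained per class, such models classify a new document by whichever class model assigns it the highest score, analogous to choosing the model that would compress it best.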

More Details

ASCEND: Asymptotically compatible strong form foundations for nonlocal discretization

Trask, Nathaniel A.; D'Elia, Marta; Littlewood, David J.; Silling, Stewart; Trageser, Jeremy; Tupek, Michael R.

Nonlocal models naturally handle a range of physics of interest to SNL, but discretization of their underlying integral operators poses mathematical challenges to realize the accuracy and robustness commonplace in discretization of local counterparts. This project focuses on the concept of asymptotic compatibility, namely preservation of the limit of the discrete nonlocal model to a corresponding well-understood local solution. We address challenges that have traditionally troubled nonlocal mechanics models primarily related to consistency guarantees and boundary conditions. For simple problems such as diffusion and linear elasticity we have developed complete error analysis theory providing consistency guarantees. We then take these foundational tools to develop new state-of-the-art capabilities for: lithiation-induced failure in batteries, ductile failure of problems driven by contact, blast-on-structure induced failure, brittle/ductile failure of thin structures. We also summarize ongoing efforts using these frameworks in data-driven modeling contexts. This report provides a high-level summary of all publications which followed from these efforts.
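To make the concept concrete, a generic nonlocal diffusion operator and the local limit against which asymptotic compatibility is defined can be written as (standard notation, not the specific operators or discretizations developed in this project):

$$ \mathcal{L}_\delta u(\mathbf{x}) = \int_{B_\delta(\mathbf{x})} \gamma_\delta(|\mathbf{y}-\mathbf{x}|)\,\big(u(\mathbf{y}) - u(\mathbf{x})\big)\,d\mathbf{y} \;\xrightarrow[\ \delta \to 0\ ]{}\; \nabla \cdot \big(\kappa\,\nabla u(\mathbf{x})\big), $$

where δ is the nonlocal horizon and γ_δ a kernel; an asymptotically compatible discretization is one whose discrete solution u_{h,δ} converges to the local solution as the mesh size h and horizon δ are refined together, regardless of the path taken in (h, δ).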

More Details

Rock Valley Accelerated Weight Drop Seismic Data Processing and Picking of P-wave and S-wave Arrival Times

Harding, Jennifer L.; Bodmer, Miles; Preston, Leiph

Rock Valley, in the southern end of the Nevada National Security Site, hosts a fault system that was responsible for a shallow (< 3 km below surface) magnitude 3.7 earthquake in May 1993. To better understand this system, the seismic properties of the shallow subsurface need to be better constrained. In April and May of 2021, accelerated weight drop (AWD) active-source seismic data were recorded in order to measure P- and S-wave travel times for the area. This report describes the processing and phase picking of the recorded seismic waveforms. In total, we picked 7,982 P-wave arrivals at offsets up to ~2500 m, and 4,369 S-wave arrivals at offsets up to ~2200 m. These travel-time picks can be inverted for shallow P-wave and S-wave velocity structure in future studies.

More Details

Optical Imaging on Z LDRD: Design and Development of Self-Emission and Debris Imagers

Yager-Elorriaga, David A.; Montoya, Michael M.; Bliss, David E.; Ball, Christopher R.; Atencio, Phillip; Carpenter, Brian C.; Fuerschbach, Kyle H.; Fulford, Karin W.; Lamppa, Derek C.; Lowinske, Michael C.; Lucero, Larry; Patel, Sonal G.; Romero, Anthony; Foulk, James W.; Breznik-Young, Bonnie

We present an overview of the design and development of optical self-emission and debris imaging diagnostics for the Z Machine at Sandia National Laboratories. These diagnostics were designed and implemented to address several gaps in our understanding of visibly emitting phenomena on Z and the post-shot debris environment. Optical emission arises from plasmas that form on the transmission line that delivers energy to Z loads and on the Z targets themselves; however, the dynamics of these plasmas are difficult to assess without imaging data. Addressing this, we developed a new optical imager called SEGOI (Self-Emission Gated Optical Imager) that leverages the eight gated optical imagers and two streak cameras of the Z Line VISAR system. SEGOI is a low-cost, side-on imager with a 1 cm field of view and 30-50 µm spatial resolution, sensitive to green light (540-600 nm). This report outlines the design considerations and development of this diagnostic and presents an overview of the first diagnostic data acquired from four experimental campaigns. SEGOI was fielded on power flow experiments to image plasmas forming on and between transmission lines, on an inertial confinement fusion experiment called the Dynamic Screw Pinch to image low density plasmas forming on return current posts, on an experiment designed to measure the magneto-Rayleigh-Taylor instability to image the instability bubble trajectory and self-emission structures, and finally on a Magnetized Liner Inertial Fusion (MagLIF) experiment to image the emission from the target. The second diagnostic developed, called DINGOZ (Debris ImagiNG on Z), was designed to improve our understanding of the post-shot debris environment. DINGOZ is an airtight enclosure that houses electronics and batteries to operate a high-speed (10-400 kfps) camera in the Z Machine center section. We report on the design considerations of this new diagnostic and present the first high-speed imaging data of the post-shot debris environment on Z.

More Details

Multi-Resolution Characterization of the Coupling Effects of Molten Salts, High Temperature and Irradiation on Intergranular Fracture

Dingreville, Remi; Bielejec, Edward S.; Chen, Elton Y.; Deo, C.; Kim, E.; Spearot, D.E.; Startt, Jacob K.; Stewart, James A.; Sugar, Joshua D.; Vizoso, D.; Weck, Philippe F.; Young, Joshua M.

This project focused on providing a fundamental physico-chemical understanding of the coupling mechanisms of corrosion- and radiation-induced degradation at material-salt interfaces in Ni-based alloys operating in emulated Molten Salt Reactor (MSR) environments through the use of a unique suite of aging experiments, in-situ nanoscale characterization experiments on these materials, and multi-physics computational models. The technical basis and capabilities described in this report bring us a step closer to accelerating the deployment of MSRs by closing knowledge gaps related to materials degradation in harsh environments.

More Details

Gamma spectrometry uranium isotopic analysis rodeo: Summary of GADRAS results

Enghauser, Michael W.

This report summarizes GADRAS methods and gamma spectrometry rodeo uranium isotopic analysis results for high energy resolution H3D M400 cadmium zinc telluride (CZT) and ORTEC Micro Detective high-purity germanium (HPGe) spectra of uranium isotopic standards collected at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL) over a two-year measurement campaign. During the campaign, measurements were performed with the detectors unshielded, side shielded, and collimated. In addition, measurements of the uranium isotopic standards were performed unshielded and shielded.

More Details

Molecular Origin of Wettability Alteration of Subsurface Porous Media upon Gas Pressure Variations

ACS Applied Materials and Interfaces

Ho, Tuan A.; Wang, Yifeng

Upon extraction/injection of a large quantity of gas from/into a subsurface system in shale gas production or carbon sequestration, the gas pressure varies remarkably, which may significantly change the wettability of porous media involved. Mechanistic understanding of such changes is critical for designing and optimizing a related subsurface engineering process. Using molecular dynamics simulations, we have calculated the contact angle of a water droplet on various solid surfaces (kerogen, pyrophyllite, calcite, gibbsite, and montmorillonite) as a function of CO2 or CH4 gas pressure up to 200 atm at a temperature of 300 K. The calculation reveals a complex behavior of surface wettability alteration by gas pressure variation depending on surface chemistry and structure, and molecular interactions of fluid molecules with surfaces. As the CO2 gas pressure increases, a partially hydrophilic kerogen surface becomes highly hydrophobic, while a calcite surface becomes more hydrophilic. Considering kerogen and calcite being the major components of a shale formation, we postulate that the wettability alteration of a solid surface induced by a gas pressure change may play an important role in fluid flows in shale gas production and geological carbon sequestration.

More Details

Predictive Data-driven Platform for Subsurface Energy Production

Yoon, Hongkyu; Verzi, Stephen J.; Cauthen, Katherine R.; Musuvathy, Srideep S.; Melander, Darryl; Norland, Kyle; Morales, Adriana M.; Lee, Jonghyun; Sun, Alexander

Subsurface energy activities such as unconventional resource recovery, enhanced geothermal energy systems, and geologic carbon storage require fast and reliable methods to account for complex, multiphysical processes in heterogeneous fractured and porous media. Although reservoir simulation is considered the industry standard for simulating these subsurface systems with injection and/or extraction operations, it requires spatio-temporal “Big Data” as input to the simulation model, which is typically a major challenge during the model development and computational phases. In this work, we developed and applied various deep neural network-based approaches to (1) process multiscale image segmentation, (2) generate ensemble members of drainage networks, flow channels, and porous media using deep convolutional generative adversarial networks, (3) construct multiple hybrid neural networks such as convolutional LSTM and convolutional neural network-LSTM to develop fast and accurate reduced order models for shale gas extraction, and (4) apply physics-informed neural networks and deep Q-learning for flow and energy production. We hypothesized that physics-based machine learning/deep learning can overcome the shortcomings of traditional machine learning methods, where data-driven models have faltered beyond the data and physical conditions used for training and validation. We improved and developed novel approaches to demonstrate that physics-based ML allows us to incorporate physical constraints (e.g., scientific domain knowledge) into the ML framework. Outcomes of this project will be readily applicable to many energy and national security problems that are particularly defined by multiscale features and network systems.
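A minimal sketch of the hybrid CNN-LSTM reduced-order-model idea in item (3): a convolutional encoder compresses each 2D input field and an LSTM evolves the encoded state to predict a production time series (layer sizes and the single-rate output are illustrative assumptions, not the project's architectures):

```python
import torch
import torch.nn as nn

class CNNLSTMSurrogate(nn.Module):
    """Toy hybrid CNN-LSTM surrogate: encode each 2D field, evolve in time."""
    def __init__(self, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),                           # -> 16 * 4 * 4 = 256 features
        )
        self.lstm = nn.LSTM(input_size=256, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)            # e.g., gas rate per time step

    def forward(self, fields):
        # fields: (batch, time, 1, H, W) sequence of 2D property/state maps
        b, t = fields.shape[:2]
        z = self.encoder(fields.reshape(b * t, *fields.shape[2:]))
        z = z.reshape(b, t, -1)
        out, _ = self.lstm(z)
        return self.head(out).squeeze(-1)           # (batch, time) predicted rates

# Example: 8 realizations, 20 time steps, 32x32 input maps
model = CNNLSTMSurrogate()
x = torch.randn(8, 20, 1, 32, 32)
print(model(x).shape)  # torch.Size([8, 20])
```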

More Details

Femtosecond Reflectance Spectroscopy for Energetic Material Diagnostics

Cole-Filipiak, Neil C.; Schrader, Paul; Luk, Ting S.; Ramasesha, Krupa

Understanding the fundamental mechanisms underpinning shock initiation is critical to predicting energetic material (EM) safety and performance. Currently, the timescales and pathways by which shock-excited lattice modes transfer energy into specific chemical bonds remain an open question. Towards understanding these mechanisms, our group has previously measured the vibrational energy transfer (VET) pathways in several energetic thin films using broadband, femtosecond transient absorption spectroscopy. However, new technologies are needed to move beyond these thin film surrogates and measure broadband VET pathways in realistic EM morphologies. Herein, we describe a new broadband, femtosecond, attenuated total reflectance spectroscopy apparatus. Performance of the system is benchmarked against published data and the first VET results from a pressed EM pellet are presented. This technology enables fundamental studies of VET dynamics across sample configurations and environments (pressure, temperature, etc.) and supports the potential use of VET studies in the non-destructive surveillance of EM components.

More Details

A Comprehensive Open-Source R Software For Statistical Metrology Calculations: From Uncertainty Evaluation To Risk Analysis

NCSLI Measure

Delker, Collin J.

Whether calibrating equipment or inspecting products on the factory floor, metrology requires many complicated statistical calculations to achieve a full understanding and evaluation of measurement uncertainty and quality. In order to assist its workforce in performing these calculations in a consistent and rigorous way, the Primary Standards Lab at Sandia National Laboratories (SNL) has developed a free and open-source software package for computing various metrology calculations from uncertainty propagation to risk analysis. In addition to propagating uncertainty through a measurement model using the well-known Guide to Expression of Uncertainty in Measurement or Monte Carlo approaches, evaluating the individual Type A and Type B uncertainty components that go into the measurement model often requires other statistical methods such as analysis of variance or determining uncertainty in a fitted curve. Once the uncertainty in a measurement has been calculated, it is usually evaluated from a risk perspective to ensure the measurement is suitable for making a particular conformance decision. Finally, SNL’s software can perform all these calculations in a single application via an easy-to-use graphical interface, where the different functions are integrated so the results of one calculation can be used as inputs to another calculation.
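A minimal sketch of the two propagation approaches named above, GUM first-order propagation and Monte Carlo, for a generic measurement model R = V/I (the model and uncertainty values are illustrative; this shows the general calculations, not the SNL software's interface):

```python
import numpy as np

# Measurement model: e.g., resistance from measured voltage and current
def model(V, I):
    return V / I

V, uV = 10.0, 0.02    # best estimate and standard uncertainty (volts)
I, uI = 2.0, 0.005    # best estimate and standard uncertainty (amperes)

# GUM first-order propagation: numerical sensitivity coefficients, RSS combination
h = 1e-6
cV = (model(V + h, I) - model(V - h, I)) / (2 * h)   # dR/dV
cI = (model(V, I + h) - model(V, I - h)) / (2 * h)   # dR/dI
u_gum = np.sqrt((cV * uV) ** 2 + (cI * uI) ** 2)

# Monte Carlo propagation: sample the inputs and push them through the model
rng = np.random.default_rng(1)
R_samples = model(rng.normal(V, uV, 200_000), rng.normal(I, uI, 200_000))

print(f"R = {model(V, I):.4f} ohm, u_GUM = {u_gum:.4f}, u_MC = {R_samples.std():.4f}")
```

The resulting uncertainty can then feed a conformance-risk evaluation against a specification limit, as described above.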

More Details

Trajectory Optimization via Unsupervised Probabilistic Learning On Manifolds

Safta, Cosmin; Najm, Habib N.; Grant, Michael J.; Sparapany, Michael J.

This report investigates the use of unsupervised probabilistic learning techniques for the analysis of hypersonic trajectories. The algorithm first extracts the intrinsic structure in the data via a diffusion map approach. Using the diffusion coordinates on the graph of training samples, the probabilistic framework augments the original data with samples that are statistically consistent with the original set. The augmented samples are then used to construct conditional statistics that are ultimately assembled in a path-planning algorithm. In this framework the controls are determined stage by stage during the flight to adapt to changing mission objectives in real-time. A 3DOF model was employed to generate optimal hypersonic trajectories that comprise the training datasets. The diffusion map algorithm identified that the data reside on manifolds of much lower dimensionality compared to the high-dimensional state space that describes each trajectory. In addition to the path-planning workflow we also propose an algorithm that utilizes the diffusion map coordinates along the manifold to label and possibly remove outlier samples from the training data. This algorithm can be used both to identify edge cases for further analysis and to remove them from the training set to create a more robust set of samples to be used for the path-planning process.
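A minimal sketch of the basic diffusion-map construction referenced above (Gaussian kernel, row-normalized transition matrix, leading nontrivial eigenvectors); the report's implementation, kernel choices, and data-augmentation step may differ:

```python
import numpy as np

def diffusion_map(X, epsilon=1.0, n_coords=2):
    """X: (n_samples, n_features), e.g., one flattened trajectory per row.
    Returns the leading diffusion coordinates for each sample."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    K = np.exp(-d2 / epsilon)                             # Gaussian affinity kernel
    P = K / K.sum(axis=1, keepdims=True)                  # row-stochastic Markov matrix
    evals, evecs = np.linalg.eig(P)
    idx = np.argsort(-evals.real)                         # sort by decreasing eigenvalue
    evals, evecs = evals.real[idx], evecs.real[:, idx]
    # Skip the trivial constant eigenvector (eigenvalue 1)
    return evecs[:, 1:n_coords + 1] * evals[1:n_coords + 1]
```

The low-dimensional coordinates returned here are the quantities on which conditional statistics and outlier labeling can then be built.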

More Details

Spatio-temporal Estimates of Disease Transmission Parameters for COVID-19 with a Fully-Coupled, County-Level Model of the United States

Cummings, Derek; Hart, William E.; Garcia-Carreras, Bernardo; Lanning, Carl; Lessler, Justin; Staid, Andrea

Sandia National Laboratories has developed a capability to estimate parameters of epidemiological models from case reporting data to support responses to the COVID-19 pandemic. A differentiating feature of this work is the ability to simultaneously estimate county-specific disease transmission parameters in a nation-wide model that considers mobility between counties. The approach is focused on estimating parameters in a stochastic SEIR model that considers mobility between model patches (i.e., counties) as well as additional infectious compartments. The inference engine developed by Sandia includes (1) reconstruction and (2) transmission parameter inference. Reconstruction involves estimating current population counts within each of the compartments in a modified SEIR model from reported case data. Reconstruction produces input for the inference formulations, and it provides initial conditions that can be used in other modeling and planning efforts. Inference involves the solution of a large-scale optimization problem to estimate the time profiles for the transmission parameters in each county. These provide quantification of changes in the transmission parameter over time (e.g., due to impact of intervention strategies). This capability has been implemented in a Python-based software package, epi_inference, that makes extensive use of Pyomo [5] and IPOPT [10] to formulate and solve the inference formulations.
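A minimal sketch of the kind of nonlinear estimation problem Pyomo and IPOPT are used for here, written as a least-squares fit of a time-varying transmission parameter for a single county (the function name, toy model form, and smoothing penalty are illustrative assumptions; the epi_inference formulations are considerably more detailed and include mobility coupling between counties):

```python
import pyomo.environ as pyo

def build_beta_estimation(S, I, T_obs, N, dt=1.0, reg=10.0):
    """S[t], I[t]: reconstructed susceptible/infectious counts; T_obs[t]:
    reconstructed new transmissions; N: county population. Hypothetical toy model."""
    times = range(len(T_obs))
    m = pyo.ConcreteModel()
    m.beta = pyo.Var(times, bounds=(0.0, 5.0), initialize=0.3)

    def sse(m):
        fit = sum((m.beta[t] * S[t] * I[t] / N * dt - T_obs[t]) ** 2 for t in times)
        smooth = reg * sum((m.beta[t] - m.beta[t - 1]) ** 2 for t in times if t > 0)
        return fit + smooth
    m.obj = pyo.Objective(rule=sse, sense=pyo.minimize)
    return m

# model = build_beta_estimation(S, I, T_obs, N=1_000_000)
# pyo.SolverFactory('ipopt').solve(model)
# beta_hat = [pyo.value(model.beta[t]) for t in range(len(T_obs))]
```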

More Details

Expanding the Scope of Genomic Security: Targeted Genome Editing within Microbiomes through Designer Bacteriophage Vectors

Mageeney, Catherine M.

The ability to engineer the genome of a bacterial strain, not as an isolate, but while present among other microbes in a microbiome, would open new technological possibilities in the areas of medicine, energy and biomanufacturing. Our approach is to develop sets of phages (bacterial viruses) active on the target strain and themselves engineered to act not as killers but as vectors for gene delivery. This approach is rooted in our bioinformatic tools that map prophages accurately within bacterial genomes. We present new bioinformatic results in cross-contig search, design of phage genome assemblies, satellites that embed within prophages, alignment of large numbers of biological sequences, and improvement of reference databases for prophage discovery. We targeted a Pseudomonas putida strain within a lignin-degrading microbiome, but were unable to obtain active phages, and turned toward a defined microbiome of the mouse gut.

More Details

Conditional Point Sampling: A stochastic media transport algorithm with full geometric sampling memory

Journal of Quantitative Spectroscopy and Radiative Transfer

Vu, Emily H.; Olson, Aaron

Current methods for stochastic media transport are either computationally expensive or, by nature, approximate. Moreover, none of the well-developed, benchmarked approximate methods can compute the variance caused by the stochastic mixing, a quantity especially important to safety calculations. Therefore, we derive and apply a new conditional probability function (CPF) for use in the recently developed stochastic media transport algorithm Conditional Point Sampling (CoPS), which 1) leverages the full intra-particle memory of CoPS to yield errorless computation of stochastic media outputs in 1D, binary, Markovian-mixed media, and 2) leverages the full inter-particle memory of CoPS and the recently developed Embedded Variance Deconvolution method to yield computation of the variance in transport outputs caused by stochastic material mixing. Numerical results demonstrate errorless stochastic media transport as compared to reference benchmark solutions with the new CPF for this class of stochastic mixing as well as the ability to compute the variance caused by the stochastic mixing via CoPS. Using previously derived, non-errorless CPFs, CoPS is further found to be more accurate than the atomic mix approximation, Chord Length Sampling (CLS), and most of the memory-enhanced versions of CLS surveyed. In addition, we study the compounding behavior of CPF error as a function of cohort size (where a cohort is a group of histories that share intra-particle memory) and recommend that small cohorts be used when computing the variance in transport outputs caused by stochastic mixing.
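For background, the standard statistics of binary Markovian mixing that Chord Length Sampling draws from, and on which a conditional probability function conditions given previously sampled points, can be sketched as (standard relations, not the new CPF derived in this paper):

$$ p_i(\ell) = \frac{1}{\lambda_i}\,e^{-\ell/\lambda_i}, \qquad \Pr(\text{material } i \text{ at an unconditioned point}) = \frac{\lambda_i}{\lambda_0 + \lambda_1}, $$

where λ_i is the mean chord length of material i.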

More Details

ERAS: Enabling the Integration of Real-World Intellectual Properties (IPs) in Architectural Simulators

Nema, Shubham; Razdan, Rohin; Rodrigues, Arun; Hemmert, Karl S.; Voskuilen, Gwendolyn R.; Adak, Debratim; Hammond, Simon; Awad, Amro; Hughes, Clayton

Sandia National Laboratories is investigating scalable architectural simulation capabilities with a focus on simulating and evaluating highly scalable supercomputers for high performance computing applications. There is a growing demand for RTL model integration to provide the capability to simulate customized node architectures and heterogeneous systems. This report describes the first steps integrating the ESSENTial Signal Simulation Enabled by Netlist Transforms (ESSENT) tool with the Structural Simulation Toolkit (SST). ESSENT can emit C++ models from models written in FIRRTL to automatically generate components. The integration workflow will automatically generate the SST component and necessary interfaces to ’plug’ the ESSENT model into the SST framework.

More Details

Marine Atmospheric Corrosion of Additively Manufactured Stainless Steels

Corrosion

Duran, Jesse G.; Taylor, Jason M.; Presuel-Moreno, Francisco; Schaller, Rebecca S.; Schindelholz, Eric J.; Melia, Michael A.

Additively manufactured (AM) stainless steels (SSs) exhibit numerous microstructural differences compared to their wrought counterparts, such as Cr-enriched dislocation cell structures. The influence these unique features have on an SS's corrosion resistance is still under investigation, with most current work limited to laboratory experiments. The work herein shows the first documented study of AM 304L and 316L exposed to a severe marine environment on the eastern coast of Florida, with comparisons made to wrought counterparts. Coupons were exposed for 21 months, and significant pitting corrosion initiated after 1 month of exposure for all conditions. At all times, the AM coupons exhibited lower average and maximum pit depths than their wrought counterparts. After 21 months, pits were on average 4 μm deep for the AM 316L specimens and 8 μm deep for the wrought specimens. Pits on the wrought samples tended to be nearly hemispherical and polished, with some pits showing crystallographic attack, while pits on AM coupons exhibited preferential attack at melt pool boundaries and the cellular microstructure.

More Details

HEAF Cable Fragility Testing at the Solar Furnace at the NSTTF

Glover, Austin M.; Lafleur, Angela (Chris); Engerer, Jeff

In order to establish a zone of influence (ZOI) due to a high energy arcing fault (HEAF) environment, the fragility of the targets must be determined. The high heat flux/short duration exposure of a HEAF is considerably different from that of a traditional hydrocarbon fire, which previous research has addressed. The previous failure metrics (e.g., internal jacket temperature of a cable exposed to a fire) were based on low heat flux/long duration exposures. Because of this, different physics and failure modes were considered in evaluating the fragility of cables exposed to a HEAF. Tests on cable targets were performed at high heat flux/short duration exposures to gain insight into the relevant physics and failure modes. These tests yielded data on several relevant failure modes, including electrical failure and sustained ignition. Additionally, the results indicated a relationship between the total energy of exposure and the damage state of the cable target. These data can be used to inform the fragility of the targets.

More Details

Instantaneous Three-Dimensional Temperature Measurements via Ultrafast Laser Spectroscopy with Structured Light

Richardson, Daniel

Detonations and flames are characterized by three-dimensional (3D) temperature fields, yet state-of-the-art temperature measurement techniques yield information at a point or along a line. The goal of the research documented here was to combine ultrafast laser spectroscopy and structured illumination to deliver an unprecedented measurement capability: three-dimensional, instantaneous temperature measurements in a gas-phase volume. To achieve this objective, different parts of the proposed technique were developed and tested independently. Structured illumination was used to image particulate matter (soot) in a turbulent flame at multiple planes using a single laser pulse and a single camera. Emission spectroscopy with structured detection was demonstrated for emission-based measurements of explosives with enhanced dimensionality. Finally, an instrument for a multi-planar laser-based temperature measurement technique was developed. Structured illumination techniques will continue to be developed for multi-dimensional and multi-parameter measurements. These new measurement capabilities will be important for heat transfer and fluid dynamics research areas.

More Details

GDSA Repository Systems Analysis Investigations in FY2021

Laforce, Tara C.; Basurto, Eduardo; Chang, Kyung W.; Jayne, Richard; Leone, Rosemary C.; Nole, Michael A.; Foulk, James W.; Stein, Emily

The Spent Fuel and Waste Science and Technology (SFWST) Campaign of the U.S. Department of Energy Office of Nuclear Energy, Office of Spent Fuel and Waste Disposition (SFWD), has been conducting research and development on generic deep geologic disposal systems (i.e., geologic repositories). This report describes specific activities in Fiscal Year (FY) 2021 associated with the Geologic Disposal Safety Assessment (GDSA) Repository Systems Analysis (RSA) work package within the SFWST Campaign. The overall objective of the GDSA RSA work package is to develop generic deep geologic repository concepts and system performance assessment (PA) models in several host-rock environments, and to simulate and analyze these generic repository concepts and models using the GDSA Framework toolkit and other tools as needed.

More Details

Sensitivity Analysis Comparisons on Geologic Case Studies: An International Collaboration

Swiler, Laura P.; Becker, Dirk-Alexander; Brooks, Dusty M.; Govaerts, Joan; Koskinen, Lasse; Plischke, Elmar; Rohlig, Klaus-Jurgen; Saveleva, Elena; Spiessl, Sabine M.; Stein, Emily; Svitelman, Valentina

Over the past four years, an informal working group has developed to investigate existing sensitivity analysis methods, examine new methods, and identify best practices. The focus is on the use of sensitivity analysis in case studies involving geologic disposal of spent nuclear fuel or nuclear waste. To examine ideas and have applicable test cases for comparison purposes, we have developed multiple case studies. Four of these case studies are presented in this report: the GRS clay case, the SNL shale case, the Dessel case, and the IBRAE groundwater case. We present the different sensitivity analysis methods investigated by various groups, the results obtained by different groups and different implementations, and summarize our findings.
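As one illustration of the kind of method compared across these case studies, a minimal sketch of a common global screening measure, standardized regression coefficients, is shown below (toy model and data; the report itself compares a much broader set of sensitivity analysis methods and implementations):

```python
import numpy as np

def standardized_regression_coefficients(X, y):
    """Fit a linear surrogate to standardized inputs/output; the resulting
    coefficients rank input importance for near-linear model responses."""
    Xs = (X - X.mean(0)) / X.std(0)            # standardize inputs
    ys = (y - y.mean()) / y.std()              # standardize output
    A = np.column_stack([np.ones(len(ys)), Xs])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef[1:]                            # one SRC per input parameter

# Toy example: the response depends strongly on x0, weakly on x2, not at all on x1
rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))
y = 4.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.1, 500)
print(standardized_regression_coefficients(X, y))
```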

More Details

Risk-Adaptive Experimental Design for High-Consequence Systems: LDRD Final Report

Kouri, Drew P.; Jakeman, John D.; Huerta, Jose G.; Walsh, Timothy; Smith, Chandler; Uryasev, Stan

Constructing accurate statistical models of critical system responses typically requires an enormous amount of data from physical experiments or numerical simulations. Unfortunately, data generation is often expensive and time consuming. To streamline the data generation process, optimal experimental design determines the 'best' allocation of experiments with respect to a criterion that measures the ability to estimate some important aspect of an assumed statistical model. While optimal design has a vast literature, few researchers have developed design paradigms targeting tail statistics, such as quantiles. In this project, we tailored and extended traditional design paradigms to target distribution tails. Our approach included (i) the development of new optimality criteria to shape the distribution of prediction variances, (ii) the development of novel risk-adapted surrogate models that provably overestimate certain statistics including the probability of exceeding a threshold, and (iii) the asymptotic analysis of regression approaches that target tail statistics such as superquantile regression. To accompany our theoretical contributions, we released implementations of our methods for surrogate modeling and design of experiments in two complementary open source software packages, the ROL/OED Toolkit and PyApprox.
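For reference, the superquantile (also known as the conditional value-at-risk) targeted by superquantile regression is, in its standard form,

$$ \bar{q}_\alpha(Y) = \frac{1}{1-\alpha}\int_\alpha^1 q_\beta(Y)\,d\beta = \min_{c\in\mathbb{R}}\left\{\, c + \frac{1}{1-\alpha}\,\mathbb{E}\big[(Y-c)_+\big] \right\}, $$

where q_β(Y) is the β-quantile of the response Y; it averages the tail of the distribution beyond the α-quantile and is therefore a natural target for designs aimed at tail statistics.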

More Details

Multimode Metastructures: Novel Hybrid 3D Lattice Topologies

Boyce, Brad L.; Garland, Anthony; White, Benjamin C.; Jared, Bradley H.; Conway, Kaitlynn; Adstedt, Katerina; Dingreville, Remi; Robbins, Joshua; Walsh, Timothy; Alvis, Timothy; Branch, Brittany A.; Kaehr, Bryan J.; Kunka, Cody; Leathe, Nicholas S.

With the rapid proliferation of additive manufacturing and 3D printing technologies, architected cellular solids including truss-like 3D lattice topologies offer the opportunity to program the effective material response through topological design at the mesoscale. The present report summarizes several of the key findings from a 3-year Laboratory Directed Research and Development Program. The program set out to explore novel lattice topologies that can be designed to control, redirect, or dissipate energy from one or multiple insult environments relevant to Sandia missions, including crush, shock/impact, vibration, thermal, etc. In the first four sections, we document four novel lattice topologies stemming from this study: Coulombic lattices, multi-morphology lattices, interpenetrating lattices, and pore-modified gyroid cellular solids, each with unique properties that had not been achieved by existing cellular/lattice metamaterials. The fifth section explores how unintentional lattice imperfections stemming from the manufacturing process, primarily surface roughness in the case of laser powder bed fusion, cause stochastic response, but also how in some cases, such as elastic response, the stochastic behavior is homogenized across the lattice. In the sixth section we explore a novel neural network screening process that allows such stochastic variability to be predicted. In the last three sections, we explore considerations of computational design of lattices. Specifically, section 7 uses a novel generative optimization scheme to design Pareto-optimal lattices for multi-objective environments. In section 8, we use computational design to optimize a metallic lattice structure to absorb impact energy for a 1000 ft/s impact. In section 9, we develop a modified micromorphic continuum model to solve wave propagation problems in lattices efficiently.

More Details

Recommendations for Distributed Energy Resource Patching

Johnson, Jay

While computer systems, software applications, and operational technology (OT)/Industrial Control System (ICS) devices are regularly updated through automated and manual processes, there are several unique challenges associated with distributed energy resource (DER) patching. Millions of DER devices from dozens of vendors have been deployed in home, corporate, and utility network environments that may or may not be internet-connected. These devices make up a growing portion of the electric power critical infrastructure system and are expected to operate for decades. During that operational period, it is anticipated that critical and noncritical firmware patches will be regularly created to improve DER functional capabilities or repair security deficiencies in the equipment. The SunSpec/Sandia DER Cybersecurity Workgroup created a Patching Subgroup to investigate appropriate recommendations for DER patching, holding fortnightly meetings for more than nine months. The group focused on DER equipment, but the observations and recommendations contained in this report also apply to DERMS tools and other OT equipment used in the end-to-end DER communication environment. The group found many standards and guides that discuss firmware lifecycles, patch and asset management, and code-signing implementations, but none that singularly covers the needs of the DER industry. This report collates best practices from these standards organizations and establishes a set of best practices that may be used as a basis for future national or international patching guides or standards.

More Details

Short Term Plasticity for Artificial Neural Networks

Teeter, Corinne M.

Achieving efficient learning for AI systems was identified as a major challenge in the DOE's recently released AI for Science report. The human brain is capable of efficient and low-powered learning. It is likely that implementing brain-like principles will lead to more efficient AI systems. In this LDRD, I aim to contribute to this goal by creating a foundation for implementing and studying a brain phenomenon termed short term plasticity (STP) in spiking artificial neural networks within Sandia. First, data collected by the Allen Institute for Brain Science (AIBS) was analyzed to see if STP could be classified into types using the data collected. Although the data was inadequate at the time, AIBS has updated their database and created models that could be utilized in the future. Second, I began creating a software package to assess the ability of a Boltzmann machine utilizing STP to sample from national security data.

More Details

Leveraging Spin-Orbit Coupling in Ge/SiGe Heterostructures for Quantum Information Transfer

Bretz-Sullivan, Terence M.; Brickson, Mitchell I.; Foster, Natalie D.; Hutchins-Delgado, Troy A.; Lewis, Rupert M.; Lu, Tzu M.; Miller, Andrew J.; Srinivasa, Vanita; Tracy, Lisa A.; Wanke, Michael C.; Luhman, Dwight R.

Hole spin qubits confined to lithographically defined lateral quantum dots in Ge/SiGe heterostructures show great promise. One reason for this is the intrinsic spin-orbit coupling that allows all-electric control of the qubit. That same feature can be exploited as a coupling mechanism to coherently link spin qubits to a photon field in a superconducting resonator, which could, in principle, be used as a quantum bus to distribute quantum information. The work reported here advances the knowledge and technology required for such a demonstration. We discuss the device fabrication and characterization of different quantum dot designs and the demonstration of single hole occupation in multiple devices. Superconducting resonators fabricated by an outside vendor were found to have adequate performance, and a path toward flip-chip integration with quantum devices is discussed. The results of an optical study exploring aspects of using implanted Ga as quantum memory in a Ge system are presented.

More Details

Update on the Simulation of Commercial Drying of Spent Nuclear Fuel

Durbin, S.; Lindgren, Eric; Pulido, Ramon J.; Foulk, James W.; Fasano, Raymond

The purpose of this report is to document improvements in the simulation of commercial vacuum drying procedures at the Nuclear Energy Work Complex at Sandia National Laboratories. Validation of the extent of water removal in a dry spent nuclear fuel storage system based on drying procedures used at nuclear power plants is needed to close existing technical gaps. Operational conditions leading to incomplete drying may have potential impacts on the fuel, cladding, and other components in the system. A general lack of data suitable for model validation of commercial nuclear canister drying processes necessitates additional, well-designed investigations of drying process efficacy and water retention. Scaled tests that incorporate relevant physics and well-controlled boundary conditions are essential to provide insight and guidance to the simulation of prototypic systems undergoing drying processes.

More Details

Less-Than-Lethal Quick Deploy Inflatable Hall/Door Barrier: VISTA Feasibility Study

Rivera, W.G.; Portman, Addison

Physical protection of public buildings has long been a concern of police and security services where a balance of facility security and personnel safety is vital. Due to the nature of public spaces, the use of permanently installed and deploy-on-demand physical barrier systems must be safe for the legitimate occupants and visitors of that space. Such systems must seek to mitigate the personal and organizational consequences of unintentionally seriously injuring or killing an innocent bystander by slamming a heavy, rigid, and quick-deploying barrier into place. Consideration and implementation of less-than-lethal technologies is necessary to reduce risk to visitors and building personnel. One potential barrier solution is a fast-acting, high-strength, composite airbag barrier system for doorways and hallways to quickly deploy a less-than-lethal barrier at entry points as well as isolate intruders who have already gained access. This system is envisioned to be stored within an architecturally attractive selectively frangible shell that could be permanently installed at a facility or installed in remote or temporary locations as dictated by risk. The system would be designed to be activated remotely (hardwired or wireless) from a Central Alarm Station (CAS) or other secure location.

More Details

CIS Project 22359, Final Technical Report. Discretized Posterior Approximation in High Dimensions

Duersch, Jed A.; Catanach, Thomas A.

Our primary aim in this work is to understand how to efficiently obtain reliable uncertainty quantification in automatic learning algorithms with limited training datasets. Standard approaches rely on cross-validation to tune hyperparameters. Unfortunately, when our datasets are too small, holdout datasets become unreliable—albeit unbiased—measures of prediction quality due to the lack of adequate sample size. We should not place confidence in holdout estimators under conditions wherein the sample variance is both large and unknown. More poignantly, our training experiments on limited data (Duersch and Catanach, 2021) show that even if we could improve estimator quality under these conditions, the typical training trajectory may never even encounter generalizable models.
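
To illustrate the point about holdout estimates being unbiased but high-variance on small datasets, the following sketch simulates holdout accuracy estimates for a hypothetical fixed model whose true accuracy is 0.8; all numbers are invented, and this is not the experiment from Duersch and Catanach (2021).

```python
import numpy as np

rng = np.random.default_rng(1)
true_accuracy = 0.8  # hypothetical generalization accuracy of a fixed model

for n_holdout in (10, 50, 250, 1000):
    # Each holdout estimate is the fraction correct on n_holdout random test points.
    estimates = rng.binomial(n_holdout, true_accuracy, size=20_000) / n_holdout
    print(f"holdout size {n_holdout:4d}: mean estimate = {estimates.mean():.3f}, "
          f"std of estimate = {estimates.std():.3f}")
```

The mean of the estimates stays near 0.8 (unbiased), but the spread shrinks only as the square root of the holdout size, which is the reliability problem the abstract describes.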

More Details

A Projected Network Model of Online Disinformation Cascades

Emery, Benjamin; Ting, Christina; Johnson, Nicholas; Tucker, J.D.

Within the past half-decade, it has become overwhelmingly clear that suppressing the spread of deliberately false and misleading information is of the utmost importance for protecting democratic institutions. Disinformation has been found to come from both foreign and domestic actors, but the effects from either can be disastrous. From the simple encouragement of unwarranted distrust to conspiracy theories promoting violence, the results of disinformation have put the functionality of American democracy under direct threat. Present scientific challenges posed by this problem include detecting disinformation, quantifying its potential impact, and preventing its amplification. We present a model on which we can experiment with possible strategies toward the third challenge: the prevention of amplification. This is a social contagion network model, which is decomposed into layers to represent physical (''offline'') interactions as well as virtual interactions on a social media platform. Along with the topological modifications to the standard contagion model, we use state-transition rules designed specifically for disinformation, and distinguish between contagious and non-contagious infected nodes. We use this framework to explore the effect of grassroots social movements on the size of disinformation cascades by simulating these cascades in scenarios where a proportion of the agents remove themselves from the social platform. We also test the efficacy of strategies that could be implemented at the administrative level by the online platform to minimize such spread. These top-down strategies include banning agents who disseminate false information, or providing corrective information to individuals exposed to false information to decrease their probability of believing it. We find an abrupt transition to smaller cascades when a critical number of random agents are removed from the platform, as well as steady decreases in the size of cascades with increasingly more convincing corrective information. Finally, we compare simulated cascades on this framework with real cascades of disinformation recorded on WhatsApp surrounding the 2019 Indian election. We find a set of hyperparameter values that produces a distribution of cascades matching the scaling exponent of the distribution of actual cascades recorded in the dataset. We discuss future directions for improving the performance of the framework and the validation methods, as well as ways to extend the model to capture additional features of social contagion.
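
As a toy illustration of the agent-removal experiment (not the layered projected-network model used in the report), the sketch below simulates cascades on a single random graph and shows how the mean cascade size changes as a random fraction of agents leaves the platform; the graph size, degree, and belief probability are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

def cascade_size(n=2000, avg_degree=8, p_believe=0.3, removed_frac=0.0):
    """Final cascade size on a random graph after removing a fraction of agents."""
    p_edge = avg_degree / (n - 1)
    adj = [[] for _ in range(n)]
    for i in range(n):                                   # random undirected graph
        nbrs = np.nonzero(rng.random(n - i - 1) < p_edge)[0] + i + 1
        for j in nbrs:
            adj[i].append(j)
            adj[j].append(i)
    active = rng.random(n) >= removed_frac               # agents still on the platform
    infected = np.zeros(n, dtype=bool)
    seed = int(rng.integers(n))
    frontier = []
    if active[seed]:
        infected[seed] = True
        frontier = [seed]
    while frontier:                                      # breadth-first spread of belief
        new = []
        for u in frontier:
            for v in adj[u]:
                if active[v] and not infected[v] and rng.random() < p_believe:
                    infected[v] = True
                    new.append(v)
        frontier = new
    return int(infected.sum())

for frac in (0.0, 0.2, 0.4, 0.6):
    sizes = [cascade_size(removed_frac=frac) for _ in range(5)]
    print(f"removed fraction {frac:.1f}: mean cascade size ~ {np.mean(sizes):.0f}")
```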

More Details

Quantum Sensed Electron Spin Resonance Discovery Platform (Final Report)

Lilly, Michael; Saleh Ziabari, Maziar S.; Titze, Michael; Henshaw, Jacob D.; Bielejec, Edward S.; Huber, Dale L.; Mounce, Andrew M.

The properties of materials can change dramatically at the nanoscale, where new and useful properties can emerge. An example is the paramagnetism of iron oxide magnetic nanoparticles. Using magnetically sensitive nitrogen-vacancy centers in diamond, we developed a platform to study electron spin resonance of nanoscale materials. To implement the platform, diamond substrates were prepared with nitrogen-vacancy centers near the surface. Nanoparticles were placed on the surface using a drop-casting technique. Using optical and microwave pulsing techniques, we demonstrated T1 relaxometry and double electron-electron resonance techniques for measuring the local electron spin resonance. The diamond NV platform developed in this project provides a combination of good magnetic field sensitivity and high spatial resolution and will be used for future investigations of nanomaterials and quantum materials.

More Details

Fuel Fabrication and Single Stage Aqueous Process Modeling

Foulk, James W.; Taconi, Anna M.; Honnold, Philip; Cipiti, Benjamin B.

The Material Protection, Accounting, and Control Technologies program utilizes modeling and simulation to assess Material Control and Accountability (MC&A) concerns for a variety of nuclear facilities. Single analyst tools allow for rapid design and evaluation of advanced approaches for new and existing nuclear facilities. A low enriched uranium (LEU) fuel conversion and fabrication facility simulator is developed to assist with MC&A for existing facilities. Measurements are added to the model (consistent with current best practices). Material balance calculations and statistical tests are also added to the model. In addition, scoping work is performed for developing a single stage aqueous reprocessing model. Preliminary results are presented and discussed, and next steps are outlined.
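
For readers unfamiliar with material balance accounting, a minimal sketch of the kind of calculation involved is shown below: material unaccounted for (MUF) over a balance period, compared against a multiple of its propagated measurement uncertainty. All quantities and the alarm threshold are invented for illustration and are not drawn from the facility simulator described here.

```python
# Minimal material-balance sketch with invented measurement values (kg U).
# MUF = beginning inventory + inputs - outputs - ending inventory.
beginning_inventory = 1250.0
inputs              = 4800.0
outputs             = 4785.0
ending_inventory    = 1258.0

muf = beginning_inventory + inputs - outputs - ending_inventory

# A simple statistical test: flag the balance if |MUF| exceeds three times its
# propagated measurement uncertainty (sigma_muf is assumed here for illustration).
sigma_muf = 4.0
print(f"MUF = {muf:+.1f} kg, alarm threshold = {3 * sigma_muf:.1f} kg")
print("alarm" if abs(muf) > 3 * sigma_muf else "no alarm")
```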

More Details

Evidence for a high temperature whisker growth mechanism active in tungsten during in situ nanopillar compression

Nanomaterials

Jawaharram, Gowtham S.; Barr, Christopher M.; Hattar, Khalid M.; Dillon, Shen J.

A series of nanopillar compression tests were performed on tungsten as a function of temperature using in situ transmission electron microscopy with localized laser heating. Surface oxidation was observed to form on the pillars and grow in thickness with increasing temperature. Deformation between 850 °C and 1120 °C is facilitated by long-range diffusional transport from the tungsten pillar onto adjacent regions of the Y2O3-stabilized ZrO2 indenter. The constraint imposed by the surface oxidation is hypothesized to underlie this mechanism for localized plasticity, which is generally termed the whisker growth mechanism. The results are discussed in the context of the tungsten fuzz growth mechanism in He plasma-facing environments. The two processes exhibit similar morphological features, and the conditions under which fuzz evolves appear to satisfy the conditions necessary to induce whisker growth.

More Details

3D orthorhombic earth model effects on seismic source characterization

Jensen, Richard P.; Preston, Leiph

Most earth materials are anisotropic with regard to seismic wave-speeds, especially materials such as shales, or where oriented fractures are present. However, the base assumption for many numerical simulations is to treat earth materials as isotropic media. This is done for simplicity, because of the apparent weakness of anisotropy in the far field, and because of the lack of well-characterized anisotropic material properties for input into numerical simulations. One approach for addressing the higher complexity of actual geologic regions is to model the material as an orthorhombic medium. We have developed an explicit time-domain, finite-difference (FD) algorithm for simulating three-dimensional (3D) elastic wave propagation in a heterogeneous orthorhombic medium. The objective of this research is to investigate the errors and biases that result from modeling a non-isotropic medium as an isotropic medium. This is done by computing “observed data” from synthetic simulations that assume an orthorhombic, anisotropic earth model. Green’s functions for an assumed isotropic earth model are computed and then used in an inversion designed to estimate moment tensors from the “observed” data. One specific area of interest is how shear waves, which are introduced in an anisotropic model even for an isotropic explosion, affect the characterization of seismic sources when isotropic earth assumptions are made. This work is done in support of the modeling component of the Source Physics Experiment (SPE), a series of underground chemical explosions at the Nevada National Security Site (NNSS).
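
In the linear formulation underlying such inversions, the observed waveforms stack into a data vector d related to the six independent moment-tensor components m through a Green's-function matrix G, and the estimate follows from least squares. The sketch below uses a synthetic G and data vector purely for illustration; it is not the SPE processing code, and the true workflow involves waveform Green's functions per station and component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward problem d = G @ m, where the 6 columns of G are
# Green's-function responses for the moment-tensor components
# (Mxx, Myy, Mzz, Mxy, Mxz, Myz) and d stacks samples from all stations.
n_samples = 500
G = rng.standard_normal((n_samples, 6))
m_true = np.array([1.0, 1.0, 1.0, 0.1, 0.0, -0.1])            # nearly isotropic source
d_obs = G @ m_true + 0.05 * rng.standard_normal(n_samples)     # "observed" data + noise

# Least-squares moment-tensor estimate.
m_est, *_ = np.linalg.lstsq(G, d_obs, rcond=None)
print("estimated moment-tensor components:", np.round(m_est, 3))
```

If the Green's functions assume the wrong (isotropic) earth model while the data contain anisotropic effects, the bias shows up directly in the recovered components, which is the effect the study quantifies.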

More Details

Final report of activities for the LDRD-express project #223796 titled: “Fluid models of charged species transport: numerical methods with mathematically guaranteed properties”, PI: Ignacio Tomas, Co-PI: John Shadid

Tomas, Ignacio; Shadid, John N.; Crockatt, Michael M.; Pawlowski, Roger; Maier, Matthias; Guermond, Jean-Luc

This report summarizes the findings and outcomes of the LDRD-express project with title “Fluid models of charged species transport: numerical methods with mathematically guaranteed properties”. The primary motivation of this project was the computational and mathematical exploration of ideas aimed at improving the state of the art in numerical methods for one-fluid Euler-Poisson models and at gaining some understanding of the Euler-Maxwell model. The Euler-Poisson and Euler-Maxwell models by themselves are not the most technically relevant PDE plasma models. However, both of them are elementary building blocks of PDE models used in actual technical applications and include most (if not all) of their mathematical difficulties. Outside the classical ideal MHD models, rigorous mathematical and numerical understanding of one-fluid models is still a largely undeveloped research area, and the treatment and understanding of boundary conditions is minimal (borderline nonexistent) at this point. This report focuses primarily on the bulk behavior of the Euler-Poisson model, touching on boundary conditions only tangentially.

More Details

Verification of Data-Driven Models of Physical Phenomena using Interpretable Approximation

Ray, Jaideep; Barone, Matthew F.; Domino, Stefan P.; Banerjee, Tania; Ranka, Sanjay

Machine-learned models, specifically neural networks, are increasingly used as “closures” or “constitutive models” in engineering simulators to represent fine-scale physical phenomena that are too computationally expensive to resolve explicitly. However, these neural net models of unresolved physical phenomena tend to fail unpredictably and are therefore not used in mission-critical simulations. In this report, we describe new methods to authenticate them, i.e., to determine the (physical) information content of their training datasets, to qualify the scenarios where they may be used, and to verify that the neural net, as trained, adheres to physics theory. We demonstrate these methods with neural net closure of turbulent phenomena used in Reynolds Averaged Navier-Stokes equations. We show the types of turbulent physics extant in our training datasets, and, using a test flow of an impinging jet, identify the exact locations where the neural network would be extrapolating, i.e., where it would be used outside the feature space in which it was trained. Using Generalized Linear Mixed Models, we also generate explanations of the neural net (à la Local Interpretable Model-agnostic Explanations) at prototypes placed in the training data and compare them with approximate analytical models from turbulence theory. Finally, we verify our findings by reproducing them using two different methods.
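
One simple, assumption-laden way to flag extrapolation of this kind is a per-feature range (bounding-box) check against the training data; the report's actual method may be more sophisticated, and the data below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training features (rows = samples, columns = flow features).
X_train = rng.normal(size=(5000, 4))
lo, hi = X_train.min(axis=0), X_train.max(axis=0)

def is_extrapolating(X_query, lo, hi):
    """Flag query points with any feature outside the training range."""
    return np.any((X_query < lo) | (X_query > hi), axis=1)

# e.g. features extracted from an impinging-jet case (synthetic here).
X_query = rng.normal(scale=2.0, size=(1000, 4))
flags = is_extrapolating(X_query, lo, hi)
print(f"{flags.mean():.1%} of query points lie outside the training feature space")
```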

More Details

Pulsed Magnetic Gradiometry in Earth's Field [Poster]

Campbell, Kaleb L.; Wang, Ying-Ju; Schwindt, Peter D.; Jau, Yuan-Yu; Shah, Vishal

We describe a novel pulsed magnetic gradiometer based on the optical interference of sidebands generated using two spatially separated alkali vapor cells. In contrast to traditional magnetic gradiometers, our approach provides a direct readout of the gradient field without the intermediate step of subtracting the outputs of two spatially separated magnetometers. Operation of the gradiometer in multiple field orientations is discussed. The noise floor is measured to be as low as 25 $\mathrm{fT}/(\mathrm{cm}\,\sqrt{\mathrm{Hz}})$ in a room without magnetic shielding.

More Details

Integrated System and Application Continuous Performance Monitoring and Analysis Capability

Aaziz, Omar R.; Allan, Benjamin A.; Brandt, James M.; Cook, Jeanine; Devine, Karen; Elliott, James E.; Gentile, Ann C.; Hammond, Simon; Kelley, Brian M.; Lopatina, Lena; Moore, Stan G.; Olivier, Stephen L.; Foulk, James W.; Poliakoff, David; Pawlowski, Roger; Regier, Phillip; Schmitz, Mark E.; Schwaller, Benjamin; Surjadidjaja, Vanessa; Swan, Matthew S.; Tucker, Nick; Tucker, Thomas; Vaughan, Courtenay T.; Walton, Sara P.

Scientific applications run on high-performance computing (HPC) systems are critical for many national security missions within Sandia and the NNSA complex. However, these applications often face performance degradation and even failures that are challenging to diagnose. To provide unprecedented insight into these issues, the HPC Development, HPC Systems, Computational Science, and Plasma Theory & Simulation departments at Sandia crafted and completed their FY21 ASC Level 2 milestone entitled "Integrated System and Application Continuous Performance Monitoring and Analysis Capability." The milestone created a novel integrated HPC system and application monitoring and analysis capability by extending Sandia's Kokkos application portability framework, Lightweight Distributed Metric Service (LDMS) monitoring tool, and scalable storage, analysis, and visualization pipeline. The extensions to Kokkos and LDMS enable collection and storage of application data during run time, as it is generated, with negligible overhead. This data is combined with HPC system data within the extended analysis pipeline to present relevant visualizations of derived system and application metrics that can be viewed at run time or post run. This new capability was evaluated using several week-long, 290-node runs of Sandia's ElectroMagnetic Plasma In Realistic Environments (EMPIRE) modeling and design tool and resulted in 1 TB of application data and 50 TB of system data. EMPIRE developers remarked this capability was incredibly helpful for quickly assessing application health and performance alongside system state. In short, this milestone work built the foundation for an expansive HPC system and application data collection, storage, analysis, visualization, and feedback framework that will increase the total scientific output of Sandia's HPC users.

More Details

Cyber-Physical Risks for Advanced Reactors

Fasano, Raymond; Lamb, Christopher; Hahn, Andrew S.; Haddad, Alexandria

Cybersecurity for industrial control systems is an important consideration that advanced reactor designers will need to address. How cyber risk is managed is the subject of ongoing research and debate in the nuclear industry. This report seeks to identify potential cyber risks for advanced reactors. Identified risks are divided into absorbed risk and licensee managed risk to clearly show how cyber risks for advanced reactors can potentially be transferred. Absorbed risks are risks that originate external to the licensee but may unknowingly propagate into the plant. Insights include (1) the need for unification of safety, physical security, and cybersecurity risk assessment frameworks to ensure optimal coordination of risk, (2) a quantitative risk assessment methodology in conjunction with qualitative assessments may be useful in efficiently and sufficiently managing cyber risks, and (3) cyber risk management techniques should align with a risk-informed regulatory framework for advanced reactors.

More Details

The Kokkos EcoSystem: Comprehensive Performance Portability for High Performance Computing

Computing in Science and Engineering

Trott, Christian R.; Berger-Vergiat, Luc; Poliakoff, David; Rajamanickam, Sivasankaran; Lebrun-Grandie, Damien; Madsen, Jonathan; Al Awar, Nader; Gligoric, Milos; Shipman, Galen; Womeldorff, Geoff

State-of-the-art engineering and science codes have grown in complexity dramatically over the last two decades. Application teams have adopted more sophisticated development strategies, leveraging third party libraries, deploying comprehensive testing, and using advanced debugging and profiling tools. In today's environment of diverse hardware platforms, these applications also desire performance portability, avoiding the need to duplicate work for various platforms. The Kokkos EcoSystem provides that portable software stack. Based on the Kokkos Core Programming Model, the EcoSystem provides math libraries, interoperability capabilities with Python and Fortran, and Tools for analyzing, debugging, and optimizing applications. In this article, we overview the components, discuss some specific use cases, and highlight how co-designing these components enables a more developer-friendly experience.

More Details

Advanced Reactor Operational Technology Architecture Categorization

Fasano, Raymond; Hahn, Andrew S.; Haddad, Alexandria; Lamb, Christopher

Seven generation III+ and generation IV nuclear reactor types, based on twelve reactor concepts surveyed, are examined using functional decomposition to extract relevant operational technology (OT) architecture information. This information is compared to existing nuclear power plant (NPP) OT architectures to highlight novel and emergent cyber risks associated with next generation NPPs. These insights can help inform operational technology architecture requirements that will be unique to a given reactor type. Next generation NPPs have streamlined OT architectures relative to the current generation II commercial NPP fleet. Overall, without compensatory measures that provide sufficient and efficient cybersecurity controls, next generation NPPs will have increased cyber risk. Verification and validation of cyber-physical testbeds and cyber risk assessment methodologies may be an important next step to reduce cyber risk in the OT architecture design and testing phase. Coordination with safety requirements can result in OT architecture design being an iterative process.

More Details

AXIOM Unfold 0.7.0, Users Manual

Radtke, Gregg A.

The AXIOM-Unfold application is a computational code for performing spectral unfolds along with uncertainty quantification of the photon spectrum. While this code was principally designed for spectral unfolds on the Saturn source, it is also relevant to other radiation sources such as Pithon. This code is a component of the AXIOM project which was undertaken in order to measure the time-resolved spectrum of the Saturn source; to support this, the AXIOM-Unfold code is able to process time-dependent dose measurements in order to obtain a time-resolved spectrum. This manual contains a full description of the algorithms used by the method. The code features are fully documented along with several worked examples.

More Details

Adaptation of the NWM Cloud Environment for an ISF Project

Meacham, Janette; Meacham, Paul; Huber, Cynthia; Grong, Erica

The DOE-NE NWM Cloud was designed to be a generic set of tools and applications for any nuclear waste management program. As policymakers continue to consider approaches that emphasize consolidated interim storage and transportation of spent nuclear fuel, a gap analysis of the tools and applications provided for spent nuclear fuel and high-level radioactive waste disposal in comparison to those needed for siting, licensing, and developing a consolidated interim storage facility and/or for a transportation campaign will help prepare DOE for implementing such potential policy direction. This report evaluates the points of alignment and potential gaps between the applications on the NWM Cloud that supported the SNF disposal project and the applications needed to address QA requirements and other project support needs of an SNF storage project.

More Details

Defining Computational Emissivity Uncertainty Over Large Temperature Scales Due to Surface Evolution

Journal of Verification, Validation and Uncertainty Quantification

Silva, Humberto; Mills, Brantley; Schroeder, Benjamin B.; Keedy, Ryan M.; Smith, Kyle D.

There is a dearth in the literature on how to capture the uncertainty generated by material surface evolution in thermal modeling. This leads to inadequate or highly variable uncertainty representations for material properties, specifically emissivity, when minimal information is available. Inaccurate understandings of prediction uncertainties may lead decision makers to incorrect conclusions, so best engineering practices should be developed for this domain. In order to mitigate the aforementioned issues, this study explores different strategies to better capture the thermal uncertainty response of engineered systems exposed to fire environments via defensible emissivity uncertainty characterizations that can be easily adapted to a variety of use cases. Two unique formulations (one physics-informed and one mathematically based) are presented. The formulations and methodologies presented herein are not exhaustive but rather a starting point that gives the reader a basis for customizing uncertainty definitions for differing fire scenarios and materials. Finally, the impact of using this approach versus other commonly used strategies and the usefulness of adding rigor to material surface evolution uncertainty is demonstrated.

More Details

Improved forward voltage and external quantum efficiency scaling in multi-active region III-nitride LEDs

Applied Physics Express

Jamal-Eddine, Zane; Gunning, Brendan P.; Armstrong, Andrew A.; Rajan, Siddharth

Ultra-low voltage drop tunnel junctions (TJs) were utilized to enable multi-active region blue light emitting diodes (LEDs) with up to three active regions in a single device. The multi-active region blue LEDs were grown monolithically by metal-organic chemical vapor deposition (MOCVD) without growth interruption. This is the first demonstration of a MOCVD grown triple-junction LED. Optimized TJ design enabled near-ideal voltage and EQE scaling close to the number of junctions. This work demonstrates that with proper TJ design, improvements in wall-plug efficiency at high output power operation are possible by cascading multiple III-nitride based LEDs.

More Details

Manipulation of Hole Spin Transport in Germanium

Lu, Tzu M.; Hutchins-Delgado, Troy A.; Lidsky, David A.

Downscaling of the silicon metal-oxide-semiconductor field-effect transistor technology is expected to reach a fundamental limit soon. A paradigm shift in computing is occurring. Spin field-effect transistors are considered a candidate architecture for next-generation microelectronics. Being able to leverage the existing infrastructure for silicon, a spin field-effect transistor technology based on group IV heterostructures will have unparalleled technical and economical advantages. For the same material platform reason, germanium hole quantum dots are also considered a competitive architecture for semiconductor-based quantum technology. In this project, we investigated several approaches to creating hole devices in germanium-based materials as well as injecting hole spins in such structures. We also explored the roles of hole injection in wet chemical etching of germanium. Our main results include the demonstration of germanium metal-oxide-semiconductor field-effect transistors operated at cryogenic temperatures, ohmic current-voltage characteristics in germanium/silicon-germanium heterostructures with ferromagnetic contacts at deep cryogenic temperatures and high magnetic fields, evaluation of the effects of surface preparation on carrier mobility in germanium/silicon-germanium heterostructures, and hole spin polarization through integrated permanent magnets. These results serve as essential components for fabricating next-generation germanium-based devices for microelectronics and quantum systems.

More Details

Incentivizing Adoption of Software Quality Practices

Raybourn, Elaine M.; Milewicz, Reed M.; Mundt, Miranda R.

Although many software teams across the laboratories comply with yearly software quality engineering (SQE) assessments, the practice of introducing quality into each phase of the software lifecycle, or the team processes, may vary substantially. Even with the support of a quality engineer, many teams struggle to adapt and right-size software engineering best practices in quality to fit their context, and these activities aren’t framed in a way that motivates teams to take action. In short, software quality is often a “check the box for compliance” activity instead of a cultural practice that both values software quality and knows how to achieve it. In this report, we present the results of our 6600 VISTA Innovation Tournament project, "Incentivizing and Motivating High Confidence and Research Software Teams to Adopt the Practice of Quality." We present our findings and roadmap for future work based on 1) a rapid review of relevant literature, 2) lessons learned from an internal design thinking workshop, and 3) an external Collegeville 2021 workshop. These activities provided an opportunity for team ideation and community engagement/feedback. Based on our findings, we believe a coordinated effort (e.g. strategic communication campaign) aimed at diffusing the innovation of the practice of quality across Sandia National Laboratories could over time effect meaningful organizational change. As such, our roadmap addresses strategies for motivating and incentivizing individuals ranging from early career to seasoned software developers/scientists.

More Details

The Power and Energy Storage Systems Toolbox–PSTess (V1.0)

Elliott, Ryan T.; Trudnowski, Daniel J.; Choi, Hyungjin; Nguyen, Tam

This document describes the Power and Energy Storage Systems Toolbox for MATLAB, abbreviated as PSTess. This computing package is a fork of the Power Systems Toolbox (PST). PST was originally developed at Rensselaer Polytechnic Institute (RPI) and later upgraded by Dr. Graham Rogers at Cherry Tree Scientific Software. While PSTess shares a common lineage with PST Version 3.0, it is a substantially different application. This document supplements the main PST manual by describing the features and models that are unique to PSTess. As the name implies, the main distinguishing characteristic of PSTess is its ability to model inverter-based energy storage systems (ESS). The model that enables this is called ess.m, and it serves the dual role of representing ESS operational constraints and the generator/converter interface. As in the WECC REGC_A model, the generator/converter interface is modeled as a controllable current source with the ability to modulate both real and reactive current. The model ess.m permits four-quadrant modulation, which allows it to represent a wide variety of inverter-based resources beyond energy storage when paired with an appropriate supplemental control model. Examples include utility-scale photovoltaic (PV) power plants, Type 4 wind plants, and static synchronous compensators (STATCOM). This capability is especially useful for modeling hybrid plants that combine energy storage with renewable resources or FACTS devices.
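
PSTess itself is written in MATLAB; as a language-agnostic illustration of the four-quadrant current-source idea (not the actual ess.m implementation), the sketch below clips commanded real and reactive current to an assumed apparent-current rating before forming the injected complex current.

```python
import numpy as np

def converter_current(ip_cmd, iq_cmd, i_max=1.0):
    """Illustrative converter interface: real (ip) and reactive (iq) current commands,
    either sign allowed (four quadrants), scaled back onto the apparent-current limit."""
    i_mag = np.hypot(ip_cmd, iq_cmd)
    if i_mag > i_max:                     # enforce the converter rating
        scale = i_max / i_mag
        ip_cmd, iq_cmd = ip_cmd * scale, iq_cmd * scale
    return ip_cmd + 1j * iq_cmd

print(converter_current(0.9, 0.6))     # commands exceed the rating and are scaled back
print(converter_current(-0.5, -0.3))   # opposite-sign quadrant, within the limit
```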

More Details

DOE Office of Nuclear Energy cybersecurity research, development and demonstration program plan

Dawson, Lon A.

This document describes the Cybersecurity Research Development and Demonstration (RD&D) Program, established by the Department of Energy Office of Nuclear Energy (NE) to provide science-based methods and technologies necessary for cost-effective, cyber-secure digital instrumentation, control, and communication in collaboration with nuclear energy stakeholders. It provides an overview of program goals, objectives, linkages to organizational strategies, management structure, and stakeholder and cross-program interfaces.

More Details

Efficient flexible characterization of quantum processors with nested error models

New Journal of Physics

Nielsen, Erik N.; Rudinger, Kenneth M.; Proctor, Timothy J.; Young, Kevin; Blume-Kohout, Robin

We present a simple and powerful technique for finding a good error model for a quantum processor. The technique iteratively tests a nested sequence of models against data obtained from the processor, and keeps track of the best-fit model and its wildcard error (a metric of the amount of unmodeled error) at each step. Each best-fit model, along with a quantification of its unmodeled error, constitutes a characterization of the processor. We explain how quantum processor models can be compared with experimental data and to each other. We demonstrate the technique by using it to characterize a simulated noisy two-qubit processor.
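
The nested-model loop can be illustrated generically: fit each model in a nested sequence, score it with a metric that balances fit quality against complexity, and keep the best. The sketch below uses nested polynomial models and an AIC-like score on invented data; it is not the paper's wildcard-error metric or pyGSTi code.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 1, 200)
y = 1.0 + 0.5 * x - 2.0 * x**2 + 0.05 * rng.standard_normal(x.size)  # synthetic "data"

best = None
for degree in range(0, 6):                  # nested sequence: each model contains the last
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    score = n * np.log(np.mean(resid**2)) + 2 * k   # fit quality + complexity penalty
    if best is None or score < best[0]:
        best = (score, degree, coeffs)
    print(f"degree {degree}: score = {score:.1f}")

print("selected model degree:", best[1])
```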

More Details

Sphynx: A parallel multi-GPU graph partitioner for distributed-memory systems

Parallel Computing

Acer, Seher; Boman, Erik G.; Glusa, Christian; Rajamanickam, Sivasankaran

Graph partitioning has been an important tool to partition the work among several processors to minimize the communication cost and balance the workload. While accelerator-based supercomputers are emerging to be the standard, the use of graph partitioning becomes even more important as applications are rapidly moving to these architectures. However, there is no distributed-memory-parallel, multi-GPU graph partitioner available for applications. We developed a spectral graph partitioner, Sphynx, using the portable, accelerator-friendly stack of the Trilinos framework. In Sphynx, we allow using different preconditioners and exploit their unique advantages. We use Sphynx to systematically evaluate the various algorithmic choices in spectral partitioning with a focus on the GPU performance. We perform those evaluations on two distinct classes of graphs: regular (such as meshes, matrices from finite element methods) and irregular (such as social networks and web graphs), and show that different settings and preconditioners are needed for these graph classes. The experimental results on the Summit supercomputer show that Sphynx is the fastest alternative on irregular graphs in an application-friendly setting and obtains a partitioning quality close to ParMETIS on regular graphs. When compared to nvGRAPH on a single GPU, Sphynx is faster and obtains better balance and better quality partitions. Sphynx provides a good and robust partitioning method across a wide range of graphs for applications looking for a GPU-based partitioner.
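
The core idea of spectral bisection, shown here in a small dense NumPy sketch (not Sphynx/Trilinos code), is to split vertices by the sign of the Fiedler vector, the eigenvector of the graph Laplacian associated with the second-smallest eigenvalue; Sphynx applies the same idea at scale with iterative, preconditioned eigensolvers on GPUs.

```python
import numpy as np

# Small example graph: two 4-vertex cliques joined by a single edge (3, 4).
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (4, 7), (5, 6), (5, 7), (6, 7),
         (3, 4)]
n = 8
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)    # dense solve; large graphs need iterative solvers
fiedler = eigvecs[:, 1]                 # eigenvector for the 2nd-smallest eigenvalue
part = (fiedler >= 0).astype(int)       # split vertices by sign
print("partition assignment:", part)    # separates the two cliques
```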

More Details

Propagation of a Stress Pulse in a Heterogeneous Elastic Bar

Journal of Peridynamics and Nonlocal Modeling

Silling, Stewart

The propagation of a wave pulse due to low-speed impact on a one-dimensional, heterogeneous bar is studied. Due to the dispersive character of the medium, the pulse attenuates as it propagates. This attenuation is studied over propagation distances that are much longer than the size of the microstructure. A homogenized peridynamic material model can be calibrated to reproduce the attenuation and spreading of the wave. The calibration consists of matching the dispersion curve for the heterogeneous material near the limit of long wavelengths. It is demonstrated that the peridynamic method reproduces the attenuation of wave pulses predicted by an exact microstructural model over large propagation distances.
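
For reference, the dispersion relation being matched is, in the standard one-dimensional linear bond-based peridynamic theory (with mass density ρ, micromodulus C(ξ), and horizon δ; the notation here is generic rather than taken from the paper):

```latex
% Plane waves u(x,t) = e^{i(kx - \omega t)} in the linearized 1D bond-based theory give
\rho\,\omega^{2}(k) \;=\; \int_{-\delta}^{\delta} C(\xi)\,\bigl(1 - \cos k\xi\bigr)\,d\xi ,
\qquad
c^{2} \;=\; \lim_{k\to 0}\frac{\omega^{2}(k)}{k^{2}}
      \;=\; \frac{1}{2\rho}\int_{-\delta}^{\delta} C(\xi)\,\xi^{2}\,d\xi .
```

Calibration near the long-wavelength limit amounts to matching the small-k expansion of this curve, whose leading term fixes the effective wave speed c while the higher-order terms control the dispersive spreading and attenuation of the pulse.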

More Details

Efficient prompt scintillation and fast neutron-gamma ray discrimination using amorphous blends of difluorenylsilane organic glass and in situ polymerized vinyltoluene

IEEE Transactions on Nuclear Science

Myllenbeck, Nicholas R.; Feng, Patrick L.; Benin, Annabelle I.; Tran, Huu; Carlson, Joseph S.; Hunter, McKenzie A.

High-performance radiation detection materials are an integral part of national security, medical imaging, and nuclear physics applications. Those that offer compositional and manufacturing versatility are of particular interest. Here, we report a new family of radiological particle-discriminating scintillators containing bis(9,9-dimethyl-9H-fluoren-2-yl)diphenylsilane (compound 'P2') and in situ polymerized vinyltoluene (PVT) that is phase stable and mechanically robust at any blend ratio. The gamma-ray light yield increases nearly linearly across the composition range, to 16 400 photons/MeV at 75 wt.% P2. These materials are also capable of performing γ/n pulse shape discrimination (PSD), and between 20% and 50% P2 loading the PSD quality is competitive with that of commercially available plastic scintillators. The 137Cs scintillation rise and decay times are sensitive to P2 loading and approach the values for 'pure' P2. Additionally, the radiation detection performance of P2-PVT blends can be made stable in 60 °C air for at least 1.5 months with the application of a thin film of poly(vinyl alcohol) to the scintillator surfaces.

More Details

Noise and error analysis and optimization in particle-based kinetic plasma simulations

Journal of Computational Physics

Evstatiev, E.G.; Finn, J.M.; Shadwick, B.A.; Hengartner, N.

In this paper we analyze the noise in macro-particle methods used in plasma physics and fluid dynamics, leading to approaches for minimizing the total error, focusing on electrostatic models in one dimension. We begin by describing kernel density estimation for continuous values of the spatial variable x, expressing the kernel in a form in which its shape and width are represented separately. The covariance matrix C(x,y) of the noise in the density is computed, first for uniform true density. The bandwidth of the covariance matrix is related to the width of the kernel. A feature that stands out is the presence of constant negative terms in the elements of the covariance matrix both on and off-diagonal. These negative correlations are related to the fact that the total number of particles is fixed at each time step; they also lead to the property ∫C(x,y)dy=0. We investigate the effect of these negative correlations on the electric field computed by Gauss's law, finding that the noise in the electric field is related to a process called the Ornstein-Uhlenbeck bridge, leading to a covariance matrix of the electric field with variance significantly reduced relative to that of a Brownian process. For non-constant density, ρ(x), still with continuous x, we analyze the total error in the density estimation and discuss it in terms of bias-variance optimization (BVO). For some characteristic length l, determined by the density and its second derivative, and kernel width h, having too few particles within h leads to too much variance; for h that is large relative to l, there is too much smoothing of the density. The optimum between these two limits is found by BVO. For kernels of the same width, it is shown that this optimum (minimum) is weakly sensitive to the kernel shape. We repeat the analysis for x discretized on a grid. In this case the charge deposition rule is determined by a particle shape. An important property to be respected in the discrete system is the exact preservation of total charge on the grid; this property is necessary to ensure that the electric field is equal at both ends, consistent with periodic boundary conditions. We find that if the particle shapes satisfy a partition of unity property, the particle charge deposited on the grid is conserved exactly. Further, if the particle shape is expressed as the convolution of a kernel with another kernel that satisfies the partition of unity, then the particle shape obeys the partition of unity. This property holds for kernels of arbitrary width, including widths that are not integer multiples of the grid spacing. We show results relaxing the approximations used to do BVO optimization analytically, by doing numerical computations of the total error as a function of the kernel width, on a grid in x. The comparison between numerical and analytical results shows good agreement over a range of particle shapes. We discuss the practical implications of our results, including the criteria for design and implementation of computationally efficient particle shapes that take advantage of the developed theory.
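
To make the partition-of-unity point concrete, the following small sketch (invented, not the paper's code) checks that the linear cloud-in-cell particle shape deposits weights on a unit-spacing grid that sum to exactly one for any particle position, which is what guarantees exact conservation of the deposited charge on the grid.

```python
import numpy as np

def linear_shape(xi):
    """Linear (cloud-in-cell) particle shape of unit width: S(xi) = 1 - |xi| for |xi| < 1."""
    return np.clip(1.0 - np.abs(xi), 0.0, None)

# Partition-of-unity check: for any particle position x, the shape evaluated at
# all grid points (spacing 1) must sum to exactly 1.
grid = np.arange(-5, 6)                        # grid point indices
for x in (0.0, 0.25, 0.5, 0.9, 3.14159 % 1):   # arbitrary particle positions
    weights = linear_shape(x - grid)
    print(f"x = {x:.5f}: sum of deposition weights = {weights.sum():.12f}")
```

Convolving this shape with another kernel of arbitrary width preserves the property, which is the mechanism the paper uses to build wider, smoother shapes that still conserve charge exactly.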

More Details

Constitutive Model Development for Aging Polymer Encapsulants (ASC P&EM FY2021 L2 Milestone 7836)

Cundiff, K.N.; Long, Kevin N.; Kropka, Jamie M.; Carroll, Shianne; Groves, Catherine

This SAND report fulfills the completion requirements for the ASC Physics and Engineering Modeling Level 2 Milestone 7836 during Fiscal Year 2021. The Sandia Simplified potential energy clock (SPEC) non-linear viscoelastic constitutive model was developed to predict a whole host of polymer glass physical behaviors in order to provide a tool to assess the effects of stress on these materials over their lifecycle. Polymer glasses are used extensively in applications such as electronics packaging, where encapsulants and adhesives can be critical to device performance. In this work, the focus is on assessing the performance of the model in predicting material evolution associated with long-term physical aging, an area in which the model has not been fully vetted. These predictions are key to utilizing models to help demonstrate electronics packaging component reliability over decades-long service lives, a task that is very costly and time-consuming to execute experimentally. The initiating hypothesis for the work was that a model calibration process can be defined that enables confidence in physical aging predictions under ND-relevant environments and timescales without sacrificing other predictive capabilities. To test the hypothesis, an extensive suite of calibration and aging data was assembled from a combination of prior work and collaborating projects (Aging and Lifetimes as well as the DoD Joint Munitions Program) for two mission-relevant epoxy encapsulants, 828DGEBA/DEA and 828DGEBA/T403. Multiple model calibration processes were developed and evaluated against the entire set of data for each material. A qualitative assessment of each calibration's ability to predict the wide range of aging responses was key to ranking the calibrations against each other. During this evaluation, predictions that were identified as non-physical, i.e., demonstrated something that was qualitatively different from known material behavior, were heavily weighted against the calibration performance. Thus, unphysical predictions for one aspect of aging response could generate a lower overall rating for a calibration process even if that process generated better quantitative predictions for another aspect of aging response. This assurance that all predictions are qualitatively correct is important to the overall aim of utilizing the model to predict residual stress evolution, which will depend on the interplay amongst the different material aging responses. The DSC-focused calibration procedure generated the best all-around aging predictions for both materials, demonstrating material models that can qualitatively predict the whole host of different physical aging responses that have been measured. This step forward in predictive capability comes from an unanticipated source, utilization of calorimetry measurements to specify model parameters. The DSC-focused calibration technique performed better than compression-focused techniques that more heavily weigh measurements more closely related to the structural responses to be predicted. Indeed, the DSC-focused calibration procedure was only possible due to recent incorporation of the enthalpy and heat capacity features into SPEC that was newly verified during this L2 milestone. Fundamentally similar aspects of the two material model calibrations as well as parametric studies to assess sensitivities of the aging predictions are discussed within the report. A perspective on the next steps to the overall goal of residual stress evolution predictions under stockpile conditions closes the report.

More Details

A Fast-Cycle Charge Noise Measurement for Better Qubits

Lewis, Rupert M.; Kindel, William; Harris, Charles T.; Del Skinner Ramos, Suelicarmen

Defects in materials are an ongoing challenge for quantum bits, so-called qubits. Solid-state qubits—both spins in semiconductors and superconducting qubits—suffer from losses and noise caused by two-level-system (TLS) defects thought to reside on surfaces and in amorphous materials. Understanding and reducing the number of such defects is an ongoing challenge to the field. Superconducting resonators couple to TLS defects and provide a handle that can be used to better understand TLS. We develop noise measurements of superconducting resonators at very low temperatures (20 mK) relative to the resonant frequency and at low powers, down to single-photon occupation.

More Details

Development and Use of an Ultra-High Resolution Electron Scattering Apparatus

Frank, Jonathan H.; Foulk, James W.; Jana, Irina; Huang, Erxiong; Chandler, David

In this LDRD project, we developed a versatile capability for high-resolution measurements of electron scattering processes in gas-phase molecules, such as ionization, dissociation, and electron attachment/detachment. This apparatus is designed to advance fundamental understanding of these processes and to inform predictions of plasmas associated with applications such as plasma-assisted combustion, neutron generation, re-entry vehicles, and arcing that are critical to national security. We use innovative coupling of electron-generation and electron-imaging techniques that leverages Sandia’s expertise in ion/electron imaging methods. Velocity map imaging provides a measure of the kinetic energies of electrons or ion products from electron scattering in an atomic or molecular beam. We designed, constructed, and tested the apparatus. Tests include dissociative electron attachment to O2 and SO2, as well as a new method for studying laser-initiated plasmas. This capability sets the stage for new studies in dynamics of electron scattering processes, including scattering from excited-state atoms and molecules.

More Details

Rock Valley Accelerated Weight Drop Preliminary P-wave Tomographic Model

Preston, Leiph; Harding, Jennifer L.

An active source experiment using an accelerated weight drop was conducted in Rock Valley, Nevada National Security Site, during the spring of 2021 in order to characterize the shallow seismic structure of the region. P-wave first arrival travel times picked from this experiment were used to construct a preliminary 3-D compressional wave speed model over an area that is roughly 4 km wide east-west and 8 km north-south to a depth of about 500-600 m below the surface, but with primary data concentration along the transects of the experimental lines. The preliminary model shows good correlation with basic geology and surface features, but geological interpretation is not the focus of this report. We describe the methods used in the tomographic inversion of the data and show results from this preliminary P-wave model.

More Details

Seismic Source Modeling Software Enhancements (FY21)

Preston, Leiph; Poppeliers, Christian; Eliassi, Mehdi

Seismic source modeling allows researchers both to simulate how a source that induces seismic waves interacts with the Earth to produce observed seismograms and, inversely, to infer what the time histories, sizes, and force distributions were for a seismic source given observed seismograms. In this report, we discuss improvements made in FY21 to our software as they apply to both the forward and inverse seismic source modeling problems. For the forward portion of the problem, we have added the ability to use full 3-D nonlinear simulations by implementing 3-D time-varying boundary conditions within Sandia’s linear seismic code Parelasti. Secondly, on the inverse source modeling side, we have developed software that allows us to invert seismic gradiometer-derived observations in conjunction with standard translational motion seismic data to infer properties of the source that may improve characterization in certain circumstances. First, we describe the basic theory behind each software enhancement and then demonstrate the software in action with some simple examples.

More Details

Thermal Infrared Detectors: expanding performance limits using ultrafast electron microscopy

Talin, Albert A.; Ellis, Scott; Bartelt, Norman C.; Leonard, Francois; Perez, Christopher; Celio, Km; Fuller, Elliot J.; Hughart, David R.; Garland, D.; Marinella, Matthew; Michael, Joseph R.; Chandler, David; Young, Steve M.; Smith, Sean; Kumar, Suhas

This project aimed to identify the performance-limiting mechanisms in mid- to far-infrared (IR) sensors by probing photogenerated free carrier dynamics in model detector materials using scanning ultrafast electron microscopy (SUEM). SUEM is a recently developed method based on using ultrafast electron pulses in combination with optical excitations in a pump-probe configuration to examine charge dynamics with high spatial and temporal resolution and without the need for microfabrication. Five material systems were examined using SUEM in this project: polycrystalline lead zirconium titanate (a pyroelectric), polycrystalline vanadium dioxide (a bolometric material), GaAs (near IR), InAs (mid IR), and the Si/SiO2 system as a prototypical system for interface charge dynamics. The report provides detailed results for the Si/SiO2 and the lead zirconium titanate systems.

More Details

Locating Seismic Events with Local-Distance Data

Davenport, Kathy

As the seismic monitoring community advances toward detecting, identifying, and locating ever-smaller natural and anthropogenic events, the need is constantly increasing for higher resolution, higher fidelity data, models, and methods for accurately characterizing events. Local-distance seismic data provide robust constraints on event locations, but also introduce complexity due to the significant geologic heterogeneity of the Earth’s crust and upper mantle, and the relative sparsity of data that often occurs with small events recorded on regional seismic networks. Identifying the critical characteristics for improving local-scale event locations and the factors that impact location accuracy and reliability is an ongoing challenge for the seismic community. Using Utah as a test case, we examine three data sets of varying duration, finesse, and magnitude to investigate the effects of local earth structure and modeling parameters on local-distance event location precision and accuracy. We observe that the most critical elements controlling relocation precision are azimuthal coverage and local-scale velocity structure, with tradeoffs based on event depth, type, location, and range.

More Details

Named Data Networking for DER Cybersecurity

Chavez, Adrian R.; Cordeiro, Patricia G.; Huang, Gary; Kitsos, Panayioti; La Pay, Trevor; Short, Austin; Summers, Adam

We present our research findings on the novel Named Data Networking (NDN) protocol. In this work, we defined key attack scenarios for possible exploitation and detailed software security testing procedures to evaluate the security of the NDN software. This work was done in the context of distributed energy resources (DER). The software security testing included an execution of unit tests and static code analyses to better understand the software rigor and the security that has been implemented. The results from the penetration testing are presented. Recommendations are discussed to provide additional defense for secure end-to-end NDN communications.

More Details

Direct Subsurface Measurements through Precise Micro Drilling

Su, Jiann-Cherng; Bettin, Giorgia; Buerger, Stephen P.; Rittikaidachar, Michal; Hobart, Clinton; Slightam, Jonathon E.; Mcbrayer, Kepra L.; Gonzalez, Levi M.; Pope, Joseph S.; Foris, Adam J.; Bruss, Kathryn; Kim, Raymond; Mazumdar, Anirban

Wellbore integrity is a significant problem in the U.S. and worldwide, which has serious adverse environmental and energy security consequences. Wells are constructed with a cement barrier designed to last about 50 years. Indirect measurements and models are commonly used to identify wellbore damage and leakage, often producing subjective and even erroneous results. The research presented herein focuses on new technologies to improve monitoring and detection of wellbore failures (leaks) by developing a multi-step machine learning approach to localize two types of thermal defects within a wellbore model, a prototype mechatronic system for automatically drilling small diameter holes of arbitrary depth to monitor the integrity of oil and gas wells in situ, and benchtop testing and analyses to support the development of an autonomous real-time diagnostic tool to enable sensor emplacement for monitoring wellbore integrity. Each technology was supported by experimental results. This research has provided tools to aid in the detection of wellbore leaks and significantly enhanced our understanding of the interaction between small-hole drilling and wellbore materials.

More Details

Critical Infrastructure Decision-Making under Long-Term Climate Hazard Uncertainty: The Need for an Integrated, Multidisciplinary Approach

Staid, Andrea; Fleming, Elizabeth S.; Gunda, Thushara; Jackson, Nicole D.

U.S. critical infrastructure assets are often designed to operate for decades, and yet long-term planning practices have historically ignored climate change. With the current pace of changing operational conditions and severe weather hazards, research is needed to improve our ability to translate complex, uncertain risk assessment data into actionable inputs that improve decision-making for infrastructure planning. Decisions made today need to explicitly account for climate change: the chronic stressors, the evolution of severe weather events, and the wide-ranging uncertainties. If done well, decision-making with climate in mind will result in increased resilience and decreased impacts to our lives, economies, and national security. We present a three-tier approach to create the research products needed in this space, bringing together climate projection data, severe weather event modeling, asset-level impacts, and context-specific decision constraints and requirements. At each step, it is crucial to capture uncertainties and to communicate those uncertainties to decision-makers. While many components of the necessary research are mature (e.g., climate projection data), there has been little effort to develop proven tools for long-term planning in this space. The combination of chronic and acute stressors, spatial and temporal uncertainties, and interdependencies among infrastructure sectors coalesces into a complex decision space. By applying known methods from decision science and data analysis, we can demonstrate the value of an interdisciplinary approach to climate-hazard decision-making for long-term infrastructure planning.

More Details

Effect of Microstructural Bands on the Localized Corrosion of Laser Surface-Melted 316L Stainless Steel

Corrosion

Hwa, Yoon; Kumai, Christopher S.; Yang, Nancy; Yee, Joshua K.; Devine, Thomas M.

The localized corrosion of laser surface-melted (LSM) 316L stainless steel is investigated using potentiodynamic anodic polarization in 0.1 M HCl combined with microscopic examination of the initiation and propagation of localized corrosion. The pitting potential of LSM 316L is significantly lower than that of wrought 316L. The LSM microstructure is highly banded as a consequence of the high laser power density and high linear energy density. The bands are composed of zones of changing modes of solidification, cycling between very narrow regions of primary austenite solidification and very wide regions of primary ferrite solidification. Pits initiate at the outer edge of each band, where the mode of solidification is primary austenite plane-front or cellular solidification. The primary austenite regions have low chromium concentration (and possibly low molybdenum concentration), which explains their susceptibility to pitting corrosion. The ferrite is enriched in chromium, which explains the absence of pitting in the primary ferrite regions. The presence of the low-chromium regions of primary austenite solidification explains the lower pitting resistance of LSM 316L relative to wrought 316L. The influence of banding on localized corrosion is applicable to other rapid-solidification processes such as additive manufacturing.
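
As background on how a pitting potential is typically read off a potentiodynamic scan, the sketch below applies one common convention (the potential at which the anodic current density first exceeds a fixed threshold). The data are synthetic and the procedure is illustrative only, not the authors' exact analysis.

    import numpy as np

    def pitting_potential(potential_V, current_A_cm2, threshold=1e-4):
        """Potential at which current density first exceeds `threshold` (A/cm^2)."""
        above = current_A_cm2 > threshold
        if not above.any():
            return None  # passive over the whole scanned range
        return potential_V[np.argmax(above)]  # first crossing

    # Synthetic scan: passive current ~1 uA/cm^2, current rises above ~ +0.30 V.
    E = np.linspace(-0.3, 0.6, 901)
    i = 1e-6 + 1e-2 * np.clip(E - 0.30, 0.0, None) ** 2
    print("E_pit ~", round(float(pitting_potential(E, i)), 2), "V")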

More Details