Publications
Multi-Scale Modeling of Electrically Conductive Adhesive Interconnect Reliability
Hartley, James Y.; Bosco, Nick; Springer, Martin
Abstract not provided.
Advanced Membranes for Flow Batteries
Abstract not provided.
High-temperature kinetics of thermal runaway reactions
Kurzawski, John C.; Shurtz, Randy C.; Hewson, John C.
Abstract not provided.
Preliminary Designs for a Freeze-Safe Valve Stem Heat Pipe
Abstract not provided.
Site Environmental Report for Sandia National Laboratories California (2019)
Sandia National Laboratories, California (SNL/CA) is a Department of Energy (DOE) facility. The management and operations of the facility are under a contract with the DOE's National Nuclear Security Administration (NNSA). On May 1, 2017, the name of the management and operating contractor changed from Sandia Corporation to National Technology & Engineering Solutions of Sandia, LLC (NTESS). The DOE, NNSA, Sandia Field Office administers the contract and oversees contractor operations at the site. DOE and its management and operating contractor for Sandia are committed to safeguarding environmental protection, compliance, and sustainability and to ensuring the validity and accuracy of the monitoring data presented in this Annual Site Environmental Report. This Site Environmental Report for 2019 was prepared in accordance with DOE Order 231.1B, Environment, Safety and Health Reporting (DOE 2012). The report provides a summary of environmental monitoring information and compliance activities that occurred at SNL/CA during calendar year 2019, unless noted otherwise. General site and environmental program information is also included.
Volt-var curve reactive power control requirements and risks for feeders with distributed roof-top photovoltaic systems
Energies
Jones, Christian B.; Lave, Matt; Reno, Matthew J.; Darbali-Zamora, Rachid; Summers, Adam; Hossain-McKenzie, Shamina S.
The benefits and risks associated with Volt-Var Curve (VVC) control for management of voltages in electric feeders with distributed, roof-top photovoltaic (PV) systems can be defined using a stochastic hosting capacity analysis methodology. Although past work showed that a PV inverter's reactive power can improve grid voltages for large PV installations, this study adds to that research by evaluating the control method's impact (both good and bad) when deployed throughout the feeder within small, distributed PV systems. The stochastic hosting capacity simulation effort iterated through hundreds of load and PV generation scenarios and various control types. The simulations also tested the impact of VVCs with tampered settings to understand the potential risks associated with a cyber-attack on all of the PV inverters scattered throughout a feeder. The simulation effort found that VVC plays an insignificant role in managing the voltage when deployed in distributed roof-top PV inverters. This type of integration strategy will therefore result in little to no harm when subjected to a successful cyber-attack that alters the VVC settings.
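The shape of a volt-var curve, with its deadband and droop regions, can be sketched in a few lines. The breakpoints below are illustrative IEEE 1547-style values, not the settings evaluated in this study:

```python
def volt_var(v_pu, curve=None):
    """Piecewise-linear volt-var curve: maps per-unit voltage to a
    reactive-power command as a fraction of inverter rating
    (positive = injecting vars, negative = absorbing)."""
    if curve is None:
        # Illustrative breakpoints (assumed, not from this study):
        # full injection below 0.95 pu, deadband 0.98-1.02 pu,
        # full absorption above 1.05 pu.
        curve = [(0.92, 0.44), (0.95, 0.44), (0.98, 0.0),
                 (1.02, 0.0), (1.05, -0.44), (1.08, -0.44)]
    if v_pu <= curve[0][0]:
        return curve[0][1]
    if v_pu >= curve[-1][0]:
        return curve[-1][1]
    # Linear interpolation between adjacent breakpoints
    for (v0, q0), (v1, q1) in zip(curve, curve[1:]):
        if v0 <= v_pu <= v1:
            return q0 + (q1 - q0) * (v_pu - v0) / (v1 - v0)
```

The tampering risk studied here amounts to altering `curve`: flipping the signs of the Q values, for example, would make every inverter push voltage in the wrong direction.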
Recipe for coating ceramic blades for ion trapping
Stick, Daniel L.; Casias, Adrian L.
The first batches of ion traps patterned and coated were processed per the standard three-step clean, air fire, and metallization processes. Up until this point, the standard process had been used to metallize and pattern ceramic ion traps without fail, but the third or fourth lot processed this way exhibited poorly adhering metallization. Beginning around the fourth batch, and consistently after the fifth, the ceramic ion traps received arrived with an unknown contamination that does not come off in the standard three-step clean (Lenium vapor degreaser, acetone, IPA) and air fire (860°C for 1 hour), even though this process removes the vast majority of contamination for most ceramic metallization. This is highly unusual, and the HF + boiling H2O2 treatment required to clean the parts is extreme. The contamination was never identified and is difficult to remove effectively. Standard as-fired ceramic should be very easy to clean, since it is fired at temperatures greater than 1400°C and little contamination should survive those temperatures; an intermediate step or process must therefore be imparting the contamination. It is likely a polishing compound or residual polishing contaminant, but it is not easily distinguishable visually until after metallization. The halo marks observed on parts might be fingerprints (less likely) or polishing marks (more likely), since metallization typically does not cover or hide damage or contamination but rather accentuates it. Blotchy appearances in the metallization usually indicated an adhesion issue. Because of the fragility of the parts (yield loss due to handling) and the difficulty of identifying the contamination during cleaning, we have taken the conservative approach of HF + H2O2 cleaning for all batches after the contamination and adhesion issues were identified.
PNT Resilience RFI Response
Brashar, Connor L.; Haydon, Tucker; Luong, Anh
The use of the Global Positioning System (GPS) is a fundamental requirement for most navigation systems today, and this heavy reliance means that denial of GPS service, whether brief or extended, can pose a significant risk to modern navigation. There is an urgent need for enabling, high-accuracy navigation technologies that can operate without GPS. Ideally, these solutions should be able to initialize in a completely GPS-free environment and continue to navigate even through challenging scenarios. The increasing risk posed to GPS means that trust in this platform is waning, and solutions are required. A future navigator should leverage GPS whenever possible and be capable of identifying and responding to risks while maintaining mission accuracy needs. In the absence of GPS, fully alternative navigation (altnav) technologies are required. This report presents an introductory view of altnav for GPS-impaired and contested environments. Various technologies are collected, presented, and evaluated as potential solutions, providing a wide snapshot of currently available technologies with a first-order summary of their potential. While this report attempts to be as broad and complete as possible, this is a quickly evolving field.
FY20 ASC IC L2 Milestone 7180: Performance Portability of SIERRA Mechanics Applications to ATS-1 and ATS-2. Executive Summary
Mosby, Matthew D.; Clausen, Jonathan; Crane, Nathan K.; Drake, Richard R.; Thomas, Jesse D.; Williams, Alan B.; Pierson, Kendall H.
The overall goal of this work was to accelerate simulations supporting the nuclear deterrence (ND) mission through improved performance of key algorithms in the ASC IC Sierra multi-physics application suite. This work focused on porting and optimizing algorithms for the graphics processing units (GPUs) on the second ASC advanced technology system (ATS-2), while maintaining or improving performance on commodity technology systems (CTS) and ATS-1. Furthermore, these algorithmic developments used the ASC-developed Kokkos performance portability abstraction library to maintain high performance across platforms using identical code and to enable sustainable, reduced-cost migration and performance optimization for emerging hardware.
Water Network Tool for Resilience (WNTR). User Manual, Version 0.2.3
Klise, Katherine; Hart, David; Bynum, Michael; Hogge, Joseph W.; Haxton, Terranna; Murray, Regan; Burkhardt, Jonathan
The Water Network Tool for Resilience (WNTR, pronounced winter) is a Python package designed to simulate and analyze resilience of water distribution networks. Here, a network refers to the collection of pipes, pumps, valves, junctions, tanks, and reservoirs that make up a water distribution system. WNTR has an application programming interface (API) that is flexible and allows for changes to the network structure and operations, along with simulation of disruptive incidents and recovery actions. WNTR is based upon EPANET, which is a tool to simulate the movement and fate of drinking water constituents within distribution systems. Users are encouraged to be familiar with the use of EPANET and/or should have background knowledge in hydraulics and pressurized pipe network modeling before using WNTR. EPANET has a graphical user interface that might be a useful tool to facilitate the visualization of the network and the associated analysis results. Information on EPANET can be found at https://www.epa.gov/water-research/epanet. WNTR is compatible with EPANET 2.00.12 [Ross00]. In addition, users should have experience using Python, including the installation of additional Python packages. General information on Python can be found at https://www.python.org/.
Quantum Super-resolution Bioimaging using Massively Entangled Multimode Squeezed Light
This report presents a new method for realizing super-resolution quantum imaging using massively entangled multimode squeezed light (MEMSL). Each branch of the entangled multimode light interacts with the sample and bears the spatially varying optical phase delay. When imaging optics with finite pupil sizes are used, information is lost. Thanks to analyticity in the Fourier plane, a noiseless measurement would recover the lost information and accomplish super-resolution imaging beating the Rayleigh diffraction limit. It is proven rigorously, in a fully quantum formalism, that (1) such information recovery is possible and (2) the recovery can be accomplished with far fewer resources when MEMSL is used than are needed in any non-entangled or non-squeezed classical imaging method. Furthermore, the effect of optical loss in the imaging system, which degrades imaging performance, is also rigorously analyzed and presented. Several bioimaging applications that could benefit tremendously from the proposed quantum imaging scheme are also suggested.
Features Events and Processes Relevant to DPC Disposal Criticality Analysis
Alsaed, Halim; Price, Laura L.
The Department of Energy is evaluating the technical feasibility of disposal of spent nuclear fuel in dual-purpose canisters in various geologies. As part of ongoing research and development, the effect of potential post-closure criticality events on repository performance is being studied. Many different features, events, and processes (FEPs) could affect the potential for criticality or the extent of a criticality event. Additionally, a criticality event could affect other FEPs. This report uses existing lists of FEPs as a starting point to evaluate the FEPs that could affect or be affected by an in-package criticality event. The evaluation indicates that most of the FEPs associated with the waste form, the waste, or the engineered barrier system (EBS) have some effect on post-closure criticality and/or are affected by the consequences of post-closure criticality. In addition, FEPs not previously considered are identified for further development.
ALEGRA/Sceptre Code Coupling [Brief]
Researchers at Sandia have developed an advanced radiation hydrodynamic simulation capability by coupling the ALEGRA and Sceptre codes.
Application and Certification of Comparative Vacuum Monitoring Sensors for Structural Health Monitoring of 737 Wing Box Fittings
Multi-site fatigue damage, hidden cracks in hard-to-reach locations, disbonded joints, erosion, impact, and corrosion are among the major flaws encountered in today's extensive fleet of aging aircraft and space vehicles. The use of in-situ sensors for real-time health monitoring of aircraft structures is a viable option to overcome inspection impediments stemming from accessibility limitations, complex geometries, and the location and depth of hidden damage. Reliable structural health monitoring (SHM) systems can automatically process data, assess structural condition, and signal the need for human intervention. Prevention of unexpected flaw growth and structural failure can be improved if on-board health monitoring systems are used to continuously assess structural integrity. Such systems are able to detect incipient damage before catastrophic failure occurs. Condition-based maintenance practices could be substituted for the current time-based maintenance approach. Other advantages of on-board distributed sensor systems are that they can eliminate costly, and potentially damaging, disassembly; improve sensitivity through optimum placement of sensors; and decrease maintenance costs by eliminating more time-consuming manual inspections. This report presents a Sandia Labs-aviation industry effort to move SHM into routine use for aircraft maintenance. This program addressed formal SHM technology validation and certification issues so that the full spectrum of concerns, including design, deployment, performance, and certification, were appropriately considered. The Airworthiness Assurance NDI Validation Center (AANC) at Sandia Labs, in conjunction with Boeing, Delta Air Lines, Structural Monitoring Systems Ltd., Anodyne Electronics Manufacturing Corp.,
and the Federal Aviation Administration (FAA) carried out a certification program to formally introduce Comparative Vacuum Monitoring (CVM) as a structural health monitoring solution for a specific aircraft wing box application. Validation tasks were designed to address the SHM equipment, the health monitoring task, the resolution required, the sensor interrogation procedures, the conditions under which the monitoring will occur, the potential inspector population, adoption of CVM into an airline maintenance program, and the document revisions necessary to allow for routine use of CVM as an alternate means of performing periodic structural inspections. To carry out the validation process, knowledge of aircraft maintenance practices was coupled with an unbiased, independent evaluation. Sandia Labs designed, implemented, and analyzed the results from a focused and statistically relevant experimental effort to quantify the reliability of the CVM system applied to the Boeing 737 wing box fitting application. All factors that affect SHM sensitivity were included in this program: flaw size, shape, orientation, and location relative to the sensors, as well as operational and environmental variables. Statistical methods were applied to performance data to derive Probability of Detection (POD) values for CVM sensors in a manner that agrees with current nondestructive inspection (NDI) validation requirements and is acceptable to both the aviation industry and regulatory bodies.
This report presents the use of several different statistical methods, some adapted from NDI performance assessments and some proposed to address the unique nature of damage detection via SHM systems, and discusses how they can converge to produce a confident quantification of SHM performance. An important element in developing SHM validation processes is a clear understanding of the regulatory measures needed to adopt SHM solutions, along with knowledge of the structural and maintenance characteristics that may impact the operational performance of an SHM system. This report describes the major elements of an SHM validation approach and differentiates the SHM elements from those found in NDI validation. The activities conducted in this program demonstrated the feasibility of routine SHM usage in general, and CVM in particular, for the application selected. They also helped establish an optimum OEM-airline-regulator process and determined how to safely adopt SHM solutions. This formal SHM validation will allow aircraft manufacturers and airlines to confidently make informed decisions about the proper utilization of CVM technology. It will also streamline the regulatory actions and formal certification measures needed to assure the safe application of SHM solutions.
Inverting infrasound data for the seismoacoustic source time functions and surface spall at the Source Physics Experiments Phase II: Dry Alluvium Geology
Poppeliers, Christian; Preston, Leiph
This report presents the infrasound data recorded as part of the Source Physics Experiment - Phase 2, Dry Alluvium Geology. This experiment, also known colloquially as DAG, consisted of four underground chemical explosions at the Nevada National Security Site. We focus our analysis on only the fourth explosion (DAG-4), as we determined that this was the only event that produced clear source-generated infrasound energy as recorded by the DAG sensors. We analyze the data using two inversion methods. The first method is designed to estimate the point-source seismoacoustic source time functions, and the second is designed to estimate the first-order characteristics (e.g., horizontal dimensions and maximum amplitude) of the actual spall surface. For both analysis methods, we are able to fit the data reasonably well under various assumptions about the source model. The estimated seismoacoustic source appears to be a combination of a buried isotropic explosion with a maximum amplitude of ~2 × 10⁹ N·m and a vertically oriented force applied to the Earth's surface with a maximum amplitude of 4 × 10⁷ N. We use the vertically oriented force to simulate surface spall. The estimated spall surface has an approximate radius of ~40 m with a maximum acceleration magnitude in the range of 0.8 to 1.5 m/s². These estimates are approximately consistent with the measured surface acceleration at the site.
Evaluation of Component Reliability in Photovoltaic Systems using Field Failure Statistics
Gunda, Thushara; Homan, Rachel
Ongoing operations and maintenance (O&M) are needed to ensure photovoltaic (PV) systems continue to operate and meet production targets over the lifecycle of the system. Although average costs to operate and maintain PV systems have been decreasing over time, reported costs can vary significantly at the plant level. Estimating O&M costs accurately is important for informing financial planning and tracking activities, and subsequently lowering the levelized cost of electricity (LCOE) of PV systems. This report describes a methodology for improving O&M planning estimates by using empirically-derived failure statistics to capture component reliability in the field. The report also summarizes failure patterns observed for specific PV components and local environmental conditions observed in Sandia's PV Reliability, Operations & Maintenance (PVROM) database, a collection of field records across 800+ systems in the U.S. Where system-specific or fleet-specific data are lacking, PVROM-derived failure distribution values can be used to inform cost modeling and other reliability analyses to evaluate opportunities for performance improvements.
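At their simplest, the empirically derived failure statistics described above reduce to a rate per component-year plus a survival model. A toy sketch with hypothetical numbers (not PVROM data):

```python
import math

def failure_rate(n_failures, n_components, years):
    """Empirical failure rate per component-year from field records."""
    return n_failures / (n_components * years)

def survival_prob(rate, t_years):
    """Exponential (constant-hazard) survival probability, a common
    first-order reliability model when only a rate is available."""
    return math.exp(-rate * t_years)

# Hypothetical field data: 18 inverter failures observed across
# 40 systems over 5 years.
rate = failure_rate(18, 40, 5)   # failures per inverter-year
p5 = survival_prob(rate, 5)      # chance an inverter survives 5 years
```

Where enough failure records exist, a fitted distribution (e.g., Weibull) would replace the constant-hazard assumption; the exponential form here is only a first-order stand-in.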
Multimodal Deep Learning for Flaw Detection in Software Programs
Heidbrink, Scott; Rodhouse, Kathryn N.; Dunlavy, Daniel M.
We explore the use of multiple deep learning models for detecting flaws in software programs. Current, standard approaches for flaw detection rely on a single representation of a software program (e.g., source code or a program binary). We illustrate that, by using techniques from multimodal deep learning, we can simultaneously leverage multiple representations of software programs to improve flaw detection over single representation analyses. Specifically, we adapt three deep learning models from the multimodal learning literature for use in flaw detection and demonstrate how these models outperform traditional deep learning models. We present results on detecting software flaws using the Juliet Test Suite and Linux Kernel.
Securing machine learning models
Skryzalin, Jacek; Goss, Kenneth; Jackson, Benjamin C.
We discuss the challenges and approaches to securing numeric computation against adversaries who may want to discover hidden parameters or values used by the algorithm. We discuss techniques that are both cryptographic and non-cryptographic in nature. Cryptographic solutions are either not yet algorithmically feasible or currently require more computational resources than are reasonable to have in a deployed setting. Non-cryptographic solutions may be computationally faster, but these cannot stop a determined adversary. For one such non-cryptographic solution, mixed Boolean arithmetic, we suggest a number of improvements that may protect the obfuscated calculation against current automated deobfuscation methods.
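Mixed Boolean arithmetic hides a plain arithmetic operation behind an equivalent mix of bitwise and arithmetic operators. A minimal sketch using the textbook identity x + y = (x ^ y) + 2(x & y), which illustrates the general idea rather than the specific improvements proposed in this work:

```python
def add_plain(x, y):
    return x + y

def add_mba(x, y):
    # x + y rewritten via the identity x + y = (x ^ y) + 2*(x & y):
    # XOR produces the bitwise sum without carries, and AND (shifted
    # left one bit) reinserts the carries.
    return (x ^ y) + ((x & y) << 1)
```

An automated deobfuscator that pattern-matches this rewrite recovers the addition, which is why hardening such transformations against current automated deobfuscation methods matters.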
Sandia's Research in Support of COVID-19 Pandemic Response: Computing and Information Sciences
Bauer, Travis L.; Beyeler, Walter E.; Finley, Patrick D.; Jeffers, Robert; Laird, Carl; Makvandi, Monear; Outkin, Alexander V.; Safta, Cosmin; Simonson, Katherine M.
This report summarizes the goals and findings of eight research projects conducted under the Computing and Information Sciences (CIS) Research Foundation and related to the COVID-19 pandemic. The projects were all formulated in response to Sandia's call for proposals for rapid-response research with the potential to have a positive impact on the global health emergency. Six of the projects in the CIS portfolio focused on modeling various facets of disease spread, resource requirements, testing programs, and economic impact. The two remaining projects examined the use of web-crawlers and text analytics to allow rapid identification of articles relevant to specific technical questions, and categorization of the reliability of content. The portfolio has collectively produced methods and findings that are being applied by a range of state, regional, and national entities to support enhanced understanding and prediction of the pandemic's spread and its impacts.
Performance of CsI:Tl Crystal with a Spectrum Matching Photomultiplier Tube
Yang, Pin; Foulk, James W.; Harmon, Charles D.
This report documents an effort to improve the energy resolution of a thallium-doped cesium iodide (CsI:Tl) scintillator paired with a spectrum-matching photomultiplier tube (PMT). A comparison of the pulse height spectra from thallium-doped (CsI:Tl) and sodium-doped (CsI:Na) single crystals with PMTs of different spectral responses was performed. Results show that the energy resolution of the detector improves only 0.5% at room temperature when these scintillators are coupled with a spectrum-matching PMT. With a spectrum-matching PMT, the best energy resolutions are 7.39% and 7.88% for the CsI:Tl and CsI:Na scintillators, respectively. The improvement is primarily attributed to better photon statistics from the larger number of photons (N) detected by the spectrum-matching PMT. Other factors that can affect the energy resolution, such as optical quantum yield and the non-proportionality of the CsI:Tl and CsI:Na crystals, were also studied and reported. The results indicate that although the use of a spectrum-matching PMT enhances the photon statistics, it also exacerbates the non-proportionality response. Consequently, the improvement in energy resolution promised by photon statistics alone was not fully realized.
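The photon-statistics argument can be made concrete: the statistical contribution to FWHM energy resolution scales as 2.355/√N, so a modest gain in detected photons yields only a sub-percent improvement. A back-of-envelope sketch (the photon counts below are illustrative, not measurements from this work):

```python
import math

def statistical_resolution(n_photons):
    """FWHM energy resolution (as a fraction) from Poisson photon
    statistics alone: R = 2.355 * sqrt(N) / N = 2.355 / sqrt(N)."""
    return 2.355 / math.sqrt(n_photons)

# Illustrative: suppose a spectrum-matching PMT detects 20% more photons.
r_before = statistical_resolution(1000)
r_after = statistical_resolution(1200)
improvement = r_before - r_after   # sub-percent, consistent with ~0.5%-scale gains
```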
Resiliency of Degraded Built Infrastructure
Infrastructure resiliency depends on the ability of infrastructure systems to withstand, adapt to, and recover from chronic and extreme stresses. In this white paper, we address the resiliency of infrastructure assets and discuss improving infrastructure stability by developing our understanding of cement and concrete degradation. The resiliency of infrastructure during extreme events relies on the condition, adaptability, and recoverability of built infrastructure (roads, bridges, dams), which serves as the backbone of existing infrastructure systems. Much of the built infrastructure in the US has consistently been rated D+ by the American Society of Civil Engineers (ASCE). Aged infrastructure introduces risk to the system, since unreliable infrastructure increases the likelihood of failures under chronic and extreme stress and is particularly concerning when extreme events occur. To understand and account for this added risk from poor infrastructure quality, more research is needed on (i) how the changing environment alters the aging of new and existing built infrastructure and (ii) how degradation causes unique failure mechanisms. The aging of built infrastructure is driven by degradation of the structural materials, such as concrete and steel supports, which ultimately causes failure. Current work in cement/concrete degradation is based on (i) the development of high-strength, degradation-resistant concrete mixtures, (ii) methods of assessing the age and reliability of existing structures, and (iii) modeling of structural stability and the microstructural evolution of concrete/cement under degradation mechanisms (sulfide attack, carbonation, decalcification). Sandia National Laboratories (SNL) has made several investments in studying the durability and degradation of cement-based materials, including using SNL-developed codes and methodologies (peridynamics, PFLOTRAN) to focus on chemo-mechanical fracture of cement for energy applications.
Additionally, a recent collaboration with the University of Colorado Boulder has included fracture of concrete gravity dams, scaling the existing work to applications in full sized infrastructure problems. Ultimately, SNL has the experience in degradation of cementitious materials to extend the current research portfolio and answer concerns about the resilience of aging built infrastructure.
Image Processing Algorithms for Tuning Quantum Devices and Nitrogen-Vacancy Imaging
Monical, Cara P.; Lewis, Phillip; Agron, Abrielle; Larson, Kurt; Mounce, Andrew M.
Semiconductor quantum dot devices can be challenging to configure into a regime where they are suitable for qubit operation. This challenge arises from variations in gate control of quantum dot electron occupation and tunnel coupling between quantum dots on a single device or across several devices. Furthermore, a single control gate usually has capacitive coupling to multiple quantum dots and tunnel barriers between dots. If the device operator, be it human or machine, has quantitative knowledge of how gates control the electrostatic and dynamic properties of multiqubit devices, the operator can more quickly and easily navigate the multidimensional gate space to find a qubit operating regime. We have developed and applied image analysis techniques to quantitatively detect where charge offsets from different quantum dots intersect, so-called anticrossings. In this document we outline the details of our algorithm for detecting single anticrossings, which has been used to fine-tune the inter-dot tunnel rates for a three-quantum-dot system. Additionally, we show that our algorithm can detect multiple anticrossings in the same dataset, which can aid in the coarse tuning of the electron occupation of multiple quantum dots. We also include an application of cross-correlation to the imaging of magnetic fields using nitrogen vacancies.
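The cross-correlation step mentioned for NV magnetic-field imaging is, at its core, ordinary template matching. A minimal 2-D sketch on synthetic data (pure Python; this illustrates the operation, not the specific algorithm used in this work):

```python
def cross_correlate(image, template):
    """Slide `template` over `image` and return the (row, col) offset
    with the highest zero-mean correlation score."""
    th, tw = len(template), len(template[0])
    tmean = sum(map(sum, template)) / (th * tw)
    best, best_pos = None, None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            patch = [row[c:c + tw] for row in image[r:r + th]]
            pmean = sum(map(sum, patch)) / (th * tw)
            score = sum((patch[i][j] - pmean) * (template[i][j] - tmean)
                        for i in range(th) for j in range(tw))
            if best is None or score > best:
                best, best_pos = score, (r, c)
    return best_pos

# Synthetic test: hide a diagonal feature in a blank image.
image = [[0] * 6 for _ in range(6)]
image[2][3] = 1
image[3][4] = 1
template = [[1, 0], [0, 1]]
```

In practice this would run over NV fluorescence maps (or charge-stability images) with a physically motivated template rather than a toy pattern.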
RSVP - Flu Like Illness and Respiratory Syndromes COVID-19 Syndromic Reporting Tool Prototype
Caskey, Susan; Finley, Melissa; Makvandi, Monear; Bynum, Leo J.; Edgar, Pablo A.
Individuals infected with SARS-CoV-2, the virus that causes COVID-19, may be infectious 1-3 days prior to symptom onset. People may delay seeking medical care after symptom development due to multiple determinants of health-seeking behavior, such as availability of testing, accessibility of providers, and ability to pay. Therefore, understanding symptoms in the general public is important to better predict and inform resource management plans and reopening decisions. As the influenza season looms, the ability to differentiate between the clinical presentations of COVID-19 and seasonal influenza will also be important to health providers and public health response efforts. This project developed an algorithm that, when used with captured syndromic trends, can help differentiate various influenza-like illnesses (ILI) and provide public health decision makers a better understanding of spatial and temporal trends. This effort also developed a web-based tool to capture generalized syndromic trends and provide both spatial and temporal outputs on these trends.
Terry Turbopump Expanded Operating Band Modeling and Simulation Efforts in Fiscal Year 2020 - Progress Report
Beeny, Bradley A.; Gilkey, Lindsay N.; Solom, Matthew; Luxat, David L.
The Terry Turbopump Expanded Operating Band Project is currently conducting testing at Texas A&M University as part of a revised experimental program meant to supplant previous full-scale testing plans under the headings of Milestone 5 and Milestone 6. In consultation with Sandia National Laboratories technical staff, and with modeling and simulation support from the same, the hybrid Milestone 5&6 plan is moving forward with experiments aimed at addressing knowledge gaps regarding scale, working fluid, and turbopump self-regulation. Modeling and simulation efforts at Sandia National Laboratories in FY20 fell under the broad umbrella of Milestone 7 and consisted exclusively of MELCOR-related tasks aimed at: 1) constructing and improving input models of Texas A&M University experiments, 2) constructing a generic boiling water reactor input model according to best practices with systems-level Terry turbine capabilities, and 3) adding code capability in order to leverage experimental data and findings, address bugs, and improve general code robustness. Project impacts of the COVID-19 pandemic have fortunately been minimal thus far but are mentioned as necessary when discussing the hybrid Milestone 5&6 progress as well as the corresponding Milestone 7 modeling and simulation progress.
Research Needs for Trusted Analytics in National Security Settings
Stracuzzi, David J.; Speed, Ann E.
As artificial intelligence, machine learning, and statistical modeling methods become commonplace in national security applications, the drive to create trusted analytics becomes increasingly important. The goal of this report is to identify areas of research that can provide the foundational understanding and technical prerequisites for the development and deployment of trusted analytics in national security settings. Our review of the literature covered several disjoint research communities, including computer science, statistics, human factors, and several branches of psychology and cognitive science, which tend not to interact with one another or cite each other's literatures. As a result, there exists no agreed-upon theoretical framework for understanding how various factors influence trust and no well-established empirical paradigm for studying these effects. This report therefore takes three steps. First, we define several key terms in an effort to provide a unifying language for trusted analytics and to manage the scope of the problem. Second, we outline an empirical perspective that identifies key independent, moderating, and dependent variables in assessing trusted analytics. Though not a substitute for a theoretical framework, the empirical perspective does support research and development of trusted analytics in the national security domain. Finally, we discuss several research gaps relevant to developing trusted analytics for the national security mission space.
HSolo: Homography from a single affine aware correspondence
Gonzales, Antonio; Monical, Cara P.; Perkins, Tony
The performance of existing robust homography estimation algorithms is highly dependent on the inlier rate of feature point correspondences. In this paper, we present a novel procedure for homography estimation that is particularly well suited for inlier-poor domains. By utilizing the scale and rotation byproducts created by affine aware feature detectors such as SIFT and SURF, we obtain an initial homography estimate from a single correspondence pair. This estimate allows us to filter the correspondences to an inlier-rich subset for use with a robust estimator. Especially at low inlier rates, our novel algorithm provides dramatic performance improvements.
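The key observation is that one affine-aware match already fixes a 4-DoF similarity transform, which can seed the homography. A sketch of that construction (pure Python; function names are illustrative, and the actual HSolo pipeline then filters correspondences and runs a robust estimator):

```python
import math

def similarity_from_match(p_src, p_dst, scale_src, scale_dst,
                          ang_src, ang_dst):
    """Build a 3x3 similarity 'homography' from a single affine-aware
    match: relative scale from the keypoint scales, rotation from the
    keypoint orientations (radians), translation from the coordinates."""
    s = scale_dst / scale_src
    theta = ang_dst - ang_src
    c, si = s * math.cos(theta), s * math.sin(theta)
    tx = p_dst[0] - (c * p_src[0] - si * p_src[1])
    ty = p_dst[1] - (si * p_src[0] + c * p_src[1])
    return [[c, -si, tx], [si, c, ty], [0.0, 0.0, 1.0]]

def apply_h(H, p):
    """Apply a 3x3 homography to a 2-D point."""
    x = H[0][0] * p[0] + H[0][1] * p[1] + H[0][2]
    y = H[1][0] * p[0] + H[1][1] * p[1] + H[1][2]
    w = H[2][0] * p[0] + H[2][1] * p[1] + H[2][2]
    return (x / w, y / w)
```

By construction the estimate maps the source keypoint exactly onto its match, and nearby true inliers should land close to their matches, which is what makes it useful as a correspondence filter.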
Prediction of Circuit Response to an Electromagnetic Environment (ASC IC FY2020 Milestone 7179)
Mei, Ting; Huang, Andy; Thornquist, Heidi K.; Sholander, Peter E.; Verley, Jason C.
This report covers the work performed in support of the ASC Integrated Codes FY20 Milestone 7179. For the Milestone, Sandia's Xyce analog circuit simulator was enhanced to enable a loose coupling to Sandia's EIGER electromagnetic (EM) simulation tool. A device was added to Xyce that takes as its input network parameters (representing the impedance response) and short-circuit current induced in a wire or other element, as calculated by an EM simulator such as EIGER. Simulations were performed in EIGER and in Xyce (using Harmonic Balance analysis) for a variety of linear and nonlinear circuit problems, including various op amp circuits. Results of those simulations are presented and future work is also discussed.
Digital Signal Processing of Radar Pulse Echoes
Modern high-performance radar systems employ ever more Digital Signal Processing (DSP), replacing components that were formerly analog. Precisely predicting the performance of digital filters and correlators requires an awareness of some of the finer points and characteristics of digital filters. We examine a representative radar receiver DSP chain processing a Linear Frequency Modulated (LFM) chirp.
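The core of such a chain, matched filtering of an LFM chirp echo, can be sketched numerically; the sample rate, bandwidth, pulse length, and delay below are made-up values, not parameters from the report.

```python
import numpy as np

# Illustrative baseband LFM chirp: bandwidth B swept over pulse length T,
# sampled at rate fs (parameters invented, not tied to any radar system).
fs, T, B = 100e6, 10e-6, 20e6
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)      # instantaneous frequency 0 -> B

# Matched filtering = correlation with the conjugated, time-reversed replica.
delay = 300                                      # echo delay in samples
echo = np.concatenate([np.zeros(delay, complex), chirp, np.zeros(delay, complex)])
mf_out = np.convolve(echo, np.conj(chirp[::-1]))
peak = int(np.argmax(np.abs(mf_out)))            # compressed-pulse location
```

The compressed pulse peaks at the echo delay (offset by the filter length minus one in full-convolution indexing), with peak magnitude equal to the pulse energy; the sidelobe structure around that peak is exactly the kind of fine detail a digital implementation must account for.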
AniMACCS User Guide
Foulk, James W.; Bixler, Nathan E.; Leute, Jennifer E.; Whitener, Dustin; Eubanks, Lloyd
AniMACCS is a utility code in the MELCOR Accident Consequence Code System (MACCS) software suite that allows for certain MACCS output information to be visually displayed and overlaid onto a geospatial map background. AniMACCS was developed by Sandia National Laboratories for the U.S. Nuclear Regulatory Commission. MACCS is designed to calculate health and economic consequences following a release of radioactive material in the atmosphere. MACCS accomplishes this by modeling the atmospheric dispersion, deposition, and consequences of the release, which depend on several factors including the source term, weather, population, economic, and land-use characteristics of the impacted geographical area. From these inputs, MACCS determines the characteristics of the plume, as well as ground and air concentrations as a function of time and radionuclide.
CephFS experiments on stria.sandia.gov
This report is an institutional record of experiments conducted to explore performance of a vendor installation of CephFS on the SNL stria cluster. Comparisons between CephFS, the Lustre parallel file system, and NFS were done using the IOR and MDTEST benchmarking tools, a test program which uses the SEACAS/Trilinos IOSS library, and the checkpointing activity performed by the LAMMPS molecular dynamics simulation.
Wind Turbine Lightning Mitigation System Radar Cross Section Reduction
Modern wind turbines employ Lightning Mitigation Systems (LMSs) in order to reduce costly damages caused by lightning strikes. Lightning strikes on wind turbines occur frequently, making LMS configurations a necessity. An LMS for a single turbine includes, among other equipment, cables running inside each blade, along the entire blade length. These cables are connected to various metallic receptors on the outside surface of the blades. The LMS cables can act as significant electromagnetic scatterers which may cause interference to radar systems. This interference may be mitigated by reducing the Radar Cross-Section (RCS) of the wind turbine's LMS. This report investigates proposed modifications to LMS cables in order to reduce the RCS when illuminated by Relocatable Over-the-Horizon Radar (ROTHR) systems which operate in the HF band (3 - 30 MHz). The proposed modifications include breaking up the LMS cables using spark gap connections, and changing the orientation of the LMS cable within the turbine blade. Simulated analyses of these RCS mitigation techniques are provided, along with recommendations for further research.
A Bezier Curve Informed Melt Pool Geometry to Model Additive Manufacturing Microstructures Using SPPARKS
Trageser, Jeremy; Mitchell, John A.
Additive manufacturing is a transformative technology with the potential to manufacture designs which traditional subtractive machining methods cannot produce. Additive manufacturing offers fast builds at near final desired geometry; however, material properties and variability from part to part remain a challenge for certification and qualification of metallic components. AM-induced metallic microstructures are spatially heterogeneous and highly process dependent. Engineering properties such as strength and toughness are significantly affected by microstructure morphologies resulting from the manufacturing process. Linking process parameters to microstructures and ultimately to the dynamic response of AM materials is critical to certifying and qualifying AM-built parts and components and improving the performance of AM materials. The AM fabrication process is characterized by building parts layer by layer using a selective laser melt process guided by a computer. A laser selectively scans and melts metal according to a designated geometry. As the laser scans, metal melts, fuses, and solidifies, forming the final geometry in a layerwise fashion. As the laser heat source moves away, the metal cools and solidifies, forming metallic microstructures. This work describes a microstructure modeling application implemented in the SPPARKS kinetic Monte Carlo computational framework for simulating the resulting microstructures. The application uses Bézier curves and surfaces to model the melt pool surface and spatial temperature profile induced by the moving laser heat source; it simulates the melting and fusing of metal at the laser hot spot and microstructure formation and evolution when the laser moves away. The geometry of the melt pool is quite flexible, and we explore the effects of variances in model parameters on simulated microstructures.
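The Bézier ingredient is standard: a handful of control points defines a smooth curve via repeated linear interpolation. The sketch below evaluates a cubic Bézier curve for a hypothetical melt pool cross-section; the control-point layout and dimensions are invented for illustration and are not SPPARKS inputs.

```python
import numpy as np

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by repeated linear
    interpolation between successive control points (de Casteljau's algorithm)."""
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Hypothetical melt pool cross-section: half-width w and depth d control a
# cubic curve from one edge of the pool, down through the bottom, and back up.
w, d = 50.0, 30.0        # microns; illustrative values only
ctrl = [(-w, 0.0), (-w, -d), (w, -d), (w, 0.0)]
boundary = np.array([de_casteljau(ctrl, t) for t in np.linspace(0.0, 1.0, 101)])
```

Varying the control points deforms the pool shape smoothly, which is what makes a Bézier parameterization convenient for exploring the effect of melt pool geometry on simulated microstructure.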
Asset Management: Beyond 2020
This Asset Management Beyond 2020 document provides: (1) an introduction to asset management (AM) and an asset management system (AMS), along with insights to next steps, (2) an overview of the International Standards Organization (ISO) 55000 series documents, (3) an overview of the ISO 55001 gap analysis of Center 4700/4800 ("Facilities") organizations at Sandia National Laboratories, hereafter referred to as Sandia, with observations, gaps, and gap closure recommendations, and (4) an asset management architecture (AMA) recommendation aligned with the ISO 55001. The AMS and the AM are different but related. ISO 55000 cites AMS as "a management system for AM" to "direct, coordinate, and control AM activities." ISO 55000 cites AM as translating "the organizational objectives into technical and financial decisions, plans and activities."1 Essentially, the AMS begins with all levels of leadership to enable a structured and consistent approach to AM culminating in lifecycle asset management excellence.
Physical Security Model Development of an Electrochemical Facility
Parks, Mancel J.; Noel, Todd; Stromberg, Benjamin
Nuclear facilities in the U.S. and around the world face increasing challenges in meeting evolving physical security requirements while keeping costs reasonable. The addition of security features after a facility has been designed and without attention to optimization (the approach of the past) can easily lead to cost overruns. Instead, security should be considered at the beginning of the design process in order to provide robust, yet efficient physical security designs. The purpose of this work is to demonstrate how modeling and simulation can be used to optimize the design of physical protection systems. A suite of tools, including Scribe3D and Blender, were used to model a generic electrochemical reprocessing facility. Physical protection elements such as sensors, portal monitors, barriers, and guard forces were added to the model based on best practices for physical security. Two theft scenarios (an outsider attack and insider diversion) as well as a sabotage scenario were examined in order to optimize the security design. Security metrics are presented. This work fits into a larger Virtual Facility Distributed Test Bed 2020 Milestone in the Material Protection, Accounting, and Control Technologies (MPACT) program through the Department of Energy (DOE). The purpose of the milestone is to demonstrate how a series of experimental and modeling capabilities across the DOE complex provide the capabilities to demonstrate complete Safeguards and Security by Design (SSBD) for nuclear facilities.
Cyber Resilience as a Deterrence Strategy
Hammer, Ann E.; Miller, Trisha H.; Uribe, Eva U.
This paper was written by the Cyber Deterrence and Resilience Strategic Initiative in partnership with the Resilience Energy Systems Strategic Initiative. Resilience and deterrence are both part of a comprehensive cyber strategy where tactics may overlap across defense, resilience, deterrence, and other strategic spaces. This paper explores how building resiliency in cyberspace can not only serve to strengthen the defender's posture and capabilities in a general sense but also deter adversaries from attacking.
Big-Data-Driven Geo-Spatiotemporal Correlation Analysis between Precursor Pollen and Influenza and its Implication to Novel Coronavirus Outbreak
Although studies of many respiratory viruses and pollens are often framed by both seasonal and health-related perspectives, pollen has yet to be extensively examined as an important covariate to seasonal respiratory viruses (SRVs) in any context, including a causal one. This study contributes to those goals through an investigation of SRVs and pollen counts at selected regions across the Western Hemisphere. Two complementary decadal-scaled geospatial profiles were developed. One laterally spanned the US and was anchored by detailed pollen information for Albuquerque, New Mexico. The other straddled the equator to include Fortaleza, Brazil. We found that the geospatial and climatological patterns of pollen advancement and decline across the US every year presented a statistically significant correlation to the subsequent emergence and decline of SRVs. Other significant covariates included winds, temperatures, and atmospheric moisture. Our study indicates that areas of the US with lower geostrophic wind baselines are typically areas of persistently higher and earlier influenza-like illness (ILI) cases. In addition to that continental-scaled contrast, many sites indicated seasonal highs of geostrophic winds and ILI which were closely aligned. These observations suggest extensive scale-dependent connectivity of viruses to geostrophic circulation. Pollen emergence and its own scale-dependent circulation may contribute to the geospatial and seasonal patterns of ILI. We explore some uncertainties associated with this investigation, and consider the possibility that in a temperate climate, following a spring pollen emergence, a resulting increase in pollen-triggered human Immunoglobulin E (IgE) antibodies may suppress ILIs for several months.
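The core statistical device, correlating one seasonal series against lagged copies of another, can be sketched on synthetic data; the pulse shapes, the 8-week lead, and the noise level below are fabricated for illustration and have no relation to the study's actual measurements.

```python
import numpy as np

def lagged_corr(x, y, max_lag):
    """Pearson correlation of y against x at each lag (in samples);
    a positive lag means features in x lead features in y."""
    out = {}
    for lag in range(max_lag + 1):
        a, b = (x, y) if lag == 0 else (x[:-lag], y[lag:])
        out[lag] = float(np.corrcoef(a, b)[0, 1])
    return out

# Synthetic weekly series: a pollen-like seasonal pulse constructed to lead
# an ILI-like pulse by 8 weeks, plus measurement noise.
rng = np.random.default_rng(0)
t = np.arange(520)                                   # ten years of weeks
pollen = np.exp(-0.5 * ((t % 52 - 14) / 3.0) ** 2) + 0.05 * rng.standard_normal(520)
ili = np.exp(-0.5 * (((t - 8) % 52 - 14) / 3.0) ** 2) + 0.05 * rng.standard_normal(520)
corrs = lagged_corr(pollen, ili, max_lag=20)
best_lag = max(corrs, key=corrs.get)
```

The correlation peaks near the built-in lead, which is the signature the study looks for between pollen advancement and subsequent SRV emergence (with the usual caveat that lagged correlation alone does not establish causation).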
Modeling a ring magnet in ALEGRA
Niederhaus, John H.J.; Pacheco, Jose L.; Wilkes, John R.; Hooper, Russell; Siefert, Christopher; Goeke, Ronald S.
We show here that Sandia's ALEGRA software can be used to model a permanent magnet in 2D and 3D, with accuracy matching that of the open-source commercial software FEMM. This is done by conducting simulations and experimental measurements for a commercial-grade N42 neodymium alloy ring magnet with a measured magnetic field strength of approximately 0.4 T in its immediate vicinity. Transient simulations using ALEGRA and static simulations using FEMM are conducted. Comparisons are made between simulations and measurements, and amongst the simulations, for sample locations in the steady-state magnetic field. The comparisons show that all models capture the data to within 7%. The FEMM and ALEGRA results agree to within approximately 2%. The most accurate solutions in ALEGRA are obtained using quadrilateral or hexahedral elements. In the case where iron shielding disks are included in the magnetized space, ALEGRA simulations are considerably more expensive because of the increased magnetic diffusion time, but FEMM and ALEGRA results are still in agreement. The magnetic field data are portable to other software interfaces using the Exodus file format.
Learning Hidden Structure in Multi-Fidelity Information Sources for Efficient Uncertainty Quantification (LDRD 218317)
Jakeman, John D.; Eldred, Michael; Geraci, Gianluca; Smith, Thomas M.; Gorodetsky, Alex A.
This report summarizes the work done under the Laboratory Directed Research and Development (LDRD) project entitled "Learning Hidden Structure in Multi-Fidelity Information Sources for Efficient Uncertainty Quantification". In this project we investigated multi-fidelity strategies for fusing data from information sources of varying cost and accuracy. Most existing strategies exploit hierarchical relationships between models, for example, those that occur when different models are generated by refining a numerical discretization parameter. In this work we focused on encoding the relationships between information sources using directed acyclic graphs. The resulting multi-fidelity networks can have general structure and represent a significantly greater variety of modeling relationships than the recursive structures used in the current literature. Numerical results show that a non-hierarchical multi-fidelity Monte Carlo strategy can reduce the cost of estimating uncertainty in predictions of a model of plasma expanding in a vacuum by almost two orders of magnitude.
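The simplest special case of such fusion, a two-fidelity control-variate Monte Carlo estimator, illustrates the mechanism (the DAG-structured networks in the report generalize this to many sources). The models, sample sizes, and random seed below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical information sources: an "expensive" model f_hi and a cheap,
# slightly biased surrogate f_lo of a scalar quantity of interest.
def f_hi(z):
    return np.sin(z) + 0.1 * z**2        # true mean over z ~ N(0,1) is 0.1

def f_lo(z):
    return np.sin(0.9 * z) + 0.09 * z**2

N, M = 200, 20000                        # few expensive runs, many cheap ones
z_n, z_m = rng.standard_normal(N), rng.standard_normal(M)
hi_n, lo_n, lo_m = f_hi(z_n), f_lo(z_n), f_lo(z_m)

# Control-variate weight from the shared sample, then the fused estimate:
# correct the small high-fidelity mean by the low-fidelity mean discrepancy.
alpha = np.cov(hi_n, lo_n)[0, 1] / np.var(lo_n, ddof=1)
mf_estimate = hi_n.mean() + alpha * (lo_m.mean() - lo_n.mean())
```

Because the surrogate is strongly correlated with the expensive model, the fused estimate attains far lower variance than the 200-sample high-fidelity mean alone, even though the surrogate's own mean is biased.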
Increasing the Lifetime of Epoxy Components with Antioxidant Stabilizers
Narcross, Hannah L.; Redline, Erica; Celina, Mathew C.; Bowman, Ashley
Epoxy thermoset resins are ubiquitous materials with extensive applications as encapsulants, composites, and adhesives/staking compounds used to secure sensitive components. Epoxy resins are inherently sensitive to thermo-oxidative aging, especially at elevated temperatures, which changes the bulk properties of the material and can lead to component failure, for example by cracking due to embrittlement or by adhesion failure between the epoxy and filler material in a composite. This project investigated the effects of three commercial antioxidants (Irganox® 1010 (I-102), butylated hydroxytoluene (BHT), or Chisorb® 770 (HALS)) at two different loadings (2.5 and 5 wt%) on the mechanical and chemical aging of a model epoxy system (EPON™ 828 / Jeffamine® T-403) under ambient conditions, 65, 95, and 110 °C. Additionally, synthetic routes towards an antioxidant capable of being covalently bound to the resin so as to prevent leaching were explored, with one such molecule successfully synthesized and purified. One commercial antioxidant (Irganox® 1010) was found to reduce the degree of thermo-oxidatively induced damage in the system.
Incorporating physical constraints into Gaussian process surrogate models (LDRD Project Summary)
Swiler, Laura P.; Gulian, Mamikon; Frankel, A.; Jakeman, John D.; Safta, Cosmin
This report summarizes work done under the Laboratory Directed Research and Development (LDRD) project titled "Incorporating physical constraints into Gaussian process surrogate models." In this project, we explored a variety of strategies for constraint implementation. We considered bound constraints; monotonicity and related convexity constraints; Gaussian processes constrained to satisfy linear operator constraints representing physical laws expressed as partial differential equations; and intrinsic boundary condition constraints. We wrote three papers and are currently finishing two others, and we developed initial software implementations for some of the approaches.
Using Neural Architecture Search for Improving Software Flaw Detection in Multimodal Deep Learning Models
Cooper, Alexis; Zhou, Xin; Dunlavy, Daniel M.; Heidbrink, Scott
Software flaw detection using multimodal deep learning models has been demonstrated as a very competitive approach on benchmark problems. In this work, we demonstrate that even better performance can be achieved using neural architecture search (NAS) combined with multimodal learning models. We adapt a NAS framework aimed at investigating image classification to the problem of software flaw detection and demonstrate improved results on the Juliet Test Suite, a popular benchmarking data set for measuring performance of machine learning models in this problem domain.
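The search loop at the heart of NAS can be illustrated with a drastically simplified analogue: random search over a small model-configuration space scored on held-out data. The "architectures" below are polynomial degrees with ridge penalties, a stand-in invented for this sketch, not the NAS framework or flaw-detection models of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for architecture search: sample random configurations, train
# each one, and keep the configuration with the best validation score. Real
# NAS explores network topologies the same way, with a far more expensive
# training step inside the loop.
x = rng.uniform(-1.0, 1.0, 200)
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal(200)
x_tr, y_tr, x_va, y_va = x[:150], y[:150], x[150:], y[150:]

def fit_and_score(degree, lam):
    """Ridge-regularized polynomial fit; returns validation mean-squared error."""
    A = np.vander(x_tr, degree + 1)
    w = np.linalg.solve(A.T @ A + lam * np.eye(degree + 1), A.T @ y_tr)
    return float(np.mean((np.vander(x_va, degree + 1) @ w - y_va) ** 2))

candidates = [(int(rng.integers(1, 12)), 10.0 ** rng.uniform(-6.0, 0.0))
              for _ in range(30)]
best = min(candidates, key=lambda cand: fit_and_score(*cand))
```

The search reliably discovers that the target needs at least a cubic model; NAS frameworks apply the same select-by-validation-score principle, typically with smarter proposal strategies than uniform random sampling.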
Evaluation of the Geotech GS-13BH Borehole Seismic Sensor
Sandia National Laboratories has tested and evaluated the Geotech GS-13BH borehole sensor. The sensor provides, in a borehole package, a response similar to that of the standard GS-13 short-period seismic sensor intended for pier installations. The purpose of this seismometer evaluation was to determine the measured sensitivity, amplitude and phase response, self-noise and dynamic range, passband, and the acceleration response of its calibration coil.
What Questions Would a Systems Engineer Ask to Assess Systems Engineering Models as Credible
Carroll, Edward R.; Malins, Robert J.
Digital Systems Engineering strategies typically call for digital Systems Engineering models to be retained in repositories and certified as an authoritative source of truth (enabling model reuse, qualification, and collaboration). For digital Systems Engineering models to be certified as authoritative (credible), they need to be assessed - verified and validated - with the uncertainty in the model quantified (consider reusing someone else's model without knowing the author). Digital Systems Engineering models of complex systems can contain millions of nodes and edges. Given this model complexity, the authors assert that traditional human-based methods for verification, validation, and uncertainty quantification - such as human-based peer-review sessions - cannot sufficiently establish that a digital Systems Engineering model of a complex system is credible. This level of detail is beyond the ability of any group of humans - even working for weeks at a time - to discern and catch every minor model infraction. In contrast, computers are highly effective at discerning infractions within massive amounts of information. The authors suggest that a better approach might be to focus the humans on which model patterns should be assessed and enable the computer to assess the massive details in accordance with those patterns - by running through perhaps 100,000 test loops. In anticipation of future projects to implement and automate the assessment of models at Sandia National Laboratories, a study was initiated to elicit input from a group of 25 Systems Engineering experts. The authors' positioning query began with: "What questions would a Systems Engineer ask to assess Systems Engineering models for credibility?" This report documents the results of that survey.
Response of GaN-Based Semiconductor Devices to Ion and Gamma Irradiation
Aguirre, Brandon A.; King, Joseph; Manuel, Jack; Vizkelethy, Gyorgy; Bielejec, Edward S.; Griffin, Patrick J.
GaN has electronic properties that make it an excellent material for the next generation of power electronics; however, its radiation hardness still needs further understanding before it is used in radiation environments. In this work we explored the response of commercial InGaN LEDs to two different radiation environments: ion and gamma irradiation. For ion irradiations we performed two types of irradiations at the Ion Beam Lab (IBL) at Sandia National Laboratories (SNL): high-energy and end-of-range (EOR) irradiations. For gamma irradiations we fielded devices at the Gamma Irradiation Facility (GIF) at SNL. The response of the LEDs to radiation was investigated by IV, light output, and light output vs. frequency measurements. We found that dose levels up to 500 krad do not degrade the electrical properties of the devices and that devices exposed to ion irradiation exhibit a linear and a non-linear dependence on fluence over two different ranges of fluence levels. We also performed current injection annealing studies to explore the annealing properties of InGaN LEDs.
Coupling CTH to Linear Acoustic Propagation across an Air-Earth Interface
Preston, Leiph; Eliassi, Mehdi; Poppeliers, Christian
The interface between the Earth and the atmosphere forms a strong contrast in material properties. As such, numerical issues can arise when simulating an elastic wavefield across such a boundary when using a numerical simulation scheme. This is exacerbated when two different simulation codes are coupled straddling that interface. In this report we document how we implement the coupling of CTH, a nonlinear shock physics code, to a linearized elastic/acoustic wave propagation algorithm, axiElasti, across the air-earth interface. We first qualitatively verify that this stable coupling between the two algorithms produces expected results with no visible effects of the coupling interface. We then verify the coupling interface quantitatively by checking consistency with results from previous work and with coupled acoustic-elastic seismo-acoustic source inversions in three earth materials.
Multiscale Approach to Fast ModSim for Laser Processing of Metals for Future Nuclear Deterrence Environments
Moser, Daniel R.; Martinez, Mario J.; Johnson, Kyle L.; Rodgers, Theron M.
Predicting the performance of parts produced using laser-metal processing remains an outstanding challenge. While many computational models exist, they are generally too computationally expensive to simulate the build of an engineering-scale part. This work develops a reduced-order thermal model of a laser-metal system using analytical Green's function solutions to the linear heat equation, representing a step towards achieving a full part performance prediction in an "overnight" time frame. The developed model is able to calculate a thermal history for an example problem 72 times faster than a traditional FEM method. The model parameters are calibrated against a non-linear solution, and the resulting microstructures and residual stresses are calculated and compared to the non-linear case. The calibrated model shows promising agreement with the non-linear solution.
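The Green's-function approach builds on classical analytical solutions for a moving heat source. A minimal sketch using the Rosenthal point-source solution to the linear heat equation follows; the material parameters are made up for illustration and are not the report's calibrated values.

```python
import numpy as np

def rosenthal_T(xi, y, z, Q, v, k, alpha, T0=293.0):
    """Quasi-steady Rosenthal temperature field for a point heat source of
    power Q [W] moving at speed v [m/s] across a semi-infinite solid with
    conductivity k [W/m/K] and diffusivity alpha [m^2/s]; xi = x - v*t is
    the coordinate along the scan in the frame moving with the source.
    (Values diverge as R -> 0 at the idealized point source.)"""
    R = np.sqrt(xi**2 + y**2 + z**2)
    return T0 + Q / (2.0 * np.pi * k * R) * np.exp(-v * (R + xi) / (2.0 * alpha))

# Illustrative parameters only (roughly steel-like, not calibrated).
Q, v, k, alpha = 150.0, 0.5, 20.0, 5e-6
T_behind = rosenthal_T(-100e-6, 0.0, 10e-6, Q, v, k, alpha)  # trailing the source
T_ahead = rosenthal_T(+100e-6, 0.0, 10e-6, Q, v, k, alpha)   # leading the source
T_far = rosenthal_T(-400e-6, 0.0, 10e-6, Q, v, k, alpha)
```

Because the heat equation is linear, fields from many laser positions superpose, which is what makes a Green's-function reduced-order model so much cheaper to evaluate than a full FEM thermal solve.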
Particle Sensitivity Analysis
Lehoucq, Rich; Franke, Brian C.; Bond, Stephen D.; Mckinley, Scott A.
We propose to develop a computational sensitivity analysis capability for Monte Carlo sampling-based particle simulation relevant to Aleph, Cheetah-MC, Empire, Emphasis, ITS, SPARTA, and LAMMPS codes. These software tools model plasmas, radiation transport, low-density fluids, and molecular motion. Our report demonstrates how adjoint optimization methods can be combined with Monte Carlo sampling-based adjoint particle simulation. Our goal is to develop a sensitivity analysis to drive robust design-based optimization for Monte Carlo sampling-based particle simulation - a currently unavailable capability.
Advancing the science of explosive fragmentation and afterburn fireballs through experiments and simulations at the benchtop scale
Guildenbecher, Daniel; Dallman, Ann; Hall, Elise; Halls, Benjamin R.; Jones, E.M.C.; Kearney, Sean P.; Marinis, Ryan T.; Murzyn, C.M.; Richardson, Daniel; Perez, Francisco; Reu, P.L.; Thompson, Andrew D.; Welliver, Marc C.; Mazumdar, Yi C.; Brown, Alex D.; Pourpoint, Timothee L.; White, Catriona M.L.; Balachandar, S.; Houim, Ryan W.
Detonation of explosive devices produces extremely hazardous fragments and hot, luminous fireballs. Prior experimental investigations of these post-detonation environments have primarily considered devices containing hundreds of grams of explosives. While relevant to many applications, such large-scale testing also significantly restricts experimental diagnostics and provides limited data for model validation. As an alternative, the current work proposes experiments and simulations of the fragmentation and fireballs from commercial detonators with less than a gram of high explosive. As demonstrated here, reduced experimental hazards and increased optical access significantly expand the viability of advanced imaging and laser diagnostics. Notable developments include the first known validation of MHz-rate optical fragment tracking and the first-ever Coherent Anti-Stokes Raman Scattering (CARS) measurements of post-detonation fireball temperatures. While certainly not replacing the need for full-scale verification testing, this work demonstrates new opportunities to accelerate developments of diagnostics and predictive models of post-detonation environments.
Hanging String Cuts in SPR Caverns: Modeling Investigation and Comparison with Sonar Data
Zeitler, Todd Z.; Chojnicki, Kirsten
Investigation of leaching for oil sales includes looking closely at cavern geometries. Anomalous cavern "features" have been observed near the foot of some caverns subsequent to partial drawdowns. One potential mitigation approach to reducing further growth of preexisting features is based on the hypothesis that reducing the brine string length via a "string cut" would serve to move the zone associated with additional leaching to a location higher up in the cavern and thus away from the preexisting feature. Cutting of the hanging string is expected to provide a control of leaching depth that could be used to "smooth" existing features and thus reduce geomechanical instability in that region of the cavern. The SANSMIC code has been used to predict cavern geometry changes (i.e., the extent of cavern growth with depth) based on variable input parameters for four caverns: West Hackberry 11 (WH11), West Hackberry 113 (WH113), Big Hill 104 (BH104), and Big Hill 114 (BH114). By comparing the initial sonar geometry with resultant geometries calculated by the SANSMIC code, conclusions may be drawn about the potential impact of these variables on future cavern growth. Ultimately, these conclusions can be used to assess possible mitigation strategies such as the potential advantage of cutting versus not cutting a brine string. This work has resulted in a recommendation that a hanging string cut of 80 ft in WH11 would be beneficial to future cavern geometry, while there would be little to no benefit to string cuts in the other three caverns investigated here. The WH11 recommendation was followed in 2019, resulting in an operational string cut. A sonar performed after the string cut showed no adverse leaching in the area of the preexisting flare, as expected from the results of the preliminary SANSMIC runs described in this report. Additional SANSMIC modeling of the actual amount of injected raw water resulted in good agreement with the post-cut sonar.
HEMP Testing of Substation Yard Circuit Breaker Control and Protective Relay Circuits
Baughman, Alfred N.; Bowman, Tyler C.; Guttromson, Ross; Halligan, Matthew; Minteer, Tim; Mooney, Travis; Vorse, Chad
There are concerns about the effects of High-Altitude Electromagnetic Pulses (HEMP) on the electric power grid. Activities to date have tested and analyzed the vulnerability of digital protective relays (DPRs) used in power substations, but the effect of HEMP on the greater substation environment is not well known. This work establishes a method of testing the vulnerability of circuit breaker control and protective relay circuits to the radiated E1 pulse associated with HEMP based on coupling to the cables in a substation yard. Two DPRs from Schweitzer Engineering Laboratories, Inc. were independently tested. The test setup also included a typical cable in a substation yard with a return plane to emulate the ground grid and other ground conductors near the yard cable, cabinetry housing the installed DPRs, a station battery and battery charger, terminal block elements, and a breaker simulator to emulate a substation yard configuration. The DPRs were powered from the station battery, and the transformer inputs were energized with a three-phase source to maintain typical operating conditions during the tests. Vulnerability testing consisted of a conducted E1 pulse injected into the center of the yard cable of the DPR circuits. Current measurements on the yard cable and DPR inputs indicated significant attenuation of the conducted pulse arriving at the control house equipment from the emulated substation yard. This reduction was quantified with respect to the equivalent open-circuit voltage on the yard cable. No equipment damage or undesired operation occurred on the tested circuits for values below 180 kV, which is significantly higher than the anticipated coupling to a substation yard cable.
Arctic Tipping Points Triggering Global Change (LDRD Final Report)
Peterson, Kara J.; Powell, Amy J.; Tezaur, Irina K.; Roesler, Erika L.; Nichol, Jeffrey; Peterson, Matthew G.; Davis, Warren L.; Jakeman, John D.; Stracuzzi, David J.; Bull, Diana L.
The Arctic is warming and feedbacks in the coupled Earth system may be driving the Arctic to tipping events that could have critical downstream impacts for the rest of the globe. In this project we have focused on analyzing sea ice variability and loss in the coupled Earth system. Summer sea ice loss is happening rapidly, and although the loss may be smooth and reversible, it has significant consequences for other Arctic systems as well as geopolitical and economic implications. Accurate seasonal predictions of sea ice minimum extent and long-term estimates of timing for a seasonally ice-free Arctic depend on a better understanding of the factors influencing sea ice dynamics and variation in this strongly coupled system. Under this project we have investigated the most influential factors in accurate predictions of September Arctic sea ice extent using machine learning models trained separately on observational data and on simulation data from five E3SM historical ensembles. Monthly averaged data from June, July, and August for a selection of ice, ocean, and atmosphere variables were used to train a random forest regression model. Gini importance measures were computed for each input feature with the testing data. We found that sea ice volume is most important earlier in the season (June) and sea ice extent became a more important predictor closer to September. Results from this study provide insight into how feature importance changes with forecast length and illustrate differences between observational data and simulated Earth system data. We have additionally performed a global sensitivity analysis (GSA) using a fully coupled ultra-low resolution configuration of E3SM. To our knowledge, this is the first global sensitivity analysis involving the fully coupled E3SM Earth system model. We have found that parameter variations show significant impact on the Arctic climate state and that atmospheric parameters related to cloud parameterizations are the most significant.
We also find significant interactions between parameters from different components of E3SM. The results of this study provide invaluable insight into the relative importance of various parameters from the sea ice, atmosphere and ocean components of the E3SM (including cross-component parameter interactions) on various Arctic-focused quantities of interest (QOIs).
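The feature-ranking step can be illustrated with a model-agnostic analogue. The sketch below uses permutation importance on a synthetic linear proxy; the feature layout and coefficients are invented for illustration and do not represent the study's random forest Gini importances or E3SM data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for the study's setup: three summer features predicting
# a September target, with the first feature constructed to matter most.
n = 500
X = rng.standard_normal((n, 3))              # e.g. June volume, July/Aug extent
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.standard_normal(n)

w = np.linalg.lstsq(X, y, rcond=None)[0]     # simple linear surrogate model

def mse(Xm):
    return float(np.mean((Xm @ w - y) ** 2))

# Permutation importance: how much the error grows when one feature column
# is shuffled, breaking its relationship with the target.
base = mse(X)
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(mse(Xp) - base)
```

Ranking features this way month by month is how one can see an importance hand-off over the season, such as the study's finding that ice volume dominates in June while extent dominates closer to September.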
Diversified Therapeutic Phage Cocktails from Close Relatives of the Target Bacterium
This project tackles the antibiotic resistance crisis, developing a new method for discovering numerous efficacious bacteriophages for therapeutic cocktails against bacterial pathogens. The phage therapy approach to infectious disease, recently rekindled in U.S. medicine, requires numerous phages for each bacterial pathogen. Our approach 1) uses Sandia-unique software to identify dormant phages (prophages) integrated into bacterial chromosomes, 2) identifies prophage-laden bacteria that are close relatives of the target pathogenic strain to be killed, and 3) engineers away properties of these phages that are undesirable for therapy. We have perfected our phage-finding software, implemented our phage therapy strategy by targeting the pathogen Pseudomonas aeruginosa, and prepared new software to assist the phage engineering. We then turned toward Burkholderia pathogens, aiming to overcome the difficulty of transforming these bacteria with a novel phage conjugation approach. Our work demonstrates the validity of a new approach to phage therapy for killing antibiotic-resistant pathogens.
Regression Based Approach for Robust Finite Element Analysis on Arbitrary Grids. LDRD Final Report
Kuberry, Paul; Bochev, Pavel B.; Koester, Jacob K.; Trask, Nathaniel A.
This report summarizes the work performed under a one-year LDRD project aiming to enable accurate and robust numerical simulation of partial differential equations on meshes of poor quality. Traditional finite element methods use the mesh both to discretize the geometric domain and to define the finite element shape functions. The latter creates a dependence between the quality of the mesh and the properties of the finite element basis that may adversely affect the accuracy of the discretized problem. In this project, we propose a new approach for defining finite element shape functions that breaks this dependence and separates mesh quality from discretization quality. At the core of the approach is a meshless definition of the shape functions, which limits the purpose of the mesh to representing the geometric domain and integrating the basis functions, without any role in their approximation quality. The resulting non-conforming space can be utilized within a standard discontinuous Galerkin framework, providing a rigorous foundation for solving partial differential equations on low-quality meshes. We present a collection of numerical experiments demonstrating our approach in a wide range of settings: strongly coercive elliptic problems, linear elasticity in the compressible regime, and the stationary Stokes problem. We demonstrate convergence for all problems and stability for element pairs that, for conforming methods, usually require inf-sup compatibility; we also note a minor modification, available through the symmetric interior penalty Galerkin framework, for stabilizing element pairs that would otherwise be unstable. Mesh robustness is particularly critical for elasticity, and we provide an example in which our approach yields a greater than 5x improvement in accuracy and allows an 8x larger stable timestep for a highly deformed mesh, compared to the continuous Galerkin finite element method.
The report concludes with a brief summary of ongoing projects and collaborations that utilize or extend the products of this work.
A Review of Sandia Energy Storage Research Capabilities and Opportunities (2020 to 2030)
Ho, Clifford K.; Atcitty, Stanley; Bauer, Stephen J.; Borneo, Daniel R.; Byrne, Raymond H.; Chalamala, Babu C.; Lamb, Joshua; Lambert, Timothy N.; Schenkman, Benjamin L.; Spoerke, Erik D.; Zimmerman, Jonathan A.
Large-scale integration of energy storage on the electric grid will be essential to enabling greater penetration of intermittent renewable energy sources, modernizing the grid for increased flexibility, security, reliability, and resilience, and enabling cleaner forms of transportation. The purpose of this report is to summarize Sandia's research and capabilities in energy storage and to provide a preliminary roadmap for future efforts in this area that can address the ongoing program needs of DOE and the nation. Mission and vision statements are first presented, followed by an overview of the organizational structure at Sandia that supports activities in energy storage. Then, a summary of Sandia's energy storage capabilities is presented by technology, including battery storage and materials, power conversion and electronics, subsurface-based energy storage, thermal/thermochemical energy storage, hydrogen storage, data analytics/systems optimization/controls, safety of energy storage systems, and testing/demonstrations/model validation. A summary of identified gaps and needs is also presented for each technology and capability.
Bistatic Synthetic Aperture Radar - Issues Analysis and Design
The physical separation of the transmitter from the receiver, perhaps into separate flight vehicles with separate flight paths, in a bistatic Synthetic Aperture Radar (SAR) system adds considerable complexity to an already complex system. Synchronization of waveform parameters and timing attributes becomes problematic, and even notions of the synthetic aperture itself take on a new level of abstractness. Consequently, a high-performance, fine-resolution, and reliable bistatic SAR system really needs to be engineered from the ground up, with tighter specifications on a number of parameters and entirely new functionality in other areas. Nevertheless, such a bistatic SAR system appears viable.
Modeling of Atom Interferometer Accelerometer
Soh, Daniel B.S.; Lee, Jongmin; Schwindt, Peter D.
This report presents the theoretical effort to model and simulate an atom-interferometer accelerometer operating in a highly mobile environment. A multitude of non-idealities may occur in such a rapidly changing environment, with a large acceleration whose amplitude and direction both change quickly. We studied the undesired effects of high mobility on the atom-interferometer accelerometer using a detailed model and a simulator. The undesired effects include the atom cloud's movement during Raman pulses, the Doppler effect due to the relative movement between the atom cloud and the supporting platform, the finite atom cloud temperature, and the lateral movement of the atom cloud. We present relevant feed-forward mitigation strategies for each identified non-ideality to neutralize its impact and obtain accurate acceleration measurements.
Dispersion Validation for Flow Involving a Large Structure Revisited: 45 Degree Rotation
Brown, Alexander L.; Lance, Blake; Clemenson, Michael; Jones, Samuel T.; Benson, Michael J.; Elkins, Chris
The atmospheric dispersion of contaminants in the wake of a large urban structure is a challenging fluid mechanics problem of interest to the scientific and engineering communities. Magnetic Resonance Velocimetry (MRV) and Magnetic Resonance Concentration (MRC) are relatively new techniques that leverage diagnostic equipment used primarily by the medical field to make 3D engineering measurements of flow and contaminant dispersal. SIERRA/Fuego, a computational fluid dynamics (CFD) code at Sandia National Labs, is employed to make detailed comparisons to the dataset to evaluate the quantitative and qualitative accuracy of the model. This work is the second in a series of scenarios. In the prior work, a single large building in an array of similar buildings was considered with the wind perpendicular to a building face. In this work, the geometry is rotated by 45 degrees and improved studies are performed for simulation credibility. The comparison exercise shows conditionally good agreement between the model and experiment. Model uncertainties are assessed through parametric variations. Various methods of quantifying the accuracy of the model relative to the experimental data are examined, and a three-dimensional analysis of accuracy is performed. The effort helped identify deficiencies in the techniques used to make these comparisons, and further methods development therefore becomes one of the main recommendations for follow-on work.
Feasibility Study of Replacing the R/V Robert Gordon Sproul with a Hybrid Vessel Employing Zero-emission Propulsion Technology
Klebanoff, Leonard E.; Caughlan, Sean A.M.; Madsen, Robert T.; Leach, Timothy S.; Conard, Cody J.; Appelgate Jr., Bruce
This project is a natural "follow-on" to the 2017 MARAD-funded project establishing the technical, regulatory, and economic feasibility of a zero-emission hydrogen fuel-cell coastal research vessel named the Zero-V. In this follow-on project, we examine the applicability of hydrogen fuel-cell propulsion technology to a different kind of vessel, namely a smaller coastal/local research vessel targeted as a replacement for the Scripps Institution of Oceanography (SIO) R/V Robert Gordon Sproul, which is approaching the end of its service life.
Effects of EMP Testing on Residential DC/AC Microinverters
Fierro, Andy; Le, Ken; Sanabria, David E.; Guttromson, Ross; Halligan, Matthew; Lehr, Jane
Electromagnetic pulse (EMP) coupling into electronic devices can be destructive, potentially causing device malfunction or failure. The large electromagnetic field generated by an EMP can induce large voltages and currents in components. As such, the effects of EMP on different devices need to be understood to elucidate its impact on potentially vulnerable systems. This report presents test results for small-scale residential DC-to-AC solar panel microinverters that were subjected to high-voltage impulses and currents. The impulses were intended to emulate an EMP coupling event on the AC and DC sides of the microinverter. State-of-health measurements were conducted to characterize device performance before and after each test.
Sandia's Integrated Methodology for Energy and Infrastructure Resilience Analysis
Wachtel, Amanda; Jones, Katherine; Baca, Michael J.; Neill-Carrillo, Aponte'; Demenno, Mercy B.
Sandia National Laboratories' (Sandia) Resilient Energy Systems (RES) Strategic Initiative is establishing a strategic vision for U.S. energy systems' resilience through threat-informed research and development, enabling energy and interdependent infrastructure systems to successfully adapt in an environment of accelerating change. A key challenge in promoting energy systems resilience lies in developing rigorous resilience analysis methodologies to quantify system performance. Resilience analysis methodologies should enable evaluation of the consequences of various disruptions and the relative effectiveness of potential mitigations. To address this challenge, RES synthesized the common components of Sandia's resilience frameworks into an integrated methodology for energy and infrastructure resilience analysis. This report documents, demonstrates, and extends this methodology.
Microstructural Changes to Thermally Sprayed Materials Subjected to Dynamic Compression
Mccoy, Chad A.; Moore, Nathan W.; Vackel, Andrew
Dynamic compression of materials can induce a variety of microstructural changes. Because thermally sprayed materials have highly complex microstructures, the pressure at which changes occur cannot be predicted a priori. In addition, typical in-situ measurements such as velocimetry are unable to adequately diagnose microstructural changes such as failure or pore collapse. Quasi-isentropic compression experiments with sample recovery were conducted to examine microstructural changes in thermally sprayed tantalum and tantalum-niobium blends at pressures up to 8 GPa. Spall fracture was observed in all tests, and post-shot pore volume decreased relative to the initial state. The blended material exhibited larger spall planes, with fracture occurring at interphase boundaries. The pressure at which pore collapse is complete was estimated to be ~26 GPa for pure tantalum and ~19 GPa for the tantalum-niobium blend under these loading conditions.
ALEGRA Parallel Scaling for Shock in a Heterogeneous Structure
We investigate the strong and weak parallel scaling performance of the ALEGRA multiphysics finite element program when solving a problem involving shock propagation through a heterogeneous material. We determine that ALEGRA scales well over a wide range of problem sizes, cores, and element sizes, and that scaling generally improves as the minimum element size in the mesh increases.
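The strong- and weak-scaling efficiencies referenced above follow standard definitions; this minimal sketch uses made-up timings, not ALEGRA measurements.

```python
# Standard parallel-scaling metrics (illustrative timings, not ALEGRA data).

def strong_scaling_efficiency(t_base, p_base, t, p):
    """Fixed total problem size: speedup relative to the baseline core
    count, normalized by the ideal speedup p / p_base."""
    speedup = t_base / t
    return speedup / (p / p_base)

def weak_scaling_efficiency(t_base, t):
    """Fixed work per core: ideal runtime stays constant, so efficiency
    is just the ratio of baseline to observed time."""
    return t_base / t

# Hypothetical strong scaling: 100 s on 64 cores, 30 s on 256 cores.
print(strong_scaling_efficiency(100.0, 64, 30.0, 256))  # ~0.83
# Hypothetical weak scaling: 100 s baseline, 110 s at the larger scale.
print(weak_scaling_efficiency(100.0, 110.0))  # ~0.91
```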
Efficacy and Delivery of Novel FAST Agents for Coronaviruses
We proposed to test and develop advanced delivery for novel agents from our collaborators' Facile Accelerated Specific Therapeutics (FAST) platform to reduce coronavirus replication. Sachi Bioworks Inc., Prof. Anushree Chatterjee, and Prof. Prashant Nagpal at the University of Colorado Boulder have developed a bioinformatics and synthesis pipeline to produce sequence-specific theranostic agents (agents that can serve as therapies and/or diagnostics) that are inherently transported into the cytoplasm of mammalian host cells and sequence-specifically interfere in nucleic acid replication. The agent comprises a small nanoparticle (2-5 nm), chosen for ideal cellular transport and/or imaging, conjugated to a short, synthetic DNA analog oligomer designed to bind one or more target viral sequences. The sequence-specific, high-affinity binding of the FAST agent to its target prevents nucleic acid replication. While the small nanoparticle facilitates delivery in vitro, we plan to package the FAST agents into a larger nanoparticle (80-300 nm) for future in vivo delivery applications. Our team at Sandia has expertise in encapsulating biomolecules, including protein, DNA, and RNA, into solid lipid nanoparticles (LNP) and lipid-coated mesoporous silica nanoparticles (LC-MSN), and has shown successful delivery in mouse models to multiple tissues. Our team focused on formulation parameters for FAST agents in LNPs and LC-MSNs for enhanced delivery and/or efficacy and in vivo translation. We used lipid formulas that have been shown in the literature to facilitate in vitro and, more importantly, in vivo delivery. In the work discussed below, we successfully demonstrate loading and release of FAST agents on silica cores and stable LC-MSNs in a size range reasonable for in vivo testing.
Joint Analysis of Program Data Representations using Machine Learning for Improved Software Assurance and Development Capabilities
Heidbrink, Scott; Rodhouse, Kathryn N.; Dunlavy, Daniel M.; Cooper, Alexis; Zhou, Xin
We explore the use of multiple deep learning models for detecting flaws in software programs. Current, standard approaches for flaw detection rely on a single representation of a software program (e.g., source code or a program binary). We illustrate that, by using techniques from multimodal deep learning, we can simultaneously leverage multiple representations of software programs to improve flaw detection over single representation analyses. Specifically, we adapt three deep learning models from the multimodal learning literature for use in flaw detection and demonstrate how these models outperform traditional deep learning models. We present results on detecting software flaws using the Juliet Test Suite and Linux Kernel.
Resistive heating in an electrified domain with a spherical inclusion: an ALEGRA verification study
Rodriguez, Angel E.; Siefert, Christopher; Niederhaus, John H.J.
A verification study is conducted for the ALEGRA software, using the problem of an electrified medium with a spherical inclusion, paying special attention to resistive heating. We do so by extending an existing analytic solution for this problem to include both conducting and insulating inclusions, and we examine the effects of mesh resolution and mesh topology, considering both body-fitted and rectangular meshes containing mixed cells. We present observed rates of convergence with respect to mesh refinement for four electromagnetic quantities: electric potential, electric field, current density and Joule power.
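Observed convergence rates in such mesh-refinement studies are commonly estimated from errors at successive resolutions via the model e ≈ C h^p; this sketch uses fabricated error values, not the ALEGRA results.

```python
# Estimate the observed order of convergence p from errors on two meshes,
# assuming the error model e ~ C * h^p (fabricated values for illustration).
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """p = log(e_coarse / e_fine) / log(h_coarse / h_fine)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Fabricated errors as h is halved; second-order behavior gives p ~ 2.
errors = [4.0e-2, 1.0e-2, 2.5e-3]
for e_c, e_f in zip(errors, errors[1:]):
    print(f"p = {observed_order(e_c, e_f):.2f}")
```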
Partitioning of Complex Fluids at Mineral Surfaces
Greathouse, Jeffery A.; Long, Daniel M.; Xu, Guangping; Yoon, Hongkyu; Kim, Iltai; Jungjohann, Katherine L.
This report summarizes the results obtained during the LDRD project entitled "Partitioning of Complex Fluids at Mineral Interfaces." This research addressed fundamental aspects of such interfaces, which are relevant to energy-water applications in the subsurface, including fossil energy extraction and carbon sequestration. The project directly addresses the problem of selectivity of complex fluid components at mineral-fluid interfaces, where complex fluids are defined as mixtures of hydrophobic and hydrophilic components, e.g., water, aqueous ions, and polar/nonpolar organic compounds. Specifically, the project investigates how adsorption selectivity varies with surface properties and fluid composition. Both experimental and molecular modeling techniques were used to better understand trends in wettability on mineral surfaces. The experimental techniques spanned the macroscale (contact angle measurements) to the nanoscale (cryogenic electron microscopy and vibrational spectroscopy). We focused on an anionic surfactant and a well-characterized mineral phase representative of clay phases present in oil- and gas-producing shale deposits. Collectively, the results consistently demonstrate that the presence of surfactant in the aqueous fluid significantly affects the mineral-fluid interfacial structure. Experimental and molecular modeling results reveal details of the surfactant structure at the interface and how this structure varies with surfactant coverage and fluid composition.
Understanding Microstructural Effects on Dynamic Performance Towards the Development of Shock Metamaterials
Branch, Brittany A.; Specht, Paul E.; Ruggles, Timothy; Moore, David G.; Jared, Bradley H.
With recent advances in additive manufacturing (AM), long-range periodic lattice assemblies with unique geometric and topological structures are being developed for vibration and shock mitigation components in aerospace and military applications. There has been extensive work on understanding the static properties associated with varying topology of these lattice architectures, but there is almost no understanding of microstructural effects in such structures under high-strain-rate dynamic loading conditions. Here we report the shock behavior of lattices with varying intrinsic grain structures achieved by post-process annealing. High-resolution 316L stainless steel lattices were 3D printed by a laser powder bed fusion machine and characterized by computed tomography. Subsequent annealing produced stress-relieved and recrystallized lattices. Overall, the lattices had strong cubic texture aligning with the x-, y-, and z-directions of the build, with a preference outside the build direction (z). The recrystallized sample had more equiaxed polygonal grains and a layer of BCC ferrite, approximately one grain thick, at the surface of the structure. Upon dynamic compression, the as-deposited lattice showed steady compaction behavior, while the heat-treated lattices exhibited negative velocity behavior indicative of failure. We attribute this to the stiffer BCC ferrite in the annealed lattices becoming damaged and fragmenting during compression.
Arctic Coastal Erosion: Modeling and Experimentation
Bull, Diana L.; Bristol, Emily M.; Brown, Eloise; Choens II, Robert C.; Connolly, Craig T.; Flanary, Christopher; Frederick, Jennifer M.; Jones, Benjamin M.; Jones, Craig A.; Ward Jones, Melissa; Mcclelland, James W.; Mota, Alejandro; Tezaur, Irina K.
Increasing Arctic coastal erosion rates have put critical infrastructure and native communities at risk while also mobilizing ancient organic carbon into modern carbon cycles. Although the Arctic comprises one-third of the global coastline and has some of the fastest-eroding coasts, current tools for quantifying permafrost erosion are unable to explain the episodic, storm-driven erosion events. Our approach, mechanistically coupling oceanographic predictions with a terrestrial model to capture the thermo-mechanical dynamics of erosion, enables this much-needed treatment of transient erosion events. The Arctic Coastal Erosion (ACE) Model consists of oceanographic and atmospheric boundary conditions that force a coastal terrestrial permafrost environment in Albany (a multi-physics-based finite element model). An oceanographic modeling suite (consisting of WAVEWATCH III, Delft3D-FLOW, and Delft3D-WAVE) produced time-dependent surge and run-up boundary conditions for the terrestrial model. In the terrestrial model, a coupling framework unites the mechanical and thermal aspects of erosion. 3D stress/strain fields develop in response to a plasticity model of the permafrost that is controlled by the frozen water content, determined by modeling 3D heat conduction and solid-liquid phase change. This modeling approach permits failure from any allowable deformation (block failure, slumping, etc.). Extensive experimental work has underpinned the ACE Model development, including field campaigns to measure in situ ocean and erosion processes, strength properties derived from thermally driven geomechanical experiments, and extensive physical composition and geochemical analyses. Combined, this work offers the most comprehensive and physically grounded treatment of Arctic coastal erosion available in the literature.
The ACE model and experimental results can be used to inform scientific understanding of coastal erosion processes, contribute to estimates of geochemical and sediment land-to-ocean fluxes, and facilitate infrastructure susceptibility assessments.
Coupling of Laminar-Turbulent Transition with RANS Computational Fluid Dynamics
Wagnild, Ross M.; Fike, Jeffrey; Kucala, Alec; Krygier, Michael; Bitter, Neal
This project combines several new concepts to create a boundary layer transition prediction capability that is suitable for analyzing modern hypersonic flight vehicles. The first new concept is the use of "optimization" methods to detect the hydrodynamic instabilities that cause boundary layer transition; this method removes the need for many limiting assumptions of other methods and enables quantification of the interactions between boundary layer instabilities and the flow field imperfections that generate them. The second new concept is the execution of transition analysis within a conventional hypersonics CFD code, using the same mesh and numerical schemes for the transition analysis and the laminar flow simulation. This feature enables rapid execution of transition analysis with less user oversight required and no interpolation steps needed.
Multi-Axis Resonant Plate Shock Testing Evaluation and Test Specification Development
Sisemore, Carl; Babuska, Vit; Flores, Robert X.
Resonant plate testing is a shock test method that is frequently used to simulate pyroshock events in the laboratory. Recently, it was discovered that if the unit under test is installed at an off-center location, a triaxial accelerometer records a shock response in three directions, and the resulting shock response spectra implied that the test may have qualified the component in three directions simultaneously. The purpose of this research project was to evaluate this idea of multi-axis shock testing to determine whether it truly produces a multi-axis shock environment and whether such a test could be used as an equivalent component qualification test. A study was conducted using generic, additively manufactured components tested on a resonant plate, along with an investigation of plate motion to evaluate the component response to off-center plate excitation. The data obtained here, along with the analytical simulations performed, indicate that off-center resonant plate tests are not actually three-axis shock tests, but rather single-axis shocks at an arbitrary angle dictated by the location of the unit under test on the plate. This conclusion is supported by the fact that only one vectored shock input is provided to the component in a resonant plate test. Thus, the output response is a coupled response of the transverse plate vibration and the rotational motion of the component on the plate. Additionally, a multi-axis shock test defined by three single-axis test environments always results in a significant component over-test in one direction.
How Low Can You Go? Using Synthetic 3D Imagery to Drastically Reduce Real-World Training Data for Object Detection
Gastelum, Zoe N.; Shead, Timothy M.
Deep convolutional neural networks (DCNNs) currently provide state-of-the-art performance on image classification and object detection tasks, and there are many global security mission areas where such models could be extremely useful. Crucially, the success of these models is driven in large part by the widespread availability of high-quality open-source data sets such as ImageNet, Common Objects in Context (COCO), and KITTI, which contain millions of images with thousands of unique labels. However, images of global security-relevant objects of interest can be difficult to obtain: relevant events are low frequency and high consequence; the content of relevant images is sensitive; and adversaries and proliferators seek to obscure their activities. In these cases where exemplar data is hard to come by, even fine-tuning an existing model with available data can be effectively impossible. Recent work demonstrated that models can be trained using a combination of real-world and synthetic images generated from 3D representations; that such models can exceed the performance of models trained using real-world data alone; and that the generated images need not be perfectly realistic (Tremblay, et al., 2018). However, this approach still required hundreds to thousands of real-world images for training and fine-tuning, which for sparse, global security-relevant datasets can be an unrealistic hurdle. In this research, we validate the performance and behavior of DCNN models as we drive the number of real-world images used for training object detection tasks down to a minimal set. We perform multiple experiments to identify the best approach to train DCNNs from an extremely small set of real-world images.
In doing so, we: Develop state-of-the-art, parameterized 3D models based on real-world images and sample from their parameters to increase the variance in synthetic image training data; Use machine learning explainability techniques to highlight and correct through targeted training the biases that result from training using completely synthetic images; and Validate our results by comparing the performance of the models trained on synthetic data to one another, and to a control model created by fine-tuning an existing ImageNet-trained model with a limited number (hundreds) of real-world images.
Assessing the Vulnerability of Unmanned Aircraft Systems to Directed Acoustic Energy
The increasingly large payloads of Unmanned Aircraft Systems (UASs) are dramatically increasing the threat to the nuclear enterprise. Current mitigation using RF interference is effective, but it is not feasible against fully autonomous systems and is prohibited in many areas. A new approach to UAS threat mitigation is needed that does not create radio interference but is effective against any type of vehicle. At present there is no commercial counter-UAS system that directly assaults the MEMS gyros and accelerometers in the Inertial Measurement Unit (IMU) on the aircraft, but lab testing has revealed resonances in some IMUs that make them susceptible to moderate-amplitude acoustic monotones. Sandia's energetic materials facility has enabled a quick and thorough exploration of UAS vulnerability to directed acoustic energy by using intense acoustic impulses to destabilize or down a UAS. We have: 1) detonated/deflagrated explosive charges of various sizes; 2) accurately measured impulse pressure and pulse duration; 3) determined what magnitude of acoustic insult to the IMU disrupts flight, and for how long; and 4) determined whether the air blast/shock wave on the aircraft/propellers disrupts flight.
FY20Q4 report for ATDM AD projects to ECP [Kokkos, etc.]
Activities, accomplishments, next steps and outreach are reported, primarily related to the Kokkos project.
Summary Report for the NEPA Impact Analysis. Revision 1
Zeitler, Todd Z.; Brunell, Sarah B.; Feng, Lianzhong; Kicker, Dwayne C.; Kim, Sungtae; Long, Jennifer J.; Rechard, Robert P.; Hansen, Clifford; Wagner, Stephen W.
The Waste Isolation Pilot Plant (WIPP), located in southeastern New Mexico, has been developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of defense-related transuranic (TRU) waste. Containment of TRU waste at the WIPP facility is governed by standards set forth in Title 40 of the Code of Federal Regulations (CFR), Part 191. The DOE assesses compliance with the containment standards according to the Certification Criteria in Title 40 CFR Part 194 by means of Performance Assessment (PA) calculations performed by Sandia National Laboratories (SNL). WIPP PA calculations estimate the probability of radionuclide releases from the repository to the accessible environment for a regulatory period of 10,000 years after facility closure. The DOE Carlsbad Field Office (CBFO) has initiated a National Environmental Policy Act (NEPA) action for a proposal to excavate and use additional TRU waste disposal panels at the WIPP facility. This report documents an analysis undertaken as part of an effort to evaluate the potential environmental consequences of the proposed action. Although not explicitly required for a NEPA analysis, evaluations of a dose indicator for hypothetical members of the public after final facility closure are presented in this report. The analysis is carried out in two stages: first, PA calculations quantify the potential releases to the accessible environment over a 10,000-year post-closure period; second, doses are evaluated for three hypothetical exposure pathways using the conservative radionuclide concentrations assumed to be released to the accessible environment.
ISO 55001 Asset Management Gap Analysis - Final Results [Spreadsheet]
Otero, Pete; Foster, Birgitta T.; Clark, Waylon T.; Evans, Christopher A.; Zavadil, John; Michaels, Jeremy; Sholtis, Diane; Martinez, Gabriel
A spreadsheet showing the final results from the ISO 55001 Section Alignment ratings is shown, including gap analysis and IAM maturity ratings.
FY20Q4 Report for ATDM AD Projects to ECP [SPARC, etc.]
Sections include: performance to plan, components, exceeds, and lessons learned. Updated areas include: SPARC, UMR, EMPIRE, Panzer.
Review of the Nuclear Energy Agency (NEA) Ancillary Thermodynamic Database (TDB) Volume (DRAFT REV. 0)
Jove-Colon, Carlos F.; Sanchez, Amanda
The Nuclear Energy Agency (NEA) Ancillary data volume comprises thermodynamic data for mineral and aqueous species that, in addition to the Auxiliary Data (as referred to in previous NEA thermodynamic data volumes), are necessary for calculations of chemical interactions relevant to radioactive waste management and nuclear energy. This SAND report is a review of the thermodynamic data parameters in the NEA Ancillary data critical review volume. The review mainly involves comparison with other thermodynamic data assessments, analysis of thermodynamic parameters, and examination of data sources. Only new and updated data parameters were considered in this review. Overall, no major inconsistencies or errors were found within the scope of the comparisons conducted. Some remarks were noted, for example, regarding relevant studies and comparisons, bearing on the analysis and retrieval of thermodynamic data parameters, that were not cited in the respective sections.
Pre-Symptomatic COVID Screening
Temperature checks for fever are extensively used for preliminary COVID screenings but are ineffective during the incubation stage of infection, when a person is asymptomatic. Researchers at the European Centre for Disease Prevention and Control concluded that approximately 75% of passengers infected with COVID-19 and traveling from affected Chinese cities would not be detected by early screening. Core body temperature is normally kept within a narrow range and has the smallest relative standard deviation of all vital signs. Heat in the body is prioritized around internal organs at the expense of the periphery by controlling blood flow; in fact, blood flow to the skin may vary by a factor of 100 depending on thermal conditions. This adaptation causes rapid temperature fluctuations in different skin regions, driven by changes in cardiac output, metabolism, and likely cytokine diffusion during inflammation, that would not be seen in average core body temperature. Current IR and thermal scanners used for temperature checks are not necessarily reflective of core body temperature and require cautious interpretation, as they frequently produce false positive and false negative diagnoses. Handheld thermometers measure average skin temperatures and can give readings that differ from core body temperature by as much as 7°. Rather than focusing on a core body temperature threshold assessment, we believe that the variability of temperature patterns measured by a novel wearable transdermal microneedle sensor will be more sensitive to infections in the incubation stage. We therefore propose to develop a wearable transdermal temperature sensor, using established Sandia microneedle technology, for pre-symptomatic COVID screening that can additionally be used to monitor disease progression at later stages.
Predicting Future Disease Burden in a Rapidly Changing Climate
Powell, Amy J.; Tezaur, Irina K.; Davis, Warren L.; Peterson, Kara J.; Rempe, Susan; Smallwood, Chuck R.; Roesler, Erika L.
The interplay of a rapidly changing climate and infectious disease occurrence is emerging as a critical topic, requiring investigation of possible direct, as well as indirect, connections between disease processes and climate-related variation and phenomena. First, we introduce and overview three infectious disease exemplars (dengue, influenza, valley fever) representing different transmission classes (insect-vectored, human-to-human, environmentally transmitted) to illuminate the complex and significant interplay between climate and disease processes, as well as to motivate discussion of how Sandia can transform the field and change our understanding of climate-driven infectious disease spread. We also review state-of-the-art epidemiological and climate modeling approaches, together with data analytics and machine learning methods, potentially relevant to climate and infectious disease studies. We synthesize the modeling and disease-exemplar information, suggest initial avenues for research and development (R&D) in this area, and propose potential sponsors for this work. Whether directly or indirectly, it is certain that a rapidly changing climate will alter global disease burden. The trajectory of climate change is an important control on this burden at local, regional, and global scales. The efforts proposed herein respond to the National Research Council's call for the creation of a multidisciplinary institute that would address critical aspects of these interlocking, cascading crises.
Asynchronous Ballistic Reversible Computing using Superconducting elements
Lewis, Rupert M.; Missert, Nancy; Henry, Michael D.; Frank, Michael P.
Computing uses energy. At the bare minimum, erasing information in a computer increases entropy: Landauer calculated that ~kBT ln(2) joules are dissipated per bit of information erased. While the success of Moore's law has allowed increasing computing power and efficiency for many years, these improvements are coming to an end. This project asks whether there is a way to continue those gains by circumventing Landauer's limit through reversible computing. We explore a new reversible computing paradigm, asynchronous ballistic reversible computing (ABRC). The ballistic nature of data in ABRC matches well with superconductivity, which provides a low-loss environment and a quantized bit encoding, the fluxon. We discuss both of these and our development of a superconducting fabrication process at Sandia. We describe a fully reversible 1-bit memory cell based on fluxon dynamics. Building on this model, we propose several other gates that may also offer reversible operation.
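The Landauer bound quoted above is straightforward to evaluate. As a quick illustration (not part of the report), the minimum dissipation per erased bit at a given temperature is:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def landauer_limit(temperature_k: float) -> float:
    """Minimum energy dissipated per erased bit: E = k_B * T * ln(2)."""
    return K_B * temperature_k * math.log(2)

# At room temperature (300 K) the bound is about 2.9e-21 J per bit,
# far below the switching energy of present-day CMOS logic; at the
# ~4 K operating point of superconducting electronics it is smaller still.
print(landauer_limit(300.0))
print(landauer_limit(4.2))
```

The two orders of magnitude between room temperature and liquid-helium temperature are one reason superconducting platforms are attractive for probing near-Landauer operation.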
Rapid Assessment of Autoignition Propensity in Novel Fuels and Blends
Sheps, Leonid; Buras, Zachary; Zador, Judit; Au, Kendrew; Safta, Cosmin
We developed a computational strategy to correlate bulk combustion metrics of novel fuels and blends in the low-temperature autoignition regime with measurements of key combustion intermediates in a small-volume, dilute, high-pressure reactor. We used neural net analysis of a large simulation dataset to obtain an approximate correlation and proposed experimental and computational steps needed to refine such a predictive correlation. We also designed and constructed a high-pressure laboratory apparatus to conduct the proposed measurements and demonstrated its performance on three canonical fuels: n-heptane, i-octane, and dimethyl ether.
A Quantum Analog Coprocessor for Correlated Electron Systems Simulation
Baczewski, Andrew D.; Brickson, Mitchell I.; Campbell, Quinn; Jacobson, Noah T.; Maurer, Leon
Analog quantum simulation is an approach for studying physical systems that might otherwise be computationally intractable to simulate on classical high-performance computing (HPC) systems. The key idea behind analog quantum simulation is the realization of a physical system with a low-energy effective Hamiltonian that is the same as the low-energy effective Hamiltonian of some target system to be studied. Purpose-built nanoelectronic devices are a natural candidate for implementing the analog quantum simulation of strongly correlated materials that are otherwise challenging to study using classical HPC systems. However, realizing devices that are sufficiently large to study the properties of a non-trivial material system (e.g., those described by a Fermi-Hubbard model) will eventually require the fabrication, control, and measurement of at least O(10) quantum dots or other engineered quantum impurities. As a step toward large-scale analog or digital quantum simulation platforms based on nanoelectronic devices, we propose a new approach to analog quantum simulation that makes use of the large Hilbert space dimension of the electronic baths that are used to adjust the occupancy of one or a few engineered quantum impurities. This approach to analog quantum simulation allows us to study a wide array of quantum impurity models. We can further augment the computational power of such an approach by combining it with a classical computer to facilitate dynamical mean-field theory (DMFT) calculations. DMFT replaces the solution of a lattice problem with the solution of a family of localized impurity problems with bath couplings that are adjusted to satisfy a self-consistency condition between the two models. In DMFT, the computationally challenging task is the high-accuracy solution of an instance of a quantum impurity model that is determined self-consistently in coordination with a mean-field calculation.
We propose using one or a few engineered quantum impurities with adjustable couplings to baths to realize an analog quantum coprocessor that effects the solution of such a model through measurements of a physical quantum impurity, operating in coordination with a classical computer to achieve a self-consistent solution to a DMFT calculation. We focus on implementation details relevant to a number of technologies for which Sandia has design, fabrication, and measurement expertise. The primary technical advances outlined in this report concern the development of a supporting modeling capability. As with all analog quantum simulation platforms, the successful design and operation of individual devices depends critically on one's ability to predict the effective low-energy Hamiltonian governing its dynamics. Our project has made this possible and lays the foundation for future experimental implementations.
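The hybrid quantum-classical structure of such a DMFT calculation can be sketched schematically. In this toy sketch, `impurity_solver` and the bath-update formula are invented stand-ins (in the proposed scheme, the impurity step would come from measurements of the physical quantum impurity); only the alternate-until-fixed-point structure carries over:

```python
def impurity_solver(bath_coupling):
    # Stand-in for the impurity solution step; in the proposed coprocessor
    # this value would be obtained by measuring the physical impurity.
    return 0.5 / (1.0 + bath_coupling)

def dmft_loop(coupling=1.0, mixing=0.5, tol=1e-8, max_iter=500):
    """Alternate impurity solution and bath-coupling update until the
    self-consistency condition is met (a toy fixed-point iteration)."""
    for _ in range(max_iter):
        sigma = impurity_solver(coupling)      # "quantum" half of the loop
        new_coupling = 1.0 / (1.0 + sigma)     # toy classical mean-field update
        if abs(new_coupling - coupling) < tol:
            return new_coupling, sigma
        # Linear mixing damps oscillations between iterations.
        coupling = (1.0 - mixing) * coupling + mixing * new_coupling
    raise RuntimeError("self-consistency not reached")
```

The classical computer supplies the mean-field update and convergence test; the computationally hard impurity step is the part offloaded to the analog device.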
Handheld Biosensor for COVID-19 Screening
Branch, Darren W.; Hayes, Dulce C.
We have made significant progress toward the development of an integrated nucleic acid amplification system for Autonomous Medical Devices Incorporated's (AMDI's) Optikus handheld diagnostic device. In this effort, we developed a set of loop-mediated isothermal amplification (LAMP) primers for SARS-CoV-2 and then demonstrated amplification directly on a surface acoustic wave (SAW) sensor. We built the associated hardware and developed C code to control the amplification process. The goal of this project was to develop a nucleic acid amplification assay that is compatible with SAW sensors to enable both nucleic acid and serological testing in a single handheld diagnostic device. Toward this goal, AMDI is collaborating with Sandia National Laboratories to develop a rapid, portable diagnostic screening device that utilizes Sandia's unique surface acoustic wave (SAW) biosensor for COVID-19 detection. Previously, the Sandia-AMDI SAW sensor has successfully detected multiple high-profile bacterial and viral pathogens, including Ebola, HIV, Sin Nombre, and anthrax. Over the last two years, AMDI and Sandia have significantly improved the sensitivity and detection capability of the SAW biosensor and have also developed a modular, handheld, portable platform called the Optikus, which uses CD microfluidics and handheld instrumentation to automate all sample preparation, reagent introduction, sample delivery, and measurement for a number of different assay targets. We propose to use this platform for the development of a rapid (<30 minutes), point-of-care diagnostic test for detection of COVID-19 from nasal swab samples.
Neuromorphic scaling advantages for energy-efficient random walk computations
Smith, J.D.; Hill, Aaron; Reeder, Leah; Franke, Brian C.; Lehoucq, Rich; Parekh, Ojas D.; Severa, William M.; Aimone, James B.
Computing stands to be radically improved by neuromorphic computing (NMC) approaches inspired by the brain's incredible efficiency and capabilities. Most NMC research, which aims to replicate the brain's computational structure and architecture in man-made hardware, has focused on artificial intelligence; however, it is less explored whether this brain-inspired hardware can provide value beyond cognitive tasks. We demonstrate that the high degree of parallelism and configurability of spiking neuromorphic architectures makes them well-suited to implement random walks via discrete-time Markov chains. Such random walks are useful in Monte Carlo methods, which represent a fundamental computational tool for solving a wide range of numerical computing tasks. Additionally, we show how the mathematical basis for a probabilistic solution involving a class of stochastic differential equations can leverage those simulations to provide solutions for a range of broadly applicable computational tasks. Despite being in an early development stage, we find that NMC platforms, at a sufficient scale, can drastically reduce the energy demands of high-performance computing platforms.
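As an illustration of the kind of computation being mapped onto neuromorphic hardware, a discrete-time Markov chain random walk estimated by many independent walkers can be sketched in a few lines. This is a plain software sketch of the Monte Carlo task, not the spiking implementation; the chain and states are illustrative:

```python
import random

def random_walk_hits(P, start, absorbing, n_walkers=10000, seed=0):
    """Estimate absorption probabilities of a discrete-time Markov chain
    by simulating many independent random walkers."""
    rng = random.Random(seed)
    counts = {s: 0 for s in absorbing}
    for _ in range(n_walkers):
        state = start
        while state not in absorbing:
            # Sample the next state from row P[state] of the transition matrix.
            r, acc = rng.random(), 0.0
            for nxt, p in enumerate(P[state]):
                acc += p
                if r < acc:
                    state = nxt
                    break
        counts[state] += 1
    return {s: c / n_walkers for s, c in counts.items()}

# 1-D walk on states 0..4; states 0 and 4 absorbing; fair coin elsewhere.
P = [[1.0, 0.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0, 0.5, 0.0],
     [0.0, 0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 0.0, 1.0]]
print(random_walk_hits(P, start=2, absorbing={0, 4}))
```

Each walker is independent, which is exactly the parallelism a spiking architecture can exploit: one walker (or many) per neural circuit, with transition sampling done in local spiking dynamics.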
Bioscience COVID Rapid Response Report
The COVID-19 disease outbreak and its impact on global health and economies have highlighted the national security threat posed by pathogens with pandemic potential and the need for rapid development of effective diagnostics and medical countermeasures. The Bioscience IA selected for funding rapid COVID LDRD project proposals that addressed critical R&D gaps in pandemic response and could be accomplished in 1-3 months with the requested funding. In total, the Bioscience IA funded nine rapid projects that addressed 1) rapid and accurate methods for SARS-CoV-2 RNA detection, 2) modeling tools to help prioritize populations for diagnostic testing, 3) bioinformatic tools to track SARS-CoV-2 genomic sequence changes over time, 4) molecular inhibitors of SARS-CoV-2 cellular infection, and 5) methods for rapid staging of COVID-19 disease to enable administration of more effective treatments. In addition, LDRD funded one larger project to be completed in FY21 that leverages Sandia capabilities to address the need for platform diagnostics and therapeutics that can be rapidly tailored against emerging pathogen targets.
Experimental and Theoretical Studies of Ultrafast Vibrational Energy Transfer Dynamics in Energetic Materials
Ramasesha, Krupa; Wood, M.A.; Cole-Filipiak, Neil C.; Knepper, Robert A.
Energy transfer through anharmonically-coupled vibrations influences the earliest chemical steps in shockwave-induced detonation in energetic materials. A mechanistic description of vibrational energy transfer is therefore necessary to develop predictive models of energetic material behavior. We performed transient broadband infrared spectroscopy on hundreds of femtoseconds to hundreds of picoseconds timescales, as well as density functional theory and molecular dynamics simulations, to investigate the evolution of vibrational energy distribution in thin film samples of pentaerythritol tetranitrate (PETN), 1,3,5-trinitroperhydro-1,3,5-triazine (RDX), and 2,4,6-triamino-1,3,5-trinitrobenzene (TATB). Experimental results show dynamics on multiple timescales, providing strong evidence for coupled vibrations in these systems, as well as material-dependent evolution on tens to hundreds of picosecond timescales. Theoretical results also reveal pathways and distinct timescales for energy transfer through coupled vibrations in the three investigated materials, providing further insight into the mechanistic underpinnings of energy transfer dynamics in energetic material sensitivity.
Noise Erasure in Quantum-Limited Current Amplifiers
Harris, Charles T.; Lu, Tzu M.; Bethke, Donald; Lewis, Rupert M.; Del Skinner Ramos, Suelicarmen
Superconducting quantum interference devices (SQUIDs) are extraordinarily sensitive to magnetic flux and thus make excellent current amplifiers for cryogenic applications. One such application of high interest to Sandia is the set-up and state read-out of quantum dot based qubits, where a qubit state is read out from a short current pulse (microseconds to milliseconds long) of approximately 100 pA, a signal that is easily corrupted by noise in the environment. A parametric SQUID amplifier can offer high bandwidth (in the GHz range) and low power dissipation (less than 1 pW), and can be easily incorporated into multi-qubit systems. In this SAIL LDRD, we will characterize the noise performance of the parametric amplifier front end -- the SQUID -- in an architecture specific to current readout for spin qubits. Noise is a key metric in amplification, and identifying noise sources will allow us to optimize the system to reduce their effects, resulting in higher fidelity readout. This effort represents a critical step in creating the building blocks of a high speed, low power, parametric SQUID current amplifier that will be needed in the near term as quantum systems with many qubits begin to come on line in the next few years.
Efficient Scalable Tomography of Many-Qubit Quantum Processors
Quantum computing has the potential to realize powerful and revolutionary applications. A quantum computer can, in theory, solve certain problems exponentially faster than its classical counterparts. The current state-of-the-art devices, however, are too small and noisy to practically realize this goal. An important tool for the advancement of quantum hardware, called model-based characterization, seeks to learn what types of noise are exhibited in a quantum processor. This technique, however, is notoriously difficult to scale up to even modest numbers of qubits, and has been limited to just 2 qubits until now. In this report, we present a novel method for performing model-based characterization, or tomography, on a many-qubit quantum processor. We consider up to 10 qubits, but the technique is expected to scale to even larger systems.
A Multi-Instance learning Framework for Seismic Detectors
Ray, Jaideep; Wang, Fulton; Young, Christopher J.
In this report, we construct and test a framework for fusing the predictions of an ensemble of seismic wave detectors. The framework is drawn from multi-instance learning and is meant to improve the predictive skill of the ensemble beyond that of the individual detectors. We show how the framework allows the use of multiple features derived from the seismogram to detect seismic wave arrivals, as well as how it allows only the most informative features to be retained in the ensemble. The computational cost of the "ensembling" method is linear in the size of the ensemble, allowing a scalable method for monitoring multiple features/transformations of a seismogram. The framework is tested on teleseismic and regional P-wave arrivals at the IMS (International Monitoring System) station at Warramunga, NT, Australia, and at the PNSU station in the University of Utah's monitoring network.
Applying Compression-Based Metrics to Seismic Data in Support of Global Nuclear Explosion Monitoring
Matzen, Laura E.; Ting, Christina; Field, Richard V.; Morrow, J.D.; Brogan, Ronald; Young, Christopher J.; Zhou, Angela; Trumbo, Michael C.S.; Coram, Jamie L.
The analysis of seismic data for evidence of possible nuclear explosion testing is a critical global security mission that relies heavily on human expertise to identify and mark seismic signals embedded in background noise. To assist analysts in making these determinations, we adapted two compression distance metrics for use with seismic data. First, we demonstrated that the Normalized Compression Distance (NCD) metric can be adapted for use with waveform data and can identify the arrival times of seismic signals. Then we tested an approximation for the NCD called Sliding Information Distance (SLID), which can be computed much faster than NCD. We assessed the accuracy of the SLID output by comparing it to both the Akaike Information Criterion (AIC) and the judgments of expert seismic analysts. Our results indicate that SLID effectively identifies arrival times and provides analysts with useful information that can aid their analysis process.
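The NCD is defined as NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(s) is the compressed length of s. A minimal byte-string sketch using zlib as the compressor (the report applies the metric to seismic waveform data; the compressor choice and the sequences below are illustrative stand-ins, not the study's configuration):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    where C(s) is the compressed length of s."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Two sequences that share content compress well together (small distance);
# a sequence containing a distinct "arrival" segment is farther away.
background = b"quiet background noise " * 40
with_signal = b"quiet background noise " * 20 + b"SIGNAL ARRIVAL " * 20
print(round(ncd(background, background), 3),
      round(ncd(background, with_signal), 3))
```

Scanning such a distance along a waveform is the intuition behind detecting arrival times: the distance jumps where the local statistics change, which is what the faster SLID approximation exploits.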
Simulation Analysis of Geometry and Material Effects for Dropkinson Bar
Brif, Constantin; Stershic, Andrew J.
The reported research is motivated by the need to address a key issue affecting the Dropkinson bar apparatus. This unresolved issue is the interference of the stress wave reflected from the bar-beam boundary with the measurement of the stress-strain response of a material tested in the apparatus. The purpose of the wave beam that is currently connected to the bar is to dissipate the stress wave, but the portion of the wave reflected from the bar-beam boundary is still significant. First, we focused on understanding which parameters affect the reflected wave's arrival time at a strain gauge. Specifically, we used finite-element numerical simulations with the Sierra/SM module to study the effects of various bar-beam connection fixities, alternative wave beam materials, and alternative geometries of the Dropkinson bar system based on a monolithic design. The conclusion of this study is that a partial reflection always occurs at the bar-beam boundary (or, for a monolithic design, at a point where the bar geometry changes). Therefore, given a fixed total length of the bar, it is impossible to increase the reflected wave's arrival time by any significant amount. After reaching this conclusion, we focused instead on trying to minimize the energy of the reflected stress wave circulating up and down through the bar over a relatively long period of time (10 ms). Once again, we used numerical simulations with the Sierra/SM module to investigate the effects of various bar-beam connection fixities, alternative wave beam materials, and parameters of an asymmetric monolithic design of the bar-and-beam system. This study demonstrated that various parameters can significantly affect the energy of the wave reflections, with the difference between best and worst configurations being about one order of magnitude in terms of energy. 
Based on the obtained results, we conclude with concrete takeaways for Dropkinson bar users and propose potential directions for future research and optimization.
Conditional Generative Adversarial Networks for Solving Heat Transfer Problems
Martinez, Matthew T.; Heiner, Olivia N.
Generative Adversarial Networks (GANs) have been used as a deep learning approach to solving physics and engineering problems. Using deep learning for these problems is attractive in that reasonably accurate models can be inferred from only raw data, eliminating the need to define the exact physical equations governing a problem. We expand on previous work using GANs to generate steady-state solutions to the two-dimensional heat equation. Using a basic conditional GAN (cGAN), we generate accurate solutions for rectangular domains conditioned on four edge boundary conditions (MAE < 0.5%). For finding steady-state solutions over arbitrary two-dimensional domains (not constrained to rectangles), we use a cGAN designed for image-to-image translation. We train this GAN on various types of geometric domains (circles, squares, triangles, shapes with one circular or rectangular hole), achieving accurate results on test data made up of geometries similar to those in training (MAE < 1%). For both of these GANs, we experiment with different loss function terms, showing that a term using the gradients of solution images significantly improves the basic cGAN but not the image-to-image GAN. Lastly, we show that the image-to-image GAN performs poorly when applied to two-dimensional geometries that vary in structure from the training data (MAE < 8% for shapes with multiple holes or differently shaped holes). This demonstrates the cGAN's lack of generalizability. While the cGAN is an accurate and computationally efficient method when trained and tested on similarly structured data, it is a much less reliable method when applied to data that is slightly different in structure from the training data.
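For reference, the kind of steady-state solution the cGAN is trained to reproduce can be computed classically with a simple Jacobi relaxation. This pure-Python sketch (grid size and boundary values are illustrative, not the study's training configuration) solves the Laplace equation on a square with the four edges held at fixed temperatures:

```python
def steady_heat(n, top, bottom, left, right, iters=2000):
    """Jacobi iteration for the steady-state 2-D heat (Laplace) equation on
    an n x n grid with the four edges held at fixed temperatures."""
    u = [[0.0] * n for _ in range(n)]
    for j in range(n):
        u[0][j], u[n - 1][j] = top, bottom
    for i in range(n):
        u[i][0], u[i][n - 1] = left, right  # corners end up with side values
    for _ in range(iters):
        new = [row[:] for row in u]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # Each interior point relaxes toward the mean of its neighbors.
                new[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                    + u[i][j - 1] + u[i][j + 1])
        u = new
    return u

# One hot edge, three cold edges; the interior relaxes to the harmonic solution.
u = steady_heat(16, top=1.0, bottom=0.0, left=0.0, right=0.0)
```

Iterative solvers like this are accurate but slow on large grids, which is the motivation for training a generative model to produce the solution in a single forward pass.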
Mosaics, The Best of Both Worlds: Analog devices with Digital Spiking Communication to build a Hybrid Neural Network Accelerator
Aimone, James B.; Bennett, Christopher; Cardwell, Suma G.; Dellana, Ryan; Xiao, Tianyao P.
Neuromorphic architectures have seen a resurgence of interest in the past decade owing to 100x-1000x efficiency gains over conventional von Neumann architectures. Digital neuromorphic chips like Intel's Loihi have shown efficiency gains compared to GPUs and CPUs and can be scaled to build larger systems. Analog neuromorphic architectures promise even further savings in energy efficiency, area, and latency than their digital counterparts. Neuromorphic analog and digital technologies provide both low-power and configurable acceleration of challenging artificial intelligence (AI) algorithms. We present a hybrid analog-digital neuromorphic architecture that amplifies the advantages of both high-density analog memory and spike-based digital communication while mitigating the limitations of each approach.
Hydrogen Risk Assessment Models (HyRAM) (Version 3.0 Technical Reference Manual)
Ehrhart, Brian D.; Hecht, Ethan S.; Groth, Katrina M.; Reynolds, John T.; Blaylock, Myra L.; Carrier, Erin E.
The HyRAM software toolkit provides a basis for conducting quantitative risk assessment and consequence modeling for hydrogen infrastructure and transportation systems. HyRAM is designed to facilitate the use of state-of-the-art science and engineering models to conduct robust, repeatable assessments of hydrogen safety, hazards, and risk. HyRAM includes generic probabilities for hydrogen equipment failures, probabilistic models for the impact of heat flux on humans and structures, and computationally and experimentally validated first-order models of hydrogen release and flame physics. HyRAM integrates deterministic and probabilistic models for quantifying accident scenarios, predicting physical effects, characterizing hydrogen hazards (thermal effects from jet fires, overpressure effects from deflagrations), and assessing impact on people and structures. HyRAM is developed at Sandia National Laboratories for the U.S. Department of Energy to increase access to technical data about hydrogen safety and to enable the use of that data to support development and revision of national and international codes and standards. HyRAM is research software in active development, and thus its models and data may change. This report will be updated at appropriate developmental intervals. This document provides a description of the methodology and models contained in HyRAM version 3.0. HyRAM 3.0 includes the new ability to model cryogenic hydrogen releases from liquid hydrogen systems, using a different property calculation method and different equations of state. Other changes include modifications to the ignition probability calculations and component leak frequency calculations, and the addition of default impulse data.
Tuning the critical Li intercalation concentrations for MoX2 bilayer phase transitions using classical and machine learning approaches
Spataru, Catalin D.; Witman, Matthew D.; Jones, Reese E.
Transition metal dichalcogenides (TMDs) such as MoX2 are known to undergo a structural phase transformation as well as a change in the electronic conductivity upon Li intercalation. These properties make them candidates for charge tunable ion-insertion materials that could be used in electro-chemical devices for neuromorphic computing applications. In this work we study the phase stability and electronic structure of Li-intercalated bilayer MoX2 with X=S, Se or Te. Using first-principles calculations in combination with classical and machine learning modeling approaches we find that the energy needed to stabilize the conductive phase decreases with increasing atomic mass of the chalcogen atom X. A similar decreasing trend is found in the threshold Li concentration where the structural phase transition takes place. While the electronic conductivity increases with increasing ion concentration at low concentrations, we do not observe a conductivity jump at the phase transition point.
Measuring and Extracting Activity from Time Series Data
Stracuzzi, David J.; Peterson, Matthew G.; Popoola, Gabriel A.
This report summarizes the results of an LDRD focused on developing and demonstrating statistically rigorous methods for analyzing and comparing complex activities from remote sensing data. Identifying activity from remote sensing data, particularly those that play out over time and span multiple locations, often requires extensive manual effort because of the variety of features that describe the activity and the required domain expertise. Our results suggest that there are some hidden challenges in extracting and representing activities in sensor data. In particular, we found that the variability in the underlying behaviors can be difficult to overcome statistically, and the report identifies several examples of the issue. We discuss key lessons learned in the context of the project, and finally conclude with recommendations on next steps and future work.
Language Independent Static Analysis (LISA)
Ghormley, Douglas P.; Reedy, Geoffrey; Landin, Kirk T.
Software is becoming increasingly important in nearly every aspect of global society and therefore in nearly every aspect of national security as well. While there have been major advancements in recent years in formally proving properties of program source code during development, such approaches are still in the minority among development teams, and the vast majority of code in this software explosion is produced without such properties. In these cases, the source code must be analyzed in order to establish whether the properties of interest hold. Because of the volume of software being produced, automated approaches to software analysis are necessary to meet the need. However, this software boom is not occurring in just one language. There is a wide range of languages of interest in national security spaces, including well-known languages such as C, C++, Python, Java, JavaScript, and many more. But recent years have produced a wide range of new languages, including Nim (2008), Go (2009), Rust (2010), Dart (2011), Kotlin (2011), Elixir (2011), Red (2011), Julia (2012), TypeScript (2012), Swift (2014), Hack (2014), Crystal (2014), Ballerina (2017), and more. Historically, automated software analyses have been implemented as tools that intermingle the analysis question at hand with target-language dependencies throughout their code, making reuse of components for different analysis questions or different target languages impractical. This project seeks to explore how mission-relevant static software analyses can be designed and constructed in a language-independent fashion, dramatically increasing the reusability of software analysis investments.
Sandia National Laboratories Early Career University Faculty Mentoring Program in International Safeguards
Solodov, Alexander; Peter-Stein, Natacha; Hartig, Kyle C.; Padilla, Eduardo A.; Di Fulvio, Angela; Shoman, Nathan
Recent years have seen a significantly increased focus on knowledge retention and mentoring of junior staff within the U.S. national laboratory complex. In order to involve the university community in this process as well, an international safeguards mentoring program was established by Sandia National Laboratories (SNL) for early career university faculty. After a successful experience during 2019, the program continued into 2020 with two new faculty members who were paired with SNL subject matter experts based on the topics of their individual projects: one working on advanced laboratory exercises for the physics, technology, and policy of nuclear safeguards and nonproliferation, and the other on machine learning applied to international safeguards and nonproliferation. The program has a two-pronged purpose: fostering the development of educational resources for international safeguards and exploring new research topics stemming from the exchange between mentor and mentee. Further, the program as a whole allows junior faculty members to establish and expand a professional network within international safeguards, and programs such as this build stronger connections between the academic and national laboratory communities. Because the junior faculty members now have new connections into the laboratory community and the potential for future collaborative projects with the laboratories, safeguards knowledge can spread far beyond the individually engaged students through this new and efficient avenue.
Characterizing Shielded Special Nuclear Material by Neutron Capture Gamma-Ray Multiplicity Counting
Brien, Michael C.; Hamel, Michael C.
We present a new neutron multiplicity counting analysis and measurement method for neutron-shielded fissile material using neutron-capture gamma rays. Neutrons absorbed in shielding produce characteristic gamma rays that preserve the otherwise lost neutron multiplicity signature. Neutron multiplicity counting provides estimates of fission parameters, such as neutron leakage multiplication, spontaneous fissioner (e.g., Pu-240) mass, and (α,n) ratio. Standard neutron multiplicity counting can incorporate the new neutron-capture gamma-ray multiplicity counting technique to characterize previously degenerate or intractable source configurations by maximizing the multiplicity signature. The new method decouples neutron source-detector interferences, such as reflection and thermalization time in the detector, that could improve measurements of the mean neutron lifetime. We also develop a detector prototype for the multiplicity counting of neutron-capture gamma rays and present detector design considerations, such as detection material and shielding, to optimize the detection of the 2.2 MeV hydrogen capture gamma ray. We simulate the prototype neutron-capture gamma-ray multiplicity counter against the BeRP ball in polyethylene shells to inform future measurements.