Publications

Results 301–325 of 9,998

Mode-Selective Vibrational Energy Transfer Dynamics in 1,3,5-Trinitroperhydro-1,3,5-triazine (RDX) Thin Films

Journal of Physical Chemistry A

Cole-Filipiak, Neil C.; Knepper, Robert A.; Wood, M.A.; Ramasesha, Krupa

The coupling of inter- and intramolecular vibrations plays a critical role in initiating chemistry during the shock-to-detonation transition in energetic materials. Herein, we report on the subpicosecond to subnanosecond vibrational energy transfer (VET) dynamics of the solid energetic material 1,3,5-trinitroperhydro-1,3,5-triazine (RDX) by using broadband, ultrafast infrared transient absorption spectroscopy. Experiments reveal VET occurring on three distinct time scales: subpicosecond, 5 ps, and 200 ps. The ultrafast appearance of signal at all probed modes in the mid-infrared suggests strong anharmonic coupling of all vibrations in the solid, whereas the long-lived evolution demonstrates that VET is incomplete, and thus thermal equilibrium is not attained, even on the 100 ps time scale. Density functional theory and classical molecular dynamics simulations provide valuable insights into the experimental observations, revealing compression-insensitive time scales for the initial VET dynamics of high-frequency vibrations and drastically extended relaxation times for low-frequency phonon modes under lattice compression. Mode selectivity of the longest dynamics suggests coupling of the N-N and axial NO2 stretching modes with the long-lived, excited phonon bath.

A FETI approach to domain decomposition for meshfree discretizations of nonlocal problems

Computer Methods in Applied Mechanics and Engineering

Xu, Xiao; Glusa, Christian; D'Elia, Marta; Foster, John E.

We propose a domain decomposition method for the efficient simulation of nonlocal problems. Our approach is based on a multi-domain formulation of a nonlocal diffusion problem where the subdomains share “nonlocal” interfaces of the size of the nonlocal horizon. This system of nonlocal equations is first rewritten in terms of minimization of a nonlocal energy, then discretized with a meshfree approximation and finally solved via a Lagrange multiplier approach in a way that resembles the finite element tearing and interconnecting (FETI) method. Specifically, we propose a distributed projected gradient algorithm for the solution of the Lagrange multiplier system, whose unknowns determine the nonlocal interface conditions between subdomains. Several two-dimensional numerical tests on problems as large as 191 million unknowns illustrate the strong and the weak scalability of our algorithm, which outperforms the standard approach to the distributed numerical solution of the problem. Finally, this work is the first rigorous numerical study in a two-dimensional multi-domain setting for nonlocal operators with finite horizon and, as such, it is a fundamental step towards increasing the use of nonlocal models in large scale simulations.
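The distributed Lagrange-multiplier solver is specific to the paper, but the core iteration it names, projected gradient descent, is standard. A minimal single-process sketch, assuming a quadratic objective and a nonnegativity constraint as the projection (both illustrative choices, not the paper's actual interface system):

```python
import numpy as np

def projected_gradient(A, b, lam0, step=0.1, iters=500):
    """Minimize 0.5*lam^T A lam - b^T lam subject to lam >= 0.

    Each iteration takes a gradient step, then projects back onto
    the feasible set (here, the nonnegative orthant).
    """
    lam = np.maximum(np.asarray(lam0, dtype=float), 0.0)
    for _ in range(iters):
        grad = A @ lam - b                         # gradient of the quadratic
        lam = np.maximum(lam - step * grad, 0.0)   # step, then project
    return lam

# Toy system: unconstrained minimizer is (1, -1); the projection clips it to (1, 0).
A = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.array([2.0, -2.0])
lam_star = projected_gradient(A, b, np.zeros(2))
```

In the paper's setting the unknowns `lam` play the role of the Lagrange multipliers on the nonlocal interfaces, and the gradient evaluation is distributed across subdomains.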

Revealing quantum effects in highly conductive δ-layer systems

Communications Physics

Mamaluy, Denis; Mendez Granado, Juan P.; Gao, Xujiao; Misra, Shashank

Thin, high-density layers of dopants in semiconductors, known as δ-layer systems, have recently attracted attention as a platform for exploring future quantum and classical computing when patterned in plane with atomic precision. However, many aspects of the conductive properties of these systems are still unknown. Here we present an open-system quantum transport treatment to investigate the local density of electron states and the conductive properties of δ-layer systems. A successful application of this treatment to a phosphorus δ-layer in silicon both explains the origin of recently observed shallow sub-bands and reproduces the sheet resistance values measured by different experimental groups. Further analysis reveals two main quantum-mechanical effects: (1) the existence of spatially distinct layers of free electrons with different average energies; and (2) a significant dependence of sheet resistance on the δ-layer thickness for a fixed sheet charge density.

GDSA Framework Development and Process Model Integration FY2021

Mariner, Paul; Berg, Timothy M.; Debusschere, Bert; Eckert, Aubrey; Harvey, Jacob A.; Laforce, Tara C.; Leone, Rosemary C.; Mills, Melissa M.; Nole, Michael A.; Park, Heeho D.; Perry, F.V.; Seidl, D.T.; Swiler, Laura P.; Chang, Kyung W.

The Spent Fuel and Waste Science and Technology (SFWST) Campaign of the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE), Office of Spent Fuel & Waste Disposition (SFWD) is conducting research and development (R&D) on geologic disposal of spent nuclear fuel (SNF) and high-level nuclear waste (HLW). A high priority for SFWST disposal R&D is disposal system modeling (DOE 2012, Table 6; Sevougian et al. 2019). The SFWST Geologic Disposal Safety Assessment (GDSA) work package is charged with developing a disposal system modeling and analysis capability for evaluating generic disposal system performance for nuclear waste in geologic media.

Comprehensive Material Characterization and Simultaneous Model Calibration for Improved Computational Simulation Credibility

Seidl, D.T.; Jones, E.M.C.; Lester, Brian T.

Computational simulation is increasingly relied upon for high-consequence engineering decisions, and a foundational element of solid mechanics simulations is a credible material model. Our ultimate vision is to interlace material characterization and model calibration in a real-time feedback loop, where the current model calibration results will drive the experiment to load regimes that add the most useful information to reduce parameter uncertainty. The current work investigated one key step of this Interlaced Characterization and Calibration (ICC) paradigm, using a finite load-path tree to incorporate history/path dependency of nonlinear material models into a network of surrogate models that replace computationally expensive finite-element analyses. Our reference simulation was an elastoplastic material point subject to biaxial deformation with a Hill anisotropic yield criterion. Training data were generated using either a space-filling or adaptive sampling method, and surrogates were built using either Gaussian process or polynomial chaos expansion methods. Surrogate error was evaluated to be on the order of 10⁻⁵ and 10⁻³ percent for the space-filling and adaptive sampling training data, respectively. Direct Bayesian inference was performed with the surrogate network and with the reference material point simulator, and results agreed to within 3 significant figures for the mean parameter values, with a reduction in computational cost of over 5 orders of magnitude. These results bought down risk regarding the surrogate network and facilitated a successful FY22-24 full LDRD proposal to research and develop the complete ICC paradigm.
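The surrogate network and calibration machinery in this work are not reproduced here, but the basic move, replacing an expensive simulator with a Gaussian process trained on a space-filling design, can be sketched. Everything below (the stand-in "simulator", the kernel length scale, the grid design) is an illustrative assumption, not the paper's actual setup:

```python
import numpy as np

# Hypothetical cheap stand-in for an expensive material-point simulation:
# a smooth scalar response as a function of one model parameter.
def simulator(theta):
    return np.sin(3.0 * theta) + 0.5 * theta

def rbf(X1, X2, ls=0.5):
    # Squared-exponential kernel between two 2-D arrays of points.
    d2 = (X1[:, None, :] - X2[None, :, :]) ** 2
    return np.exp(-0.5 * d2.sum(axis=-1) / ls**2)

def gp_predict(X_train, y_train, X_test, ls=0.5, jitter=1e-8):
    # Standard GP regression mean: K(X*, X) K(X, X)^{-1} y.
    K = rbf(X_train, X_train, ls) + jitter * np.eye(len(X_train))
    return rbf(X_test, X_train, ls) @ np.linalg.solve(K, y_train)

# Space-filling training design (a uniform grid here for simplicity).
theta_train = np.linspace(0.0, 2.0, 20).reshape(-1, 1)
y_train = simulator(theta_train).ravel()

# Surrogate predictions at unseen parameter values.
theta_test = np.array([[0.33], [1.27]])
y_surrogate = gp_predict(theta_train, y_train, theta_test)
```

In a calibration loop like the one described above, a Bayesian sampler would query `gp_predict` thousands of times instead of re-running the finite-element analysis, which is where the multi-order-of-magnitude cost reduction comes from.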

Peridynamic Model for Single-Layer Graphene Obtained from Coarse Grained Bond Forces

D'Elia, Marta; Silling, Stewart; You, Huaiqian; Yu, Yue; Fermen-Coker, Muge

An ordinary state-based peridynamic material model is proposed for single sheet graphene. The model is calibrated using coarse grained molecular dynamics simulations. The coarse graining method allows the dependence of bond force on bond length to be determined, including the horizon. The peridynamic model allows the horizon to be rescaled, providing a multiscale capability and allowing for substantial reductions in computational cost compared with molecular dynamics. The calibrated peridynamic model is compared to experimental data on the deflection and perforation of a graphene monolayer by an atomic force microscope probe.

Sensitivity Analysis Comparisons on Geologic Case Studies: An International Collaboration

Swiler, Laura P.; Becker, Dirk-Alexander; Brooks, Dusty M.; Govaerts, Joan; Koskinen, Lasse; Plischke, Elmar; Rohlig, Klaus-Jurgen; Saveleva, Elena; Spiessl, Sabine M.; Stein, Emily; Svitelman, Valentina

Over the past four years, an informal working group has developed to investigate existing sensitivity analysis methods, examine new methods, and identify best practices. The focus is on the use of sensitivity analysis in case studies involving geologic disposal of spent nuclear fuel or nuclear waste. To examine ideas and have applicable test cases for comparison purposes, we have developed multiple case studies. Four of these case studies are presented in this report: the GRS clay case, the SNL shale case, the Dessel case, and the IBRAE groundwater case. We present the different sensitivity analysis methods investigated by various groups, the results obtained by different groups and different implementations, and summarize our findings.

Multimode Metastructures: Novel Hybrid 3D Lattice Topologies

Boyce, Brad L.; Garland, Anthony; White, Benjamin C.; Jared, Bradley H.; Conway, Kaitlynn; Adstedt, Katerina; Dingreville, Remi; Robbins, Joshua; Walsh, Timothy; Alvis, Timothy; Branch, Brittany A.; Kaehr, Bryan J.; Kunka, Cody; Leathe, Nicholas S.

With the rapid proliferation of additive manufacturing and 3D printing technologies, architected cellular solids including truss-like 3D lattice topologies offer the opportunity to program the effective material response through topological design at the mesoscale. The present report summarizes several of the key findings from a 3-year Laboratory Directed Research and Development Program. The program set out to explore novel lattice topologies that can be designed to control, redirect, or dissipate energy from one or multiple insult environments relevant to Sandia missions, including crush, shock/impact, vibration, thermal, etc. In the first 4 sections, we document four novel lattice topologies stemming from this study: Coulombic lattices, multi-morphology lattices, interpenetrating lattices, and pore-modified gyroid cellular solids, each with unique properties that had not been achieved by existing cellular/lattice metamaterials. The fifth section explores how unintentional lattice imperfections stemming from the manufacturing process, primarily surface roughness in the case of laser powder bed fusion, cause stochastic response, but also how in some cases, such as elastic response, the stochastic behavior is homogenized through the adoption of lattices. In the sixth section we explore a novel neural network screening process that allows such stochastic variability to be predicted. In the last three sections, we explore considerations of computational design of lattices. Specifically, section 7 uses a novel generative optimization scheme to design Pareto-optimal lattices for multi-objective environments. In section 8, we use computational design to optimize a metallic lattice structure to absorb impact energy for a 1000 ft/s impact. And in section 9, we develop a modified micromorphic continuum model to solve wave propagation problems in lattices efficiently.

Science and Engineering of Cybersecurity by Uncertainty quantification and Rigorous Experimentation (SECURE) (Final Report)

Pinar, Ali P.; Tarman, Thomas D.; Swiler, Laura P.; Gearhart, Jared L.; Hart, Derek; Vugrin, Eric; Cruz, Gerardo J.; Arguello, Bryan; Geraci, Gianluca; Debusschere, Bert; Hanson, Seth T.; Outkin, Alexander V.; Thorpe, Jamie E.; Hart, William E.; Sahakian, Meghan A.; Gabert, Kasimir G.; Glatter, Casey; Johnson, Emma S.; Punla-Green, She'Ifa'

This report summarizes the activities performed as part of the Science and Engineering of Cybersecurity by Uncertainty quantification and Rigorous Experimentation (SECURE) Grand Challenge LDRD project. We provide an overview of the research done in this project, including work on cyber emulation, uncertainty quantification, and optimization. We present examples of integrated analyses performed on two case studies: a network scanning/detection study and a malware command and control study. We highlight the importance of experimental workflows and list references of papers and presentations developed under this project. We outline lessons learned and suggestions for future work.

Propagation of a Stress Pulse in a Heterogeneous Elastic Bar

Journal of Peridynamics and Nonlocal Modeling

Silling, Stewart

The propagation of a wave pulse due to low-speed impact on a one-dimensional, heterogeneous bar is studied. Due to the dispersive character of the medium, the pulse attenuates as it propagates. This attenuation is studied over propagation distances that are much longer than the size of the microstructure. A homogenized peridynamic material model can be calibrated to reproduce the attenuation and spreading of the wave. The calibration consists of matching the dispersion curve for the heterogeneous material near the limit of long wavelengths. It is demonstrated that the peridynamic method reproduces the attenuation of wave pulses predicted by an exact microstructural model over large propagation distances.

Integrated System and Application Continuous Performance Monitoring and Analysis Capability

Aaziz, Omar R.; Allan, Benjamin A.; Brandt, James M.; Cook, Jeanine; Devine, Karen; Elliott, James E.; Gentile, Ann C.; Hammond, Simon; Kelley, Brian M.; Lopatina, Lena; Moore, Stan G.; Olivier, Stephen L.; Foulk, James W.; Poliakoff, David; Pawlowski, Roger; Regier, Phillip; Schmitz, Mark E.; Schwaller, Benjamin; Surjadidjaja, Vanessa; Swan, Matthew S.; Tucker, Nick; Tucker, Thomas; Vaughan, Courtenay T.; Walton, Sara P.

Scientific applications run on high-performance computing (HPC) systems are critical for many national security missions within Sandia and the NNSA complex. However, these applications often face performance degradation and even failures that are challenging to diagnose. To provide unprecedented insight into these issues, the HPC Development, HPC Systems, Computational Science, and Plasma Theory & Simulation departments at Sandia crafted and completed their FY21 ASC Level 2 milestone entitled "Integrated System and Application Continuous Performance Monitoring and Analysis Capability." The milestone created a novel integrated HPC system and application monitoring and analysis capability by extending Sandia's Kokkos application portability framework, Lightweight Distributed Metric Service (LDMS) monitoring tool, and scalable storage, analysis, and visualization pipeline. The extensions to Kokkos and LDMS enable collection and storage of application data during run time, as it is generated, with negligible overhead. This data is combined with HPC system data within the extended analysis pipeline to present relevant visualizations of derived system and application metrics that can be viewed at run time or post run. This new capability was evaluated using several week-long, 290-node runs of Sandia's ElectroMagnetic Plasma In Realistic Environments (EMPIRE) modeling and design tool and resulted in 1 TB of application data and 50 TB of system data. EMPIRE developers remarked that this capability was incredibly helpful for quickly assessing application health and performance alongside system state. In short, this milestone work built the foundation for an expansive HPC system and application data collection, storage, analysis, visualization, and feedback framework that will increase the total scientific output of Sandia's HPC users.

Mapping Stochastic Devices to Probabilistic Algorithms

Aimone, James B.; Safonov, Alexander

Probabilistic and Bayesian neural networks have long been proposed as a method to incorporate uncertainty about the world (both in training data and operation) into artificial intelligence applications. One approach to making a neural network probabilistic is to leverage a Monte Carlo sampling approach that samples a trained network while incorporating noise. Such sampling approaches for neural networks have not been extensively studied due to the prohibitive requirement of many computationally expensive samples. While the development of future microelectronics platforms that make this sampling more efficient is an attractive option, it has not been immediately clear how to sample a neural network and what the quality of random number generation should be. This research aimed to start addressing these two fundamental questions by examining how basic “off the shelf” neural networks can be sampled through a few different mechanisms (including synapse “dropout” and neuron “dropout”) and how these sampling approaches can be evaluated, both in terms of algorithm effectiveness and the required quality of random numbers.
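One of the named mechanisms, sampling a trained network by randomly dropping synapses at inference time, can be sketched with plain numpy. The weights below are random placeholders standing in for a trained model; the keep probability and network size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder "trained" weights for a tiny two-layer network.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sample_forward(x, keep=0.8):
    # Synapse "dropout": zero individual weights at random on each pass,
    # rescaling survivors so the expected pre-activation is unchanged.
    m1 = rng.random(W1.shape) < keep
    h = np.maximum(x @ (W1 * m1) / keep + b1, 0.0)   # ReLU hidden layer
    m2 = rng.random(W2.shape) < keep
    return (h @ (W2 * m2) / keep + b2).item()

x = rng.normal(size=(1, 4))
samples = np.array([sample_forward(x) for _ in range(500)])
mean, spread = samples.mean(), samples.std()  # Monte Carlo estimate + uncertainty
```

Each forward pass draws a fresh mask, so repeated evaluation of the same input yields a distribution of outputs whose spread serves as a simple uncertainty estimate; the quality-of-randomness question in the abstract concerns how good the generator behind those masks needs to be.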

Efficient flexible characterization of quantum processors with nested error models

New Journal of Physics

Nielsen, Erik N.; Rudinger, Kenneth M.; Proctor, Timothy J.; Young, Kevin; Blume-Kohout, Robin

We present a simple and powerful technique for finding a good error model for a quantum processor. The technique iteratively tests a nested sequence of models against data obtained from the processor, and keeps track of the best-fit model and its wildcard error (a metric of the amount of unmodeled error) at each step. Each best-fit model, along with a quantification of its unmodeled error, constitutes a characterization of the processor. We explain how quantum processor models can be compared with experimental data and to each other. We demonstrate the technique by using it to characterize a simulated noisy two-qubit processor.
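The wildcard-error metric is specific to quantum processor characterization, but the outer loop, testing a nested sequence of models against data and tracking the best fit, has a familiar generic analogue. A sketch using polynomial models of increasing degree and AIC as a stand-in scoring rule (the synthetic data and polynomial models are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
data = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)  # synthetic observations

best_degree, best_score = None, np.inf
for degree in range(5):  # nested sequence: each model contains the previous one
    coeffs = np.polyfit(x, data, degree)
    resid = data - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    score = n * np.log(np.mean(resid**2)) + 2 * k  # AIC: misfit + complexity penalty
    if score < best_score:
        best_degree, best_score = degree, score
```

The paper's technique differs in both the models (parameterized quantum processor models) and the metric (wildcard error rather than an information criterion), but the iterate-fit-compare structure is the same.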
