Publications


Sierra/SolidMechanics 4.46 Verification Tests Manual

Merewether, Mark T.; Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel J.; Le, San L.; Littlewood, David J.; Mosby, Matthew D.; Pierson, Kendall H.; Porter, V.L.; Shelton, Timothy S.; Thomas, Jesse D.; Tupek, Michael R.; Veilleux, Michael V.; Xavier, Patrick G.; Clutz, Christopher J.R.; Manktelow, Kevin M.

Presented in this document is a small portion of the tests that exist in the Sierra/SolidMechanics (Sierra/SM) verification test suite. Most of these tests are run nightly with the Sierra/SM code suite, and the results of each test are checked against the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra/SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified, or it can be referenced as a compilation of example problems. Additional example problems are provided in the Sierra/SM Example Problems Manual. Note that many other verification tests exist in the Sierra/SM test suite but have not yet been included in this manual.

Effects of charge noise on a pulse-gated singlet-triplet S-T₋ qubit

Physical Review B

Qi, Zhenyi; Wu, X.; Ward, Daniel R.; Prance, J.R.; Kim, Dohun; Laros, James H.; Mohr, R.T.; Shi, Zhan; Savage, D.E.; Lagally, M.G.; Eriksson, M.A.; Coppersmith, S.N.; Friesen, Mark; Vavilov, M.G.

We study the dynamics of a pulse-gated semiconductor double-quantum-dot qubit. In our experiments, the qubit coherence times are relatively long, but the visibility of the quantum oscillations is low. We show that these observations are consistent with a theory that incorporates decoherence arising from charge noise that gives rise to detuning fluctuations of the double dot. Because effects from charge noise are largest near the singlet-triplet avoided level crossing, the visibility of the oscillations is low when the singlet-triplet avoided level crossing occurs in the vicinity of the charge degeneracy point crossed during the manipulation, but there is only modest dephasing at the large detuning value at which the quantum phase accumulates. This theory agrees well with experimental data and predicts that the visibility can be increased greatly by appropriate tuning of the interdot tunneling rate.

Sierra/SolidMechanics 4.46 Goodyear Specific

Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel J.; Le, San L.; Littlewood, David J.; Merewether, Mark T.; Mosby, Matthew D.; Pierson, Kendall H.; Porter, V.L.; Shelton, Timothy S.; Thomas, Jesse D.; Tupek, Michael R.; Veilleux, Michael V.; Xavier, Patrick G.

This document covers Sierra/SolidMechanics capabilities specific to Goodyear use cases. Some information may be duplicated directly from the Sierra/SolidMechanics User’s Guide but is reproduced here to provide context for Goodyear-specific options.

Sierra/SolidMechanics 4.46 Example Problems Manual

Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel J.; Le, San L.; Littlewood, David J.; Merewether, Mark T.; Mosby, Matthew D.; Pierson, Kendall H.; Porter, V.L.; Shelton, Timothy S.; Thomas, Jesse D.; Tupek, Michael R.; Veilleux, Michael V.

Presented in this document are tests that exist in the Sierra Solid Mechanics example problem suite. The purpose of these examples is to showcase common and advanced code capabilities. Note that many other regression and verification tests exist in the Sierra/SM test suite that have not been included in this manual.

Sierra/SolidMechanics 4.46 User's Guide

Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel J.; Le, San L.; Littlewood, David J.; Merewether, Mark T.; Mosby, Matthew D.; Pierson, Kendall H.; Porter, V.L.; Shelton, Timothy S.; Thomas, Jesse D.; Tupek, Michael R.; Veilleux, Michael V.

Sierra/SolidMechanics (Sierra/SM) is a Lagrangian, three-dimensional code for finite element analysis of solids and structures. It provides capabilities for explicit dynamic, implicit quasistatic and dynamic analyses. The explicit dynamics capabilities allow for the efficient and robust solution of models with extensive contact subjected to large, suddenly applied loads. For implicit problems, Sierra/SM uses a multi-level iterative solver, which enables it to effectively solve problems with large deformations, nonlinear material behavior, and contact. Sierra/SM has a versatile library of continuum and structural elements, and a large library of material models. The code is written for parallel computing environments enabling scalable solutions of extremely large problems for both implicit and explicit analyses. It is built on the SIERRA Framework, which facilitates coupling with other SIERRA mechanics codes. This document describes the functionality and input syntax for Sierra/SM.

A Perspective on the Interaction Between the NCSD and ANSI/ANS-8 Standards

Miller, John A.

The intent of this paper is to illuminate the reasons behind the culture of standards involvement by the criticality safety community in the NCSD, and specifically to highlight those NCSD activities that build and support this culture. Many NCSD members are currently active in ANSI/ANS-8 and other standards and have been for the last half century. This paper was inspired by a request from ANS’s Professional Divisions Committee concerning how the professional divisions could “increase support to Standards development as Subject Matter Experts.” The healthy level of involvement (i.e., culture) by the NCSD membership in standards was noted. A history of the ANSI/ANS-8 series standards is provided, with roots going back to 1955. The need became apparent during a cluster of nuclear criticality accidents that occurred between 1958 and 1962. The first NCS-related standard was the American Standard N6.1-1964, the parent of ANSI/ANS-8.1, which was prepared in 1958 and adopted in 1964. Thus, the involvement in standards by the NCS community goes back more than 50 years. The NCSD continues to help foster a culture of standards use and development. However, the support provided by the NCSD is frequently not recognized, and standards activities are often viewed as separate from the NCSD. This paper highlights the interaction between the NCSD and the ANSI/ANS-8 standards activities, as well as the benefit of this support.

A Perspective on the Interaction Between the NCSD and ANSI/ANS-8 Standards [Slides]

Miller, John A.

The intent of this paper is to illuminate the reasons behind the culture of standards involvement by the criticality safety community in the NCSD, and specifically to highlight those NCSD activities that build and support this culture. Many NCSD members are currently active in ANSI/ANS-8 and other standards and have been for the last half century. This paper was inspired by a request from ANS’s Professional Divisions Committee concerning how the professional divisions could “increase support to Standards development as Subject Matter Experts.” The healthy level of involvement (i.e., culture) by the NCSD membership in standards was noted. A history of the ANSI/ANS-8 series standards is provided, with roots going back to 1955. The need became apparent during a cluster of nuclear criticality accidents that occurred between 1958 and 1962. The first NCS-related standard was the American Standard N6.1-1964, the parent of ANSI/ANS-8.1, which was prepared in 1958 and adopted in 1964. Thus, the involvement in standards by the NCS community goes back more than 50 years. The NCSD continues to help foster a culture of standards use and development. However, the support provided by the NCSD is frequently not recognized, and standards activities are often viewed as separate from the NCSD. This paper highlights the interaction between the NCSD and the ANSI/ANS-8 standards activities, as well as the benefit of this support.

Spatial Heterogeneities and Onset of Passivation Breakdown at Lithium Anode Interfaces

Journal of Physical Chemistry. C

Leung, Kevin L.; Jungjohann, Katherine L.

Effective passivation of lithium metal surfaces, and prevention of battery-shorting lithium dendrite growth, are critical for implementing lithium metal anodes in batteries with increased power densities. Nanoscale surface heterogeneities can be “hot spots” where anode passivation breaks down. Motivated by the observation of lithium dendrites in pores and grain boundaries in all-solid batteries, we examine lithium metal surfaces covered with Li2O and/or LiF thin films containing grain boundaries. Electronic structure calculations show that, at a computed equilibrium overpotential of >0.25 V, Li2O grain boundaries with sufficiently large pores can accommodate Li0 atoms, which aid e– leakage and passivation breakdown. Strain often accompanies Li insertion; applying an ~1.7% strain already lowers the computed overpotential to 0.1 V. Lithium metal nanostructures as thin as 12 Å are thermodynamically favored inside cracks in Li2O films, becoming “incipient lithium filaments.” LiF films are more resistant to lithium metal growth. Finally, the models used herein should in turn inform passivation strategies in all-solid-state batteries.

Intrusion Detection with Unsupervised Heterogeneous Ensembles Using Cluster-Based Normalization

Proceedings - 2017 IEEE 24th International Conference on Web Services, ICWS 2017

Ruoti, Scott; Heidbrink, Scott H.; Oneill, Mark; Gustafson, Eric D.; Choe, Yung R.

Outlier detection has been shown to be a promising machine learning technique for a diverse array of fields and problem areas. However, traditional, supervised outlier detection is not well suited for problems such as network intrusion detection, where properly labelled data is scarce. This has created a focus on extending these approaches to be unsupervised, removing the need for explicit labels, but at the cost of poorer performance compared to their supervised counterparts. Recent work has explored ways of making up for this, such as creating ensembles of diverse models, or even diverse learning algorithms, to jointly classify data. While using unsupervised, heterogeneous ensembles of learning algorithms has been proposed as a viable next step for research, the implications of how these ensembles are built and used have not been explored.
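The ensemble idea the abstract describes can be sketched in a few lines: run several dissimilar unsupervised detectors, normalize each detector's anomaly scores onto a common scale, and combine them. The detectors and the z-score normalization below are toy stand-ins (the paper's scheme is cluster-based), purely to illustrate the combination step.

```python
# Toy heterogeneous ensemble for unsupervised outlier detection (numpy only).
# Two dissimilar detectors score every point; scores are z-normalized so they
# are comparable, then summed. The detectors are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
X[:5] += 6.0                        # plant 5 obvious outliers at indices 0-4

def dist_to_mean(X):
    # Detector 1: global distance from the centroid.
    return np.linalg.norm(X - X.mean(axis=0), axis=1)

def kth_nn_dist(X, k=5):
    # Detector 2: distance to the k-th nearest neighbor (local density).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return np.sort(D, axis=1)[:, k]   # column 0 is the self-distance

def zscore(s):
    # Normalization step: put heterogeneous score scales on a common footing.
    return (s - s.mean()) / s.std()

ensemble = zscore(dist_to_mean(X)) + zscore(kth_nn_dist(X))
flagged = set(np.argsort(ensemble)[-5:])   # 5 highest combined scores
print(flagged == {0, 1, 2, 3, 4})
```

Without the normalization step, whichever detector happens to produce numerically larger scores would dominate the sum, which is exactly the pitfall score-normalization schemes address.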

STAR-CCM+ (CFD) Calculations and Validation L3:VVI.H2L.P15.02

Gilkey, Lindsay N.

This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code STAR-CCM+ were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model, and results of the STAR UQ were communicated to CTF. Results from STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β and compared to results obtained using experimental data points. This demonstrated that CTF’s β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling is documented in this milestone and in the closely linked L3:VVI.H2L.P15.01 milestone report.

CTF (Subchannel) Calculations and Validation L3:VVI.H2L.P15.01

Gordon, Natalie G.

The goal of the Verification and Validation Implementation (VVI) High-to-Low (Hi2Lo) process is to use a validated model in a high-resolution code to generate synthetic data for improvement of the same model in a lower-resolution code. This process is useful in circumstances where experimental data do not exist or are not sufficient in quantity or resolution. Data from the high-fidelity code are treated as calibration data (with appropriate uncertainties and error bounds) that can be used to train parameters affecting solution accuracy in the lower-fidelity code model, thereby reducing uncertainty. This milestone presents a demonstration of the Hi2Lo process derived in the VVI focus area. The majority of the work performed herein describes the steps of the low-fidelity code used in the process, with references to the work detailed in the companion high-fidelity code milestone (Reference 1). The CASL low-fidelity code used to perform this work was Cobra Thermal Fluid (CTF), and the high-fidelity code was STAR-CCM+ (STAR). The master branch version of CTF (pulled May 5, 2017 – Reference 2) was used for all CTF analyses performed as part of this milestone. The statistical and VVUQ components of the Hi2Lo framework were performed using Dakota version 6.6 (release date May 15, 2017 – Reference 3). Experimental data from Westinghouse Electric Company (WEC – Reference 4) were used throughout the demonstrated process for comparison with the high-fidelity STAR results. A CTF parameter called Beta was chosen as the calibration parameter for this work. By default, Beta is defined as a constant mixing coefficient in CTF and is essentially a tuning parameter for mixing between subchannels. Since CTF does not have turbulence models like STAR, Beta is the parameter that performs the function most similar to the turbulence models in STAR. The purpose of the work performed in this milestone is to tune Beta to an optimal value that brings the CTF results closer to those measured in the WEC experiments.
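The Hi2Lo calibration loop described above reduces, at its core, to tuning a scalar parameter so a low-fidelity model best matches higher-fidelity data. A minimal sketch, assuming a made-up stand-in model (this is not CTF, and the synthetic "data" merely plays the role of the STAR-CCM+ results):

```python
# Hypothetical sketch of calibrating a scalar mixing coefficient "beta" in a
# low-fidelity model against high-fidelity (here synthetic) data, by least
# squares. low_fi_model is an invented stand-in, not CTF.
import numpy as np
from scipy.optimize import minimize_scalar

z = np.linspace(0.0, 1.0, 50)             # axial positions along the bundle

def low_fi_model(beta, z):
    # Toy subchannel profile: more mixing (larger beta) flattens it faster.
    return 1.0 + np.exp(-beta * 10.0 * z)

beta_true = 0.05                           # value hidden in the synthetic data
rng = np.random.default_rng(1)
data = low_fi_model(beta_true, z) + 0.002 * rng.standard_normal(z.size)

def misfit(beta):
    # Sum-of-squares discrepancy between low-fi prediction and hi-fi data.
    return np.sum((low_fi_model(beta, z) - data) ** 2)

res = minimize_scalar(misfit, bounds=(0.0, 0.5), method="bounded")
print(f"calibrated beta = {res.x:.4f}")    # recovers a value near 0.05
```

A full Hi2Lo workflow (as with Dakota in the milestone) would additionally propagate the high-fidelity uncertainties into error bounds on the calibrated parameter rather than returning a single point estimate.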

Uncertainty characterization of particle location from refocused plenoptic images

Optics Express

Munz, Elise D.; Guildenbecher, Daniel R.; Thurow, Brian S.

Plenoptic imaging is a 3D imaging technique that has been applied for quantification of 3D particle locations and sizes. This work experimentally evaluates the accuracy and precision of such measurements by investigating a static particle field translated to known displacements. Measured 3D displacement values are determined from sharpness metrics applied to volumetric representations of the particle field created using refocused plenoptic images, corrected using a recently developed calibration technique. Comparison of measured and known displacements for many thousands of particles allows for evaluation of measurement uncertainty. Mean displacement error, as a measure of accuracy, is shown to agree with predicted spatial resolution over the entire measurement domain, indicating robustness of the calibration methods. On the other hand, variation in the error, as a measure of precision, fluctuates as a function of particle depth in the optical direction. Error shows the smallest variation within the predicted depth of field of the plenoptic camera, with a gradual increase outside this range. The quantitative uncertainty values provided here can guide future measurement optimization and will serve as useful metrics for design of improved processing algorithms.
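The depth-from-sharpness step the abstract relies on can be illustrated with a toy focal stack: scan refocus planes, score each plane with a sharpness metric, and take the peak. The variance-of-Laplacian metric and the crude defocus model below are illustrative assumptions, not the authors' pipeline.

```python
# Toy depth localization from a refocused image stack (numpy/scipy sketch).
# A synthetic "refocus" blurs a point particle more the farther the refocus
# plane is from the particle's true depth; the sharpness metric (variance of
# the Laplacian) then peaks at the true focal plane.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

true_depth = 7                                   # index of the true focal plane
img = np.zeros((64, 64))
img[32, 32] = 1.0                                # point particle

depths = np.arange(15)

def refocus(d):
    # Crude defocus model: blur grows with distance from the true plane.
    return gaussian_filter(img, sigma=0.5 + 0.8 * abs(d - true_depth))

sharpness = [laplace(refocus(d)).var() for d in depths]
estimated_depth = int(np.argmax(sharpness))
print(estimated_depth)                           # -> 7
```

The paper's uncertainty analysis is essentially about how reliably this argmax locates the particle as a function of depth, which degrades outside the camera's depth of field.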

High performance terahertz metasurface quantum-cascade VECSEL with an intra-cryostat cavity

Applied Physics Letters

Xu, Luyao; Curwen, Christopher A.; Reno, J.L.; Williams, Benjamin S.

A terahertz quantum-cascade (QC) vertical-external-cavity surface-emitting-laser (VECSEL) is demonstrated with over 5 mW power in continuous-wave and single-mode operation above 77 K, in combination with a near-Gaussian beam pattern with a full-width half-max divergence as narrow as ∼5° × 5°, with no evidence of thermal lensing. This is realized by creating an intra-cryostat VECSEL cavity to reduce the cavity loss and designing an active focusing metasurface reflector with low power dissipation for efficient heat removal. Also, the intra-cryostat configuration allows the evaluation of QC-VECSEL operation vs. temperature, showing a maximum pulsed mode operating temperature of 129 K. While the threshold current density in the QC-VECSEL is higher compared to that in a conventional edge-emitting metal-metal waveguide QC-laser, the beam quality, slope efficiency, maximum power, and thermal resistance are all significantly improved.

Effective g factor of low-density two-dimensional holes in a Ge quantum well

Applied Physics Letters

Lu, Tzu-Ming L.; Harris, Charles T.; Huang, S.H.; Chuang, Y.; Li, J.Y.; Liu, C.W.

We report measurements of the effective g factor of low-density two-dimensional holes in a Ge quantum well. Using the temperature dependence of the Shubnikov-de Haas oscillations, we extract the effective g factor in a magnetic field perpendicular to the sample surface. Very large values of the effective g factor, ranging from ∼13 to ∼28, are observed in the density range of 1.4×10¹⁰ cm⁻² to 1.4×10¹¹ cm⁻². When the magnetic field is oriented parallel to the sample surface, the effective g factor is obtained from a protrusion in the magneto-resistance data that signifies full spin polarization. In the latter orientation, a small effective g factor, ∼1.3–1.4, is measured in the density range of 1.5×10¹⁰ cm⁻² to 2×10¹⁰ cm⁻². This very strong anisotropy is consistent with theoretical predictions and previous measurements in other 2D hole systems, such as InGaAs and GaSb.

LDRD Report: Topological Design Optimization of Convolutes in Next Generation Pulsed Power Devices

Cyr, Eric C.; von Winckel, Gregory J.; Kouri, Drew P.; Gardiner, Thomas A.; Ridzal, Denis R.; Shadid, John N.; Miller, Sean M.

This LDRD project was developed around the ambitious goal of applying PDE-constrained optimization approaches to design Z-machine components whose performance is governed by electromagnetic and plasma models. This report documents the results of this LDRD project. Our differentiating approach was to use topology optimization methods developed for structural design and extend them for application to electromagnetic systems pertinent to the Z-machine. To achieve this objective, a suite of optimization algorithms was implemented in the ROL library, part of the Trilinos framework. These methods were applied to standalone demonstration problems and the Drekar multi-physics research application. Out of this exploration, a new augmented Lagrangian approach to structural design problems was developed. We demonstrate that this approach has favorable mesh-independent performance: both the final design and the algorithmic performance were independent of the size of the mesh. In addition, topology optimization formulations for the design of conducting networks were developed and demonstrated. Of note, this formulation was used to develop a design for the inner magnetically insulated transmission line on the Z-machine. The resulting electromagnetic device is compared with theoretically postulated designs.

Unidirectional photonic wire laser

Nature Photonics

Khalatpour, Ali; Reno, J.L.; Kherani, Nazir P.; Hu, Qing

Photonic wire lasers are a new genre of lasers that have a transverse dimension much smaller than the wavelength. Unidirectional emission is highly desirable as most of the laser power will be in the desired direction. Owing to their small lateral dimension relative to the wavelength, however, the mode mostly propagates outside the solid core. Consequently, conventional approaches to attach a highly reflective element to the rear facet, whether a thin film or a distributed Bragg reflector, are not applicable. Here we propose a simple and effective technique to achieve unidirectionality. Terahertz quantum-cascade lasers with distributed feedback (DFB) were chosen as the platform of the photonic wire lasers. Unidirectionality is achieved with a forward/backward power ratio of about eight, and the power of the forward-emitting laser is increased by a factor of 1.8 compared with a reference bidirectional DFB laser. Furthermore, we achieved a wall-plug power efficiency of ∼1%.

Numeric invariants from multidimensional persistence

Journal of Applied and Computational Topology

Skryzalin, Jacek S.

Topological data analysis is the study of data using techniques from algebraic topology. Often, one begins with a finite set of points representing data and a “filter” function which assigns a real number to each datum. Using both the data and the filter function, one can construct a filtered complex for further analysis. For example, applying the homology functor to the filtered complex produces an algebraic object known as a “one-dimensional persistence module”, which can often be interpreted as a finite set of intervals representing various geometric features in the data. If one runs the above process incorporating multiple filter functions simultaneously, one instead obtains a multidimensional persistence module. Unfortunately, these are much more difficult to interpret. In this article, we analyze the space of multidimensional persistence modules from the perspective of algebraic geometry. We first build a moduli space of a certain subclass of easily analyzed multidimensional persistence modules, which we construct specifically to capture much of the information which can be gained by using multidimensional persistence instead of one-dimensional persistence. We argue that the global sections of this space provide interesting numeric invariants when evaluated against our subclass of multidimensional persistence modules. Finally, we extend these global sections to the space of all multidimensional persistence modules and discuss how the resulting numeric invariants might be used to study data. This paper extends the results of Adcock et al. (Homol Homotopy Appl 18(1), 381–402, 2016) by constructing numeric invariants from the computation of a multidimensional persistence module as given by Carlsson et al. (J Comput Geom 1(1), 72–100, 2010).
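The "finite set of intervals" that a one-dimensional persistence module reduces to can be made concrete with the simplest case: 0-dimensional persistence of the sublevel-set filtration of a single filter function sampled along a line, computed with union-find. This is a standard textbook construction, not the paper's algorithm; the paper's subject is what replaces these intervals when several filter functions are used at once.

```python
# 0-dimensional persistence of the sublevel-set filtration of f on a path
# graph, via union-find. Each local minimum births a component; components
# merge at local maxima, and the elder rule kills the younger one.
def sublevel_persistence(f):
    n = len(f)
    order = sorted(range(n), key=lambda i: f[i])   # add vertices lowest first
    parent = [None] * n                            # None = not yet added
    birth = {}                                     # component root -> birth value
    pairs = []

    def find(i):                                   # find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in order:
        parent[i] = i
        birth[i] = f[i]
        for j in (i - 1, i + 1):                   # edges to already-added nbrs
            if 0 <= j < n and parent[j] is not None:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # elder rule: the younger (later-born) component dies at f[i]
                old, young = (ri, rj) if birth[ri] <= birth[rj] else (rj, ri)
                if birth[young] < f[i]:            # skip zero-length intervals
                    pairs.append((birth[young], f[i]))
                parent[young] = old
    pairs.append((min(f), float("inf")))           # essential class never dies
    return sorted(pairs)

print(sublevel_persistence([1.0, 3.0, 0.0, 2.0, 4.0]))
# -> [(0.0, inf), (1.0, 3.0)]
```

With two or more filter functions there is no such interval decomposition in general, which is the interpretability gap the paper's numeric invariants aim to bridge.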

Opacity from two-photon processes

High Energy Density Physics

Hansen, Stephanie B.; More, Richard M.; Nagayama, Taisuke N.

The recent iron opacity measurements performed at Sandia National Laboratories by Bailey and collaborators have raised questions about the completeness of the physical models normally used to understand partially ionized hot dense plasmas. We describe calculations of two-photon absorption, which is a candidate for the observed extra opacity. Our calculations do not yet match the experiments but show that the two-photon absorption process is strong enough to require careful consideration.

Topological photonic structures for nanophotonics

International Conference on Transparent Optical Networks

Subramania, Ganapathi S.; Anderson, P.D.

Topological photonic structures in analogy to their electronic counterparts can provide new functionalities in nanophotonics. In particular, they can possess topologically protected photonic modes that can propagate unidirectionally without scattering and can have an extreme photonic density of states (PDOS). These unique properties can directly impact many photonic systems in optical communications and in quantum information processing applications such as single photon transport. In analogy to spin Hall effect in electronics, photonic systems can exhibit helicity or pseudo-spin dependent light transport. Below we describe such a system in a honeycomb two-dimensional hole-array photonic crystal. Enabling such properties at optical frequencies and on chip-scale will be very important for practical applications of such phenomena.

Modeling shockwaves and impact phenomena with Eulerian peridynamics

International Journal of Impact Engineering

Silling, Stewart A.; Parks, Michael L.; Kamm, James R.; Weckner, Olaf; Rassaian, Mostafa

Most previous development of the peridynamic theory has assumed a Lagrangian formulation, in which the material model refers to an undeformed reference configuration. In the present work, an Eulerian form of material modeling is developed, in which bond forces depend only on the positions of material points in the deformed configuration. The formulation is consistent with the thermodynamic form of the peridynamic model and is derivable from a suitable expression for the free energy of a material. It is shown that the resulting formulation of peridynamic material models can be used to simulate strong shock waves and fluid response in which very large deformations make the Lagrangian form unsuitable. The Eulerian capability is demonstrated in numerical simulations of ejecta from a wavy free surface on a metal subjected to strong shock wave loading. The Eulerian and Lagrangian contributions to bond force can be combined in a single material model, allowing strength and fracture under tensile or shear loading to be modeled consistently with high compressive stresses. This capability is demonstrated in numerical simulation of bird strike against an aircraft, in which both tensile fracture and high pressure response are important.
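The bond-force idea the abstract builds on can be sketched in one dimension. The toy below is the classical Lagrangian bond-based form (stretch measured against the reference configuration); in the paper's Eulerian variant the bond force would instead depend only on the deformed positions. All constants here are invented for illustration.

```python
# 1-D bond-based peridynamic force sketch (Lagrangian toy, not the paper's
# Eulerian formulation). Each point interacts with all neighbors within a
# horizon via pairwise forces proportional to bond stretch.
import numpy as np

n, dx = 100, 0.01
m = 3                               # horizon = m grid spacings
c = 1.0                             # made-up bond stiffness (micromodulus)
x = np.arange(n) * dx               # reference positions
u = 0.001 * x                       # uniform 0.1% stretch displacement field
y = x + u                           # deformed positions

force = np.zeros(n)
for i in range(n):
    for j in range(max(0, i - m), min(n, i + m + 1)):
        if j == i:
            continue
        xi = x[j] - x[i]                        # reference bond
        eta = y[j] - y[i]                       # deformed bond
        stretch = (abs(eta) - abs(xi)) / abs(xi)
        # pairwise bond force, directed along the deformed bond
        force[i] += c * stretch * np.sign(eta) * dx

print(np.abs(force[m:n - m]).max())             # interior net force ~ 0
```

Under homogeneous stretch every bond carries the same stretch, so left and right contributions cancel at interior points; nonzero net forces appear only near boundaries or where the deformation is nonuniform, which is what drives the dynamics in an impact simulation.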

VISAR Analysis in the Frequency Domain

Journal of Dynamic Behavior of Materials

Laros, James H.; Specht, Paul E.

VISAR measurements are typically analyzed in the time domain, where velocity is approximately proportional to fringe shift. Moving to the frequency domain clarifies the limitations of this approximation and suggests several improvements. For example, optical dispersion preserves high-frequency information, so a zero-dispersion (air delay) interferometer does not provide optimal time resolution. Combined VISAR measurements can also improve time resolution. With adequate bandwidth and reasonable noise levels, it is quite possible to achieve better resolution than the VISAR approximation allows.

Multilevel acceleration of scattering-source iterations with application to electron transport

Nuclear Engineering and Technology

Drumm, Clifton R.; Fan, Wesley C.

Acceleration/preconditioning strategies available in the SCEPTRE radiation transport code are described. A flexible transport synthetic acceleration (TSA) algorithm that uses a low-order discrete-ordinates (SN) or spherical-harmonics (PN) solve to accelerate convergence of a high-order SN source-iteration (SI) solve is described. Convergence of the low-order solves can be further accelerated by applying off-the-shelf incomplete-factorization or algebraic-multigrid methods. Also available is an algorithm that uses a generalized minimum residual (GMRES) iterative method rather than SI for convergence, using a parallel sweep-based solver to build up a Krylov subspace. TSA has been applied as a preconditioner to accelerate the convergence of the GMRES iterations. The methods are applied to several problems involving electron transport and problems with artificial cross sections with large scattering ratios. These methods were compared and evaluated by considering material discontinuities and scattering anisotropy. Observed accelerations obtained are highly problem dependent, but speedup factors around 10 have been observed in typical applications.
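Why a Krylov method pays off over plain source iteration (SI) when the scattering ratio approaches one can be shown on a toy fixed-source problem. The operator below is a made-up symmetric stand-in for the scattering operator, not SCEPTRE's transport sweep.

```python
# Toy comparison of source iteration vs. GMRES for phi = c*S*phi + q, with a
# made-up "scattering" operator S and scattering ratio c near 1 (the regime
# where SI converges slowly). Not SCEPTRE; purely illustrative.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

n, c = 200, 0.99                   # c near 1 => slow source iteration
rng = np.random.default_rng(0)
A = rng.random((n, n))
S = 0.5 * (A + A.T)                # symmetric stand-in scattering operator
S /= 1.01 * np.linalg.norm(S, 2)   # scale so that ||c*S|| < 1 (SI converges)
q = np.ones(n)                     # fixed source

# Source iteration: phi_{k+1} = c*S*phi_k + q
phi = np.zeros(n)
si_iters = 0
while np.linalg.norm(q + c * (S @ phi) - phi) > 1e-8 * np.linalg.norm(q):
    phi = c * (S @ phi) + q
    si_iters += 1

# Krylov alternative: solve the equivalent system (I - c*S) phi = q
op = LinearOperator((n, n), matvec=lambda v: v - c * (S @ v))
count = [0]
def cb(_):                         # called once per GMRES iteration
    count[0] += 1
phi_g, info = gmres(op, q, callback=cb)

print(si_iters, count[0], info)    # GMRES needs far fewer iterations
```

Preconditioning GMRES with a cheap low-order solve, as TSA does in the text above, shrinks the Krylov iteration count further still; the toy only shows the baseline SI-versus-Krylov gap.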

Self-ion irradiation effects on mechanical properties of nanocrystalline zirconium films

MRS Communications

Wang, Baoming; Tomar, Vikas; Hattar, Khalid M.; Haque, M.A.

Zirconium thin films were irradiated at room temperature with an 800 keV Zr+ beam using a 6 MV HVE Tandem accelerator to 1.36 displacement per atom damage. Freestanding tensile specimens, 100 nm thick and 10 nm grain size, were tested in situ inside a transmission electron microscope. Significant grain growth (>300%), texture evolution, and displacement damage defects were observed. Stress-strain profiles were mostly linear elastic below 20 nm grain size, but above this limit, the samples demonstrated yielding and strain hardening. Experimental results support the hypothesis that grain boundaries in nanocrystalline metals act as very effective defect sinks.

Throwing computing into reverse

IEEE Spectrum

Frank, Michael P.

For more than 50 years, computers have made steady and dramatic improvements, all thanks to Moore’s Law—the exponential increase over time in the number of transistors that can be fabricated on an integrated circuit of a given size. Moore’s Law owed its success to the fact that as transistors were made smaller, they became simultaneously cheaper, faster, and more energy efficient. The payoff from this win-win-win scenario enabled reinvestment in semiconductor fabrication technology that could make even smaller, more densely-packed transistors. And so this virtuous cycle continued, decade after decade. Now though, experts in industry, academia, and government laboratories anticipate that semiconductor miniaturization won’t continue much longer—maybe 10 years or so, at best. Making transistors smaller no longer yields the improvements it used to. The physical characteristics of small transistors forced clock speeds to cease getting faster more than a decade ago, which drove the industry to start building chips with multiple cores. But even multi-core architectures must contend with increasing amounts of “dark silicon,” areas of the chip that must be powered off to avoid overheating.

Gas Release as a Deformation Signal

Bauer, Stephen J.

Radiogenic noble gases are contained in crustal rock at inter- and intragranular sites. The gas composition depends on lithology, geologic history, fluid phases, and the aging effect from decay of U, Th, and K. The isotopic signature of noble gases found in rocks is vastly different from that of the atmosphere, which is contributed by a variety of sources. When rock is subjected to stress conditions exceeding about half its yield strength, micro-cracks begin to form. As rock deformation progresses, a fracture network evolves, releasing trapped noble gases and changing the transport properties for gas migration. Thus, changes in gas emanation and noble gas composition from rocks could be used to infer changes in stress state and deformation. The purpose of this study has been to evaluate the effect of deformation/strain rate upon noble gas release. Four triaxial experiments were attempted over a strain rate range of ∼10⁻⁸/s (180,000 s) to ∼10⁻⁴/s (500 s); the three fully successful experiments (at the faster strain rates) imply the following: (1) helium is measurably released for all strain rates during deformation, in amounts 1-2 orders of magnitude greater than that present in the air, and (2) helium gas release increases with decreasing strain rate.

Fuego/Scefire MPMD Coupling L2 Milestone Executive Summary

Pierce, Flint P.; Tencer, John T.; Pautz, Shawn D.; Drumm, Clifton R.

This milestone campaign focused on coupling the Sandia physics codes SIERRA low Mach module Fuego and the RAMSES Boltzmann transport code Sceptre (Scefire). Fuego enables simulation of low-Mach, turbulent, reacting, particle-laden flows on unstructured meshes using CVFEM for abnormal thermal environments throughout SNL and the larger national security community. Sceptre provides simulation of photon, neutron, and charged-particle transport on unstructured meshes using discontinuous Galerkin methods for radiation-effects calculations at SNL and elsewhere. Coupling these "best of breed" codes enables efficient modeling of thermal/fluid environments with radiation transport, including fires (pool, propellant, composite) as well as environments with directed radiant fluxes. We seek to improve the experience of Fuego users who require radiation transport capabilities in two ways. The first is performance. We achieve this by leveraging additional computational resources for Scefire, reducing calculation times while leaving the resources for fluid physics unaffected. This approach is new to Fuego, which previously used the same resources for both the fluid and radiation solutions. The second improvement enables new radiation capabilities, including spectral (banded) radiation, beam boundary sources, and alternate radiation solvers (e.g., Pn). This summary provides an overview of these achievements.

More Details

International Collaboration on Spent Fuel Disposition in Crystalline Media: FY17 Progress Report

Wang, Yifeng

Active participation in international R&D is crucial for achieving the Spent Fuel Waste Science & Technology (SFWST) long-term goals of conducting “experiments to fill data needs and confirm advanced modeling approaches” and of having a “robust modeling and experimental basis for evaluation of multiple disposal system options” (by 2020). DOE’s Office of Nuclear Energy (NE) has developed a strategic plan to advance cooperation with international partners. The international collaboration on the evaluation of crystalline disposal media at Sandia National Laboratories (SNL) in FY17 focused on the collaboration through the Development of Coupled Models and their Validation against Experiments (DECOVALEX-2019) project. The DECOVALEX project is an international research and model comparison collaboration, initiated in 1992, for advancing the understanding and modeling of coupled thermo-hydro-mechanical-chemical (THMC) processes in geological systems. SNL has been participating in three tasks of the DECOVALEX project: Task A. Modeling gas injection experiments (ENGINEER), Task C. Modeling groundwater recovery experiment in tunnel (GREET), and Task F. Fluid inclusion and movement in the tight rock (FINITO).

More Details

Creation of the NaSCoRD Database

Denman, Matthew R.; Jankovsky, Zachary; Stuart, Zacharia W.

This report was written as part of a United States Department of Energy (DOE), Office of Nuclear Energy, Advanced Reactor Technologies program funded project to re-create the capabilities of the legacy Centralized Reliability Database Organization (CREDO) database. The CREDO database provided a record of component design and performance documentation across various systems that used sodium as a working fluid. Regaining this capability will allow the DOE complex and the domestic sodium reactor industry to better understand how previous systems were designed and built, for use in improving the design and operations of future loops. The contents of this report include: an overview of the current state of domestic sodium reliability databases; a summary of the ongoing effort to improve, understand, and process the CREDO information; a summary of the initial efforts to develop a unified sodium reliability database called the Sodium System Component Reliability Database (NaSCoRD); and an explanation of how potential users can access the domestic sodium reliability databases and the type of information that can be accessed from them.

More Details

Bayesian Regression of Thermodynamic Models of Redox Active Materials

Johnston, Katherine

Finding a suitable functional redox material is a critical challenge to achieving scalable, economically viable technologies for storing concentrated solar energy in the form of a defected oxide. Demonstrating effectiveness for thermal storage or solar fuel is largely accomplished by using a thermodynamic model derived from experimental data. The purpose of this project is to test the accuracy of our regression model on representative data sets. Determining the accuracy of the model includes fitting the model parameters to the data, comparing models with different numbers of parameters, and analyzing the entropy and enthalpy calculated from the model. Three data sets were considered in this project: two demonstrating materials for solar fuels by water splitting and the other a material for thermal storage. Using Bayesian inference and Markov chain Monte Carlo (MCMC), parameter estimation was performed on the three data sets. Good results were achieved, except for some deviations at the edges of the data input ranges. The evidence values were then calculated in a variety of ways and used to compare models with different numbers of parameters. It was believed that at least one of the parameters was unnecessary, and comparing evidence values demonstrated that the parameter was needed on one data set and not significantly helpful on another. The entropy was calculated by taking the derivative in one variable and integrating over another, and its uncertainty was calculated by evaluating the entropy over multiple MCMC samples. Afterwards, all the parts were written up as a tutorial for the Uncertainty Quantification Toolkit (UQTk).
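The MCMC parameter-estimation step described above can be illustrated with a minimal random-walk Metropolis sampler. The two-parameter linear model, synthetic data, and flat priors below are invented for illustration; they are not the thermodynamic model or the UQTk workflow used in the project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experimental" data for a hypothetical model y = a*x + b.
a_true, b_true, sigma = 2.0, 1.0, 0.1
x = np.linspace(0.0, 1.0, 50)
y = a_true * x + b_true + rng.normal(0.0, sigma, x.size)

def log_posterior(theta):
    """Gaussian likelihood with flat priors on (a, b)."""
    a, b = theta
    resid = y - (a * x + b)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampling of the posterior.
n_steps, step = 20_000, 0.05
chain = np.empty((n_steps, 2))
theta = np.array([0.0, 0.0])
lp = log_posterior(theta)
for i in range(n_steps):
    prop = theta + rng.normal(0.0, step, 2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta

burned = chain[n_steps // 2:]        # discard first half as burn-in
a_hat, b_hat = burned.mean(axis=0)
print(a_hat, b_hat)                  # posterior means near (2.0, 1.0)
```

Evidence-based model comparison, as used in the report, would additionally integrate the likelihood over the prior for each candidate model; this sketch shows only the parameter-fitting stage.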

More Details

Removal of Dissolved Silica using Calcinated Hydrotalcite in Real-life Applications

Sasan, Koroush S.; Brady, Patrick V.; Krumhansl, James L.; Nenoff, T.M.

Water shortages are a growing global problem. Reclamation of industrial and municipal wastewater will be necessary in order to mitigate water scarcity. However, many operational challenges, such as silica scaling, prevent large-scale water reuse. Previously, our team at Sandia demonstrated the use of selective ion exchange materials, such as calcinated hydrotalcite (HTC, Mg6Al2(OH)16(CO3)·4H2O), for the low-cost removal of silica from synthetic cooling tower water. However, it was not known whether calcinated HTC has similar capabilities in realistic applications. The purpose of this study was to investigate the ability of calcinated HTC to remove silica from real cooling tower water. This was investigated under both batch and continuous conditions, and in the presence of competing ions. It was determined that calcinated HTC behaved similarly in real and synthetic cooling tower water; the HTC is highly selective for silica even in the presence of competing cations. The data therefore indicate that calcinated HTC is a viable anti-scaling pretreatment for the reuse of industrial wastewaters.

More Details

Lithium Oxysilicate Compounds Final Report

Apblett, Christopher A.; Coyle, Jaclyn C.

In this study, the structure and composition of lithium silicate thin films deposited by RF magnetron co-sputtering are investigated. Five compositions ranging from Li2Si2O5 to Li8SiO6 were confirmed by inductively coupled plasma-optical emission spectroscopy (ICP-OES), and structural analysis of the evolution of non-bridging oxygens in the thin films was conducted with Fourier transform infrared (FTIR) spectroscopy. It was found that non-bridging oxygens (NBOs) increased as the silicate network broke apart with increasing lithium content, which agrees with previous studies on lithium silicates. Thin-film impurities were examined with X-ray photoelectron spectroscopy (XPS) and time-of-flight secondary ion mass spectroscopy (TOF-SIMS) and traced back to target synthesis. This study utilizes a unique synthesis technique for lithium silicate thin films and can be referred to in future studies on the ionic conductivity of lithium silicates formed on the surface of silicon anodes in lithium-ion batteries.

More Details

Scaling tests of a new algorithm for DFT hybrid-functional calculations on Trinity Haswell

Wright, Alan F.; Modine, N.A.

We show scaling results for materials of interest in Sandia Radiation-Effects and High-Energy-Density-Physics Mission Areas. Each timing is from a self-consistent calculation for bulk material. Two timings are given: (1) walltime for the construction of the CR exchange operator (Exchange-Operator) and (2) walltime for everything else (non-Exchange-Operator).

More Details

Graph Learning in Knowledge Bases

Goldberg, Sean; Wang, Daisy Z.

The amount of text data has been growing exponentially in recent years, giving rise to automatic information extraction methods that store text annotations in a database. The current state-of-the-art structured prediction methods, however, are likely to contain errors, and it is important to be able to manage the overall uncertainty of the database. On the other hand, the advent of crowdsourcing has enabled humans to aid machine algorithms at scale. As part of this project we introduced pi-CASTLE, a system that optimizes and integrates human and machine computing as applied to a complex structured prediction problem involving conditional random fields (CRFs). We proposed strategies grounded in information theory to select a token subset, formulate questions for the crowd to label, and integrate these labelings back into the database using a method of constrained inference. On both a text segmentation task over academic citations and a named entity recognition task over tweets, we showed an order-of-magnitude improvement in accuracy gain over baseline methods.
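The information-theoretic token selection mentioned above can be sketched as choosing the tokens whose predicted label distributions have the highest Shannon entropy, i.e., where the machine is least certain and a crowd label is most informative. The marginals below are made up for illustration and do not come from an actual CRF or from pi-CASTLE.

```python
import numpy as np

def select_for_crowd(marginals, k):
    """Pick the k tokens with the highest-entropy label marginals.

    marginals: (n_tokens, n_labels) array of per-token label
    probabilities, e.g. from CRF marginal inference.
    """
    p = np.clip(marginals, 1e-12, 1.0)            # guard log(0)
    entropy = -np.sum(p * np.log(p), axis=1)      # Shannon entropy per token
    return np.argsort(entropy)[::-1][:k]          # most uncertain first

# Hypothetical marginals for 4 tokens over 3 labels.
m = np.array([
    [0.98, 0.01, 0.01],   # confident prediction
    [0.34, 0.33, 0.33],   # nearly uniform -> very uncertain
    [0.70, 0.20, 0.10],
    [0.50, 0.50, 0.00],
])
print(select_for_crowd(m, 2))   # → [1 2]
```

In the full system, the selected tokens would be posed as crowd questions and the answers folded back in via constrained inference; this sketch covers only the selection step.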

More Details

Non-RF Chain of Custody Item Monitor (CoCIM) Development Report

Brotz, Jay K.; Wade, James R.; Schwartz, Steven R.

The Chain of Custody Item Monitor (CoCIM) developed by Sandia National Laboratories is one of the most mature and well-studied active seals for use in containment applications for arms control treaty verification and international nuclear safeguards. However, its typical design includes wireless communications provided by a radio frequency (RF) transmitter and receiver. While this provides flexibility of movement for many applications, it is unnecessary and undesired for some treaty verification applications. This report details the design and construction of two variants of the CoCIM that remove the RF transmission capability in favor of directly connected wired and short-range infrared communications, as well as a new coordinator that is used to interface the CoCIM to a computer, and new interface software that is simplified for a likely inspection use case.

More Details

Extending Hypersonic Diagnostics to the Third Dimension

Guildenbecher, Daniel R.; Kunzler, William M.; Sweatt, W.C.; Richardson, Daniel R.; Casper, Katya M.

The design, construction, and initial testing of a high-magnification, long-working-distance plenoptic camera is reported. A plenoptic camera uses a microlens array to enable resolution of the spatial and angular information of the incoming light field. With this, instantaneous images can be numerically refocused and perspective-shifted in post-processing to enable instantaneous three-dimensional (3D) resolution of a scene. Prior to this work, most applications of plenoptic imaging were limited to relatively low magnifications (1× or less) or small working distances. Here, a unique system is developed which enables 5× magnification at a working distance of over a quarter meter. Experimental results demonstrate ~25 μm spatial resolution with 3D imaging capabilities. This technology is demonstrated on two practical applications. First, burning aluminum particles on the order of 100 μm in diameter are imaged near the reacting surface of a combusting solid rocket propellant. The long working distance is particularly advantageous for protecting the experimental hardware in this extremely hazardous environment. Next, background-oriented schlieren is used to resolve the 3D structure of an underexpanded free jet. This demonstrates the ability to resolve index-of-refraction gradients at the working distances and spatial scales necessary to meet our ultimate goal of resolving 3D turbulent transition in the boundary layer of Sandia's Hypersonic Wind Tunnel (HWT).
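The numerical refocusing that plenoptic imaging enables can be sketched as a shift-and-add over the angular samples of the 4D light field: each sub-aperture view is shifted in proportion to its angular offset, then the views are averaged. The toy light field below is synthetic, and the implementation is a conceptual sketch, not the processing pipeline used in this work.

```python
import numpy as np

def refocus(lightfield, alpha):
    """Shift-and-add refocus of a 4D light field L[u, v, s, t].

    u, v index the angular samples (sub-aperture views); s, t are
    spatial pixels.  alpha sets the synthetic focal plane: each view
    is shifted by alpha times its angular offset, then all views are
    averaged.
    """
    nu, nv, ns, nt = lightfield.shape
    out = np.zeros((ns, nt))
    for u in range(nu):
        for v in range(nv):
            du = int(round(alpha * (u - nu // 2)))
            dv = int(round(alpha * (v - nv // 2)))
            out += np.roll(lightfield[u, v], (du, dv), axis=(0, 1))
    return out / (nu * nv)

# Toy light field: a point source whose image shifts by 1 pixel per
# angular step (a fixed disparity, i.e., a fixed depth).
nu = nv = 5
ns = nt = 32
L = np.zeros((nu, nv, ns, nt))
for u in range(nu):
    for v in range(nv):
        L[u, v, 16 + (u - 2), 16 + (v - 2)] = 1.0

sharp = refocus(L, alpha=-1.0)   # shifts cancel the disparity: in focus
blurred = refocus(L, alpha=0.0)  # plain average: point smeared across views
print(sharp.max(), blurred.max())
```

Choosing alpha to cancel a given disparity focuses that depth plane, which is how a single plenoptic exposure yields a refocusable 3D measurement.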

More Details

Zero Waste Strategic Plan for SNL

Wrons, Ralph J.

Sandia National Laboratories/New Mexico is located in Albuquerque, New Mexico, primarily on Department of Energy (DOE) permitted land on approximately 2,800 acres of Kirtland Air Force Base. There are approximately 5.5 million square feet of buildings, with a workforce of approximately 9,200 personnel. In 2008, the Sandia National Laboratories Materials Sustainability and Pollution Prevention (MSP2) program adopted an internal team goal of Zero Waste to Landfill by 2025 for New Mexico site operations. Sandia solicited a consultant to assist in the development of a Zero Waste Strategic Plan. The Zero Waste Consultant Team selected is a partnership of SBM Management Services and Gary Liss & Associates. The scope of this Plan is non-hazardous solid waste, and it covers the life cycle of material purchases through the use and final disposal of items at the end of their life cycle.

More Details

FY17 ASC P&EM L2 Milestone 6009: Demonstrate Thread Scalability within Aria on Both Sides of Trinity

Clausen, Jonathan C.

The use of next-generation platforms (NGPs), also known as advanced technology systems (ATS), that incorporate many-core and heterogeneous architectures for scientific computing represents a tectonic shift in computing hardware design, one that will require massive development work within the Sierra applications to harness these systems to their full potential. The completion of this milestone represents a first step toward this effort by threading many of the computational kernels within Sierra/Aria.

More Details

Twistact techno-economic analysis for wind turbine applications

Naughton, Brian T.; Koplow, Jeffrey P.; Vanness, Justin W.; Sethuraman, Latha; Maness, Michael; Dykes, Katherine

This report is the final deliverable for a techno-economic analysis of the Sandia National Laboratories-developed Twistact rotary electrical conductor. The U.S. Department of Energy Wind Energy Technologies Office supported a team of researchers at Sandia National Laboratories and the National Renewable Energy Laboratory to evaluate the potential of the Twistact technology to serve as a viable replacement for rare-earth materials used in permanent-magnet direct-drive wind turbine generators. This report compares three detailed generator models, two as baseline technologies and a third incorporating the Twistact technology. These models are then used to calculate the levelized cost of energy (LCOE) for three comparable offshore wind plants using the three generator topologies. The National Renewable Energy Laboratory's techno-economic analysis indicates that Twistact technology can be used to design low-maintenance, brush-free, wire-wound (instead of rare-earth-element (REE) permanent-magnet) direct-drive wind turbine generators without a significant change in LCOE or generation efficiency. Twistact technology acts as a hedge against sources of uncertain cost for direct-drive generators. On the one hand, for permanent-magnet direct-drive (PMDD) generators, the long-term price of REEs may increase as future demand from electric vehicles and other technologies grows while the supply remains limited and geographically concentrated. Potentially higher future prices adversely affect the cost competitiveness of PMDD generators and may thwart industry investment in the development of the technology for wind turbine applications. Twistact technology can eliminate industry risk around the uncertainty of REE price and availability. On the other hand, traditional wire-wound direct-drive generators experience reliability issues and higher maintenance costs because of wear on the contact brushes necessary for field excitation.
The brushes experience significant wear and require regular replacement over the lifetime of operation (on the order of a year or potentially less). For offshore wind applications, the focus of this study, maintenance costs are higher than for typical land-based systems because of the added time often required to access the site for repairs. Eliminating the need for regular brush replacement therefore reduces the uncertain costs and energy-production losses associated with maintaining and replacing contact brushes. In short, Twistact has a relatively negligible impact on LCOE while hedging the risks associated with the two dominant direct-drive generator designs: REE price volatility for PMDD generators and contact-brush reliability for wire-wound generators. A final section examines the overall REE supply chain, considering the supply-side and demand-side drivers that make depending on these materials risky for future deployment of wind energy as well as other industries.
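The LCOE comparison at the heart of this analysis follows the standard annualized form LCOE = (FCR × CapEx + OpEx) / AEP. The sketch below applies that textbook formula with made-up inputs to show the shape of the trade-off; the numbers are illustrative, not the NREL model's actual values.

```python
def lcoe(capex, opex_per_yr, aep_mwh, fcr=0.07):
    """Levelized cost of energy in $/MWh.

    capex       : total capital cost ($)
    opex_per_yr : annual operating cost ($/yr)
    aep_mwh     : annual energy production (MWh/yr)
    fcr         : fixed charge rate (1/yr), annualizes CapEx
    """
    return (fcr * capex + opex_per_yr) / aep_mwh

# Hypothetical offshore plant: a PMDD baseline versus a wire-wound +
# Twistact variant with slightly lower CapEx (no REE magnets) and
# slightly lower OpEx (no brush replacement).  Illustrative only.
pmdd = lcoe(capex=4.0e9, opex_per_yr=1.20e8, aep_mwh=3.5e6)
twistact = lcoe(capex=3.9e9, opex_per_yr=1.15e8, aep_mwh=3.5e6)
print(round(pmdd, 2), round(twistact, 2))   # → 114.29 110.86
```

Because CapEx and OpEx enter the numerator linearly, small hedges against REE price spikes or brush-maintenance costs translate directly into LCOE headroom, which is the report's central argument.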

More Details

Robust approaches to quantification of margin and uncertainty for sparse data

Hund, Lauren H.; Schroeder, Benjamin B.; Rumsey, Kelin R.; Murchison, Nicole M.

Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low-probability, high-consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
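The tail-extrapolation risk studied here can be demonstrated in a few lines: fit a normal distribution to a modest sample drawn from a heavier-tailed population, then compare the extrapolated extreme quantile against the analytically known truth. The lognormal population and sample size below are a generic illustration, not the project's scaling study.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(1)

# "Experimental" data: 100 draws from a lognormal(0, 1) population,
# whose upper tail is far heavier than any normal's.
sample = rng.lognormal(mean=0.0, sigma=1.0, size=100)

# Analyst's parametric assumption: fit a normal to the data and
# extrapolate a low-probability (1-in-10,000) performance threshold.
fitted = NormalDist(mu=sample.mean(), sigma=sample.std(ddof=1))
q_extrap = fitted.inv_cdf(0.9999)

# Truth: the lognormal 0.9999 quantile is exp(z_0.9999), known exactly.
q_true = np.exp(NormalDist().inv_cdf(0.9999))

print(q_extrap, q_true)   # the fitted normal badly underestimates the tail
```

The fitted model matches the bulk of the data yet misses the 1-in-10,000 quantile by a large factor, which is exactly the failure mode the report's risk-communication methods are meant to expose.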

More Details
Results 32001–32200 of 96,771