Publications

Micro-Kelvin cold molecules

Chandler, David; Strecker, Kevin S.

We have developed a novel experimental technique for the direct production of cold molecules using a combination of techniques from atomic, optical, and molecular physics and physical chemistry. The ability to produce samples of cold molecules has applications in a broad spectrum of technical fields, including high-resolution spectroscopy, remote sensing, quantum computing, materials simulation, and the understanding of fundamental chemical dynamics. Researchers around the world are currently exploring many techniques for producing samples of cold molecules, but to date these attempts have offered only limited success, achieving milli-Kelvin temperatures at low densities. This Laboratory Directed Research and Development project develops a new experimental technique for producing micro-Kelvin temperature molecules via collisions with laser-cooled samples of trapped atoms. The technique relies on near mass-degenerate collisions between the molecule of interest and a laser-cooled (micro-Kelvin) atom. A subset of collisions will transfer nearly all of the kinetic energy from the 'hot' molecule, cooling the molecule at the expense of heating the atom. Further collisions with the remaining laser-cooled atoms will thermally equilibrate the molecules to the micro-Kelvin temperature of the laser-cooled atoms.
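
The kinetic-energy transfer underlying the technique follows directly from elastic-collision kinematics. The expression below is the standard two-body result, included here for context rather than taken from the report: for a head-on elastic collision between a molecule of mass m_1 and an atom of mass m_2 initially at rest, the fraction of the molecule's kinetic energy transferred to the atom is

    \Delta E / E = \frac{4 m_1 m_2}{(m_1 + m_2)^2},

which approaches unity as m_1 \to m_2. This is why near mass-degenerate collision partners can remove nearly all of a molecule's kinetic energy in a single collision.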

Quantitative study of rectangular waveguide behavior in the THz

Wanke, Michael C.; Rowen, Adam M.; Nordquist, Christopher D.

This report describes our efforts to quantify the behavior of micro-fabricated THz rectangular waveguides on a configurable, robust semiconductor-based platform. These waveguides are an enabling technology for coupling THz radiation directly from or to lasers, mixers, detectors, antennas, and other devices. Traditional waveguides fabricated on semiconductor platforms, such as dielectric guides in the infrared or co-planar waveguides in the microwave region, suffer high absorption and radiative losses in the THz. The former leads to very short propagation lengths, while the latter leads to unwanted radiation modes and/or crosstalk in integrated devices. This project exploited the initial development of THz micro-machined rectangular waveguides under the THz Grand Challenge Program, but instead of focusing on THz transceiver integration, it focused on exploring the propagation loss and far-field radiation patterns of the waveguides. During the 9-month duration of this project we were able to reproduce the waveguide loss per unit length and began to explore how the loss depends on wavelength. We also explored the far-field beam patterns emitted by H-plane horn antennas attached to the waveguides. In the process we learned that the method of measuring the beam patterns has a significant impact on what is actually measured, which may affect most of the THz beam patterns reported to date. The beam pattern measurements improved significantly throughout the project, but more refinements of the measurement are required before a definitive determination of the beam pattern can be made.
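
As a point of reference, the dominant-mode cutoff of a rectangular waveguide sets the dimensional scale that makes micro-fabrication necessary at THz frequencies; this is standard waveguide theory, not a result from the report. For an air-filled guide of broad-wall width a, the TE_{10} cutoff frequency is

    f_c = \frac{c}{2a},

so operation near 1 THz requires broad-wall widths of only one to a few hundred micrometers (c / (2 x 1 THz) is approximately 150 µm), dimensions that are practical only with micro-machining.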

Cambio : a file format translation and analysis application for the nuclear emergency response community

Lasche, George

Cambio is an application intended to automatically read and display any spectrum file, of any format in the world, that the nuclear emergency response community might encounter. Cambio also provides an analysis capability suitable for HPGe spectra when the detector response and scattering environment are not well known. Why is Cambio needed? (1) Cambio solves the following problem: with over 50 types of formats from instruments used in the field and new format variations appearing frequently, it is impractical for every responder to have current versions of the manufacturer's software for every instrument used in the field; (2) Cambio converts field spectra to any one of several common formats that are used for analysis, saving valuable time in an emergency situation; (3) Cambio provides basic tools for comparing spectra, calibrating spectra, and identifying isotopes, with analysis suited especially for HPGe spectra; and (4) Cambio has a batch-processing capability to automatically translate a large number of archival spectral files of any format to one of several common formats, such as the IAEA SPE or the DHS N42. Currently over 540 analysts and members of the nuclear emergency response community worldwide are on the distribution list for updates to Cambio. Cambio users come from all levels of government, university, and commercial partners around the world that support efforts to counter terrorist nuclear activities. Cambio is Unclassified Unlimited Release (UUR) and is distributed by internet download, with email notifications whenever a new build of Cambio provides new formats, bug fixes, or new or improved capabilities. Cambio is also provided as a DLL to the Karlsruhe Institute for Transuranium Elements so that Cambio's automatic file-reading capability can be included at the Nucleonica web site.

Dynamic crack initiation toughness : experiments and peridynamic modeling

Foster, John T.

This dissertation describes research on the dynamic crack initiation toughness of a 4340 steel. Researchers have measured dynamic crack initiation toughness, K_Ic, for many years using many experimental techniques, with vastly different trends in the results when K_Ic is reported as a function of loading rate. The dissertation describes a novel experimental technique for measuring K_Ic in metals using the Kolsky bar. The method borrows from improvements made in recent years in traditional Kolsky bar testing by using pulse-shaping techniques to ensure a constant loading rate applied to the sample before crack initiation. Dynamic crack initiation measurements were reported on a 4340 steel at two different loading rates. The steel was shown to exhibit a rate dependence, with the recorded values of K_Ic being much higher at the higher loading rate. Using this rate dependence as motivation for modeling the fracture events, a viscoplastic constitutive model was implemented in a peridynamic computational mechanics code. Peridynamics is a recently developed theory in solid mechanics that replaces the classical partial differential equations of motion with integro-differential equations that do not require the existence of spatial derivatives of the displacement field. This allows for the straightforward modeling of unguided crack initiation and growth. To date, peridynamic implementations have used severely restricted constitutive models; this research represents the first implementation and validation of a complex material model. After comparing predicted deformations with Taylor anvil impact experiments for the viscoplastic material model, a novel failure criterion is introduced to model the dynamic crack initiation toughness experiments. The failure model is based on an energy criterion and uses the experimentally recorded K_Ic values as an input. The failure model is then validated against one class of problems, showing good agreement with experimental results.
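
For readers unfamiliar with the peridynamic formulation referred to here, the bond-based equation of motion, quoted in its standard form from the literature rather than from this dissertation, replaces the divergence of the stress tensor with an integral over a neighborhood (horizon) H_x of each material point:

    \rho(x) \ddot{u}(x,t) = \int_{H_x} f\big( u(x',t) - u(x,t),\; x' - x \big) \, dV_{x'} + b(x,t).

Because no spatial derivatives of the displacement field u appear, the equation remains well defined across cracks and other displacement discontinuities, which is what permits unguided crack initiation and growth to be modeled directly.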

Presto 4.14 user's guide

Spencer, Benjamin W.

Presto is a three-dimensional transient dynamics code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. It is built on the SIERRA Framework [1, 2]. SIERRA provides a data management framework in a parallel computing environment that allows the addition of capabilities in a modular fashion. Contact capabilities are parallel and scalable. The Presto 4.14 User's Guide provides information about the functionality in Presto and the command structure required to access this functionality in a user input file. This document is divided into chapters based primarily on functionality. For example, the command structure related to the use of various element types is grouped in one chapter; descriptions of material models are grouped in another chapter. The input and usage of Presto are similar to those of the code Adagio [3]. Adagio is a three-dimensional quasi-static code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. Adagio, like Presto, is built on the SIERRA Framework [1]. Contact capabilities for Adagio are also parallel and scalable. A significant feature of Adagio is that it offers a multilevel, nonlinear iterative solver. Because of the similarities in input and usage between Presto and Adagio, the user's guides for the two codes are structured in the same manner and share common material. (Once you have mastered the input structure for one code, it will be easy to master the syntax structure for the other code.) To maintain the commonality between the two user's guides, we have used a variety of techniques. For example, references to Adagio may be found in the Presto user's guide and vice versa, and the chapter order across the two guides is the same. On the other hand, each of the two user's guides is expressly tailored to the features of the specific code and documents the particular functionality for that code. For example, though both Presto and Adagio have contact functionality, the content of the chapter on contact in the two guides differs. Important references for both Adagio and Presto are given in the references section at the end of this chapter. Adagio was preceded by the codes JAC and JAS3D; JAC is described in Reference 4; JAS3D is described in Reference 5. Presto was preceded by the code Pronto3D. Pronto3D is described in References 6 and 7. Some of the fundamental nonlinear technology used by both Presto and Adagio is described in References 8, 9, and 10. Currently, both Presto and Adagio use the Exodus II database and the XDMF database; Exodus II is more commonly used than XDMF. (Other options may be added in the future.) The Exodus II database format is described in Reference 11, and the XDMF database format is described in Reference 12. Important information about contact is provided in the reference document for ACME [13]. ACME is a third-party library for contact. One of the key concepts for the command structure in the input file is referred to as scope. A detailed explanation of scope is provided in Section 1.2. Most of the command lines in Chapter 2 are related to a certain scope rather than to some particular functionality.

Adagio 4.14 user's guide

Spencer, Benjamin W.

This document is a user's guide for the code Adagio. Adagio is a three-dimensional, implicit solid mechanics code with a versatile element library, nonlinear material models, and capabilities for modeling large deformation and contact. Adagio is a parallel code, and its nonlinear solver and contact capabilities enable scalable solutions of large problems. It is built on the SIERRA Framework [1, 2]. SIERRA provides a data management framework in a parallel computing environment that allows the addition of capabilities in a modular fashion. The Adagio 4.14 User's Guide provides information about the functionality in Adagio and the command structure required to access this functionality in a user input file. This document is divided into chapters based primarily on functionality. For example, the command structure related to the use of various element types is grouped in one chapter; descriptions of material models are grouped in another chapter. The input and usage of Adagio are similar to those of the code Presto [3]. Presto, like Adagio, is a solid mechanics code built on the SIERRA Framework. The primary difference between the two codes is that Presto uses explicit time integration for transient dynamics analysis, whereas Adagio is an implicit code. Because of the similarities in input and usage between Adagio and Presto, the user's guides for the two codes are structured in the same manner and share common material. (Once you have mastered the input structure for one code, it will be easy to master the syntax structure for the other code.) To maintain the commonality between the two user's guides, we have used a variety of techniques. For example, references to Presto may be found in the Adagio user's guide and vice versa, and the chapter order across the two guides is the same. On the other hand, each of the two user's guides is expressly tailored to the features of the specific code and documents the particular functionality for that code. For example, though both Presto and Adagio have contact functionality, the content of the chapter on contact in the two guides differs. Important references for both Adagio and Presto are given in the references section at the end of this chapter. Adagio was preceded by the codes JAC and JAS3D; JAC is described in Reference 4; JAS3D is described in Reference 5. Presto was preceded by the code Pronto3D. Pronto3D is described in References 6 and 7. Some of the fundamental nonlinear technology used by both Presto and Adagio is described in References 8, 9, and 10. Currently, both Presto and Adagio use the Exodus II database and the XDMF database; Exodus II is more commonly used than XDMF. (Other options may be added in the future.) The Exodus II database format is described in Reference 11, and the XDMF database format is described in Reference 12. Important information about contact is provided in the reference document for ACME [13]. ACME is a third-party library for contact. One of the key concepts for the command structure in the input file is referred to as scope. A detailed explanation of scope is provided in Section 1.2. Most of the command lines in Chapter 2 are related to a certain scope rather than to some particular functionality.

Systems engineering management plans

Rodriguez, Tamara S.

The Systems Engineering Management Plan (SEMP) is a comprehensive and effective tool used to assist in the management of systems engineering efforts. It is intended to guide the work of all those involved in the project. The SEMP comprises three main sections: technical project planning and control, the systems engineering process, and engineering specialty integration. The contents of each section must be tailored to the specific effort. A model outline and an example SEMP are provided. The target audience is those who are familiar with the systems engineering approach and who have an interest in employing the SEMP as a tool for systems management. The goal of this document is to give the reader an appreciation for the use and importance of the SEMP, as well as to provide a framework that can be used to create the management plan.

Plume rise calculations using a control volume approach and the damped spring oscillator analogy

2008 Proceedings of the ASME Summer Heat Transfer Conference, HT 2008

Brown, Alexander L.; Bixler, Nathan E.

The PUFF code was originally written and designed to calculate the rise in the atmosphere of the non-continuous plume (puff) produced by a large detonation or deflagration. It is based on a buoyant spherical control-volume approximation. The theory for the model is updated and presented. The model has been observed to produce what are believed to be unrealistic plume-elevation oscillations as the plume approaches its terminal elevation. Recognizing a similarity between the equations for a classical damped spring oscillator and the present model, the plume rise model can be analyzed by evaluating equivalent spring constants and damping functions. Such an analysis suggests a buoyant plume in the atmosphere is significantly under-damped, explaining the occurrence of the oscillations in the model. Based on lessons learned from the analogy evaluations and guided by comparisons with early plume rise data, a set of assumptions is proposed to address the excessive oscillations found in the predicted plume near the terminal elevation and to improve the robustness of the predictions. This is done while retaining the basic context of the present model formulation. The propriety of the present formulation is evaluated. The revised model fits the vast majority of the existing data to within +/- 25%, which is considered reasonable given the present model form. Further validation efforts would be advisable but are impeded by a lack of quality existing datasets. Copyright © 2008 by ASME.
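
The oscillator analogy invoked above can be made concrete with the textbook damped-oscillator equation; the mapping sketched here is schematic and is not the exact formulation used in the paper. For an effective mass m, damping coefficient c, and stiffness k, the plume displacement z about its terminal elevation obeys

    m \ddot{z} + c \dot{z} + k z = 0,    with damping ratio    \zeta = \frac{c}{2\sqrt{k m}}.

Recasting the buoyancy and entrainment terms of the control-volume model as an equivalent k and c yields a damping ratio well below one, i.e., a strongly under-damped oscillator, which is why the predicted plume height overshoots and rings about the terminal elevation.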

Optical requirements with turbulence correction for long-range biometrics

Proceedings of SPIE - The International Society for Optical Engineering

Soehnel, Grant; Bagwell, Brett E.; Dixon, Kevin R.; Wick, David V.

Iris recognition utilizes distinct patterns found in the human iris to perform identification. Image acquisition is a critical first step toward successful operation of iris recognition systems. However, the image quality required by standard iris recognition algorithms puts hard constraints on the imaging optics; as a result, systems demonstrated to date have required relatively short subject stand-off distances. In this paper, we study long-range iris recognition at distances as large as 200 meters and determine the conditions the imaging system must satisfy for identification at longer stand-off distances. © 2009 SPIE.

An end-to-end approach to developing biological and chemical detector requirements

Proceedings of SPIE - The International Society for Optical Engineering

Purvis, Liston K.; Foltz, Greg W.; West, Todd H.; Edwards, Donna M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Teclemariam, Nerayo P.

Effective defense against chemical and biological threats requires an "end-to-end" strategy that encompasses the entire problem space, from threat assessment and target hardening to response planning and recovery. A key element of the strategy is the definition of appropriate system requirements for surveillance and detection of threat agents. Our end-to-end approach to venue chem/bio defense is captured in the Facilities Weapons of Mass Destruction Decision Analysis Capability (FacDAC), an integrated system-of-systems toolset that can be used to generate requirements across all stages of detector development. For example, in the early stage of detector development the approach can be used to develop performance targets (e.g., sensitivity, selectivity, false positive rate) to provide guidance on which technologies to pursue. In the development phase, after a detector technology has been selected, the approach can aid in determining performance trade-offs and the down-selection of competing technologies. During the application stage, the approach can be employed to design optimal defensive architectures that make the best use of available technology to maximize system performance. This presentation will discuss the end-to-end approach to defining detector requirements and demonstrate the capabilities of the FacDAC toolset using examples from a number of studies for the Department of Homeland Security. © 2009 SPIE.

Model building techniques for analysis

Brooks, Sean; Cordova, Theresa E.; Henry, Ronald C.; Martin, Wilbur D.; Mcdaniel, Karen; Walther, Howard P.

The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others who contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual, model-based environment in which analysis is used to guide design decisions. Computer-aided design (CAD) models built with PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that carries all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort, and the turnaround time for analysis results needs to be decreased for analysis to have an impact on the overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis

Hines, Valerie A.

This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. Written by Sandia National Laboratories, it is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. The report provides (1) a list of the data needed to support reliability and availability analysis and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.

A comparison of Lagrangian/Eulerian approaches for tracking the kinematics of high deformation solid motion

Ames, Thomas L.; Robinson, Allen C.

The modeling of solids is most naturally placed within a Lagrangian framework because it requires constitutive models that depend on knowledge of the original material orientations and subsequent deformations. Detailed kinematic information is needed to ensure material frame indifference, which is captured through the deformation gradient F. Such information can be tracked easily in a Lagrangian code. Unfortunately, not all problems can be easily modeled using Lagrangian concepts, owing to severe distortions in the underlying motion; either a Lagrangian/Eulerian or a pure Eulerian modeling framework must then be introduced. We discuss and contrast several Lagrangian/Eulerian approaches for keeping track of the details of material kinematics.
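
For reference, the deformation gradient mentioned above is the standard continuum-mechanics quantity (textbook definition, not specific to this report)

    F = \frac{\partial x}{\partial X},

the gradient of the current position x with respect to the reference (material) position X. Its polar decomposition F = RU separates the rigid rotation R from the material stretch U, which is what allows constitutive models to be evaluated in a materially frame-indifferent way even after large deformations.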

Investigating Methods of Supporting Dynamically Linked Executables on High Performance Computing Platforms

Laros, James H.; Kelly, Suzanne M.; Levenhagen, Michael; Pedretti, Kevin T.T.

Shared libraries have become ubiquitous and are used to achieve great resource efficiencies on many platforms. The same properties that enable efficiencies on time-shared computers and convenience on small clusters prove to be great obstacles to scalability on large clusters and High Performance Computing platforms. In addition, lightweight operating systems such as Catamount have historically not supported the use of shared libraries precisely because they hinder scalability. In this report we outline the methods we investigated for supporting shared libraries on High Performance Computing platforms that use lightweight kernels. The considerations necessary to evaluate utility in this area are many and sometimes conflicting. While our initial path forward has been determined based on this evaluation, we consider this effort ongoing and remain prepared to re-evaluate any technology that might provide a scalable solution. This report is an evaluation of a range of possible methods of supporting dynamically linked executables on capability-class High Performance Computing platforms. Efforts are ongoing, and extensive testing at scale is necessary to evaluate performance. While performance is a critical driving factor, supporting whatever method is used in a production environment is an equally important and challenging task.

Final report : impacts analysis for cyber attack on electric power systems (national SCADA test bed FY09)

Stamp, Jason E.; Laviolette, Randall A.

Development continues on Finite State Abstraction (FSA) methods to enable Impacts Analysis (IA) for cyber attack against power grid control systems. Building upon previous work, we successfully demonstrated the addition of Bounded Model Checking (BMC) to the FSA method, which constrains grid conditions to reasonable behavior. The new FSA feature was successfully implemented and tested. FSA is an important part of IA for the power grid, complementing steady-state approaches. It enables the simultaneous evaluation of myriad dynamic trajectories for the system, which in turn facilitates IA for whole ranges of system conditions simultaneously. Given the potentially wide range and subtle nature of potential control system attacks, this is a promising research approach. In this report, we explain the addition of BMC to the previous FSA work and present testing and simulation of the implemented code using a two-bus test system. The current FSA approach and code allow the calculation of the acceptability of power grid conditions after a cyber attack (over a given time horizon and for a specific grid topology). Future work will enable analysis spanning various topologies (to account for switching events), as well as an understanding of the cyber attack stimuli that can lead to undesirable grid conditions.

Improving performance via mini-applications

Doerfler, Douglas W.; Crozier, Paul; Edwards, Harold C.; Williams, Alan B.; Rajan, Mahesh; Keiter, Eric R.; Thornquist, Heidi K.

Application performance is determined by a combination of many choices: hardware platform, runtime environment, languages and compilers used, algorithm choice and implementation, and more. In this complicated environment, we find that the use of mini-applications - small self-contained proxies for real applications - is an excellent approach for rapidly exploring the parameter space of all these choices. Furthermore, use of mini-applications enriches the interaction between application, library and computer system developers by providing explicit functioning software and concrete performance results that lead to detailed, focused discussions of design trade-offs, algorithm choices and runtime performance issues. In this paper we discuss a collection of mini-applications and demonstrate how we use them to analyze and improve application performance on new and future computer platforms.

Computational investigation of thermal gas separation for CO2 capture

Torczynski, John R.; Gallis, Michael A.; Brooks, Carlton F.; Brady, Patrick V.; Bryan, C.R.

This report summarizes the work completed under the Laboratory Directed Research and Development (LDRD) project 09-1351, 'Computational Investigation of Thermal Gas Separation for CO₂ Capture'. Thermal gas separation for a binary mixture of carbon dioxide and nitrogen is investigated using the Direct Simulation Monte Carlo (DSMC) method of molecular gas dynamics. Molecular models for nitrogen and carbon dioxide are developed, implemented, compared to theoretical results, and compared to several experimental thermophysical properties. The molecular models include three translational modes, two fully excited rotational modes, and vibrational modes, whose degree of excitation depends on the temperature. Nitrogen has one vibrational mode, and carbon dioxide has four vibrational modes (two of which are degenerate). These models are used to perform a parameter study for mixtures of carbon dioxide and nitrogen confined between parallel walls over realistic ranges of gas temperatures and nominal concentrations of carbon dioxide. The degree of thermal separation predicted by DSMC is slightly higher than experimental values and is sensitive to the details of the molecular models.
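
For context, the degree of thermal separation in a binary mixture is commonly quantified by the thermal diffusion factor \alpha_T; the relation below is one standard kinetic-theory convention and is not necessarily the exact metric used in the report. At steady state with no net diffusive flux, the mole fraction x_1 of species 1 satisfies

    \nabla x_1 = -\alpha_T \, x_1 x_2 \, \nabla \ln T,

so a positive \alpha_T for the heavier species (here CO₂) implies enrichment of that species at the cold wall, with the enrichment growing as the imposed wall-temperature ratio increases.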

MEMS reliability: Where are we now?

Microelectronics Reliability

Tanner, Danelle M.

This paper reviews the significant successes in MEMS products from a reliability perspective. MEMS reliability is challenging and can be device and process dependent, but exercising the proper reliability techniques very early in product development has yielded success for many manufacturers. The reliability concerns of various devices are discussed including ink jet printhead, inertial sensors, pressure sensors, micro-mirror arrays, and the emerging applications of RF switches and resonators. Metal contacting RF switches are susceptible to hydrocarbon contamination which can increase the contact resistance over cycle count. Packaging techniques are described in the context of the whole reliability program. © 2009 Elsevier Ltd.

Science at the interface : grain boundaries in nanocrystalline metals

Foiles, Stephen M.; Medlin, Douglas L.; Holm, Elizabeth A.; Brewer, Luke N.; Hattar, Khalid M.; Knapp, J.A.; Rodriguez, Mark A.

Interfaces are a critical determinant of the full range of materials properties, especially at the nanoscale. We combined computational and experimental methods to develop a comprehensive understanding of nanograin evolution based on a fundamental understanding of internal interfaces in nanocrystalline nickel. It has recently been shown that nanocrystals with a bi-modal grain-size distribution possess a unique combination of high strength, ductility, and wear resistance. We performed a combined experimental and theoretical investigation of the structure and motion of internal interfaces in nanograined metal and the resulting grain evolution. The properties of grain boundaries are computed for an unprecedented range of boundaries. The presence of roughening transitions in grain boundaries is explored and related to dramatic changes in boundary mobility. Experimental observations show that abnormal grain growth in nanograined materials is unlike that in conventionally scaled material in both the level of defects and the formation of unfavored phases. Molecular dynamics simulations address the origins of some of these phenomena.

Testing military grade magnetics (transformers, inductors and coils)

Vrabel, Paul E.

Engineers and designers are constantly searching for test methods to qualify or 'prove in' new designs. In the high-reliability world of military parts, design tests, qualification tests, in-process tests, and product characteristic tests become even more important. The use of in-process and functional tests has been adopted as a way of demonstrating that parts will operate correctly and survive their 'use' environments. This paper discusses various types of tests used to qualify magnetic components: the current-carrying capability of coils, a next-assembly 'as used' test, a corona test, and an inductance-at-temperature test. Each of these tests addresses a different potential failure of a component. The entire process from design to implementation is described.

Nanostructures from hydrogen implantation of metals

Ong, Markus D.; Yang, Nancy; Depuit, Ryan J.; Mcwatters, Bruce R.; Causey, Rion A.

This study investigates a pathway to nanoporous structures created by hydrogen implantation in aluminum. Previous experiments for fusion applications have indicated that hydrogen and helium ion implantations are capable of producing bicontinuous nanoporous structures in a variety of metals. This study focuses specifically on hydrogen and helium implantations of aluminum, including complementary experimental results and computational modeling of this system. Experimental results show the evolution of the surface morphology as the hydrogen ion fluence increases from 10¹⁷ cm⁻² to 10¹⁸ cm⁻². Implantations of helium at a fluence of 10¹⁸ cm⁻² produce porosity on the order of 10 nm. Computational modeling demonstrates the formation of alanes, their desorption, and the resulting etching of aluminum surfaces that likely drives the nanostructures that form in the presence of hydrogen.

Efficient algorithms for mixed aleatory-epistemic uncertainty quantification with application to radiation-hardened electronics. Part I, algorithms and benchmark results

Eldred, Michael; Swiler, Laura P.

This report documents the results of an FY09 ASC V&V Methods level 2 milestone demonstrating new algorithmic capabilities for mixed aleatory-epistemic uncertainty quantification. Through the combination of stochastic expansions for computing aleatory statistics and interval optimization for computing epistemic bounds, mixed uncertainty analysis studies are shown to be more accurate and efficient than previously achievable. Part I of the report describes the algorithms and presents benchmark performance results. Part II applies these new algorithms to UQ analysis of radiation effects in electronic devices and circuits for the QASPR program.
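
The mixed analysis described above can be written compactly; the notation below is ours and is only a sketch of the general nested formulation, not necessarily the report's. For a response f(a, e) with aleatory variables a (characterized by probability distributions) and epistemic variables e (characterized only by an interval set E), the reported quantity is an interval on a statistic, for example

    [ \min_{e \in E} \mathbb{E}_a[ f(a, e) ],\; \max_{e \in E} \mathbb{E}_a[ f(a, e) ] ],

where the inner expectation over a is evaluated with a stochastic expansion and the outer minimization and maximization over e constitute the interval optimization.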

RF/microwave properties of nanotubes and nanowires : LDRD Project 105876 final report

Lee, Mark; Highstrete, Clark; Hsu, Julia W.; Scrymgeour, David

LDRD Project 105876 was a research project whose primary goal was to discover the currently unknown science underlying the basic linear and nonlinear electrodynamic response of nanotubes and nanowires in a manner that will support future efforts aimed at converting forefront nanoscience into innovative new high-frequency nanodevices. The project involved experimental and theoretical efforts to discover and understand the high frequency (MHz through tens of GHz) electrodynamic response properties of nanomaterials, emphasizing silicon nanowires, zinc oxide nanowires, and carbon nanotubes. While there is much research on the DC electrical properties of nanowires, electrodynamic characteristics still represent a major new frontier in nanotechnology. We generated world-leading insight into how the low dimensionality of these nanomaterials yields sometimes desirable and sometimes problematic high-frequency properties that lie outside standard models of electron dynamics. In the cases of silicon nanowires and carbon nanotubes, evidence of strong disorder or glass-like charge dynamics was measured, indicating that these materials still suffer from serious inhomogeneities that limit their high frequency performance. Zinc oxide nanowires were found to obey conventional Drude dynamics. In all cases, a significant practical problem involving the large impedance mismatch between the high intrinsic impedance of all nanowires and nanotubes and high-frequency test equipment had to be overcome.
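
For reference, the conventional Drude dynamics that the zinc oxide nanowires were found to obey corresponds to the textbook single-relaxation-time conductivity (a standard result, quoted only for context):

    \sigma(\omega) = \frac{\sigma_{dc}}{1 - i\omega\tau},    with    \sigma_{dc} = \frac{n e^2 \tau}{m^*}.

Departures from this simple dispersion, such as the disorder-like or glass-like dynamics seen in the silicon nanowires and carbon nanotubes, are a signature of inhomogeneous or hopping-dominated transport.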

THz transceiver characterization : LDRD project 139363 final report

Lee, Mark; Wanke, Michael C.; Nordquist, Christopher D.; Cich, Michael J.; Wendt, Joel R.; Fuller, Charles T.; Reno, John L.

LDRD Project 139363 supported experiments to quantify the performance characteristics of monolithically integrated Schottky diode + quantum cascade laser (QCL) heterodyne mixers at terahertz (THz) frequencies. These integrated mixers are the first all-semiconductor THz devices to successfully incorporate a rectifying diode directly into the optical waveguide of a QCL, obviating the conventional optical coupling between a THz local oscillator and rectifier in a heterodyne mixer system. This integrated mixer was shown to function as a true heterodyne receiver of an externally received THz signal, a breakthrough which may lead to more widespread acceptance of this new THz technology paradigm. In addition, questions about QCL mode shifting in response to temperature, bias, and external feedback, and to what extent internal frequency locking can improve stability have been answered under this project.

Overview of geologic storage of natural gas with an emphasis on assessing the feasibility of storing hydrogen

Lord, Anna S.

In many regions across the nation, geologic formations are currently being used to store natural gas underground. Storage options are dictated by the regional geology and the operational need. The U.S. Department of Energy (DOE) has an interest in understanding these various geologic storage options and their advantages and disadvantages, in the hope of developing an underground facility for the storage of hydrogen as a low-cost storage option and as part of the hydrogen delivery infrastructure. Currently, depleted gas/oil reservoirs, aquifers, and salt caverns are the three main types of underground natural gas storage in use today. The other storage options available currently and in the near future, such as abandoned coal mines, lined hard-rock caverns, and refrigerated mined caverns, will become more popular as the demand for natural gas storage grows, especially in regions where depleted reservoirs, aquifers, and salt deposits are not available. The storage of hydrogen within the same types of facilities currently used for natural gas may add new operational challenges to the existing cavern storage industry, such as the loss of hydrogen through chemical reactions and the occurrence of hydrogen embrittlement. Currently there are only three locations worldwide, two of which are in the United States, that store hydrogen. All three sites store hydrogen within salt caverns.

Application of advanced laser diagnostics to hypersonic wind tunnels and combustion systems

Hsu, Andrea G.; Frank, Jonathan H.

This LDRD was a Sandia Fellowship that supported Andrea Hsu's PhD research at Texas A&M University and her work as a visitor at Sandia's Combustion Research Facility. The research project at Texas A&M University is concerned with the characterization of hypersonic (Mach > 5) flowfields using experimental diagnostics. This effort is part of a Multidisciplinary University Research Initiative (MURI) and is a collaboration between the Chemistry and Aerospace Engineering departments. Hypersonic flight conditions often lead to a non-thermochemical-equilibrium (NTE) state of air, in which the timescale for reaching a single (equilibrium) Boltzmann temperature is much longer than the timescale of the flow. Certain molecular modes, such as vibrational modes, may be much more excited than the translational or rotational modes of the molecule, leading to thermal nonequilibrium. A nontrivial amount of energy is therefore contained within the vibrational mode, and this energy cascades into the flow as thermal energy, affecting flow properties through vibrational-vibrational (V-V) and vibrational-translational (V-T) energy exchanges between the flow species. The research is a fundamental experimental study of these NTE systems and involves the application of advanced laser and optical diagnostics to hypersonic flowfields. The research is broken down into two main categories: the application and adaptation of existing laser and optical techniques toward characterization of NTE, and the development of new molecular tagging velocimetry techniques that have been demonstrated in an underexpanded jet flowfield but may be extended to a variety of flowfields. In addition, Andrea's work at Sandia National Labs involved the application of advanced laser diagnostics to flames and turbulent non-reacting jets. These studies included quench-free planar laser-induced fluorescence measurements of nitric oxide (NO) and mixture fraction measurements via Rayleigh scattering.
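
The thermal nonequilibrium described above is conventionally quantified with separate mode temperatures; the expression below is the standard two-temperature description, included for context rather than taken from this work. The population of vibrational level v with energy E_v follows

    n_v / n \propto g_v \exp( -E_v / k_B T_{vib} ),

with a vibrational temperature T_vib that can differ substantially from the translational and rotational temperatures of the flow. Diagnostics that resolve individual rovibrational transitions are what allow these temperatures to be measured independently.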

Quantitative resilience analysis through control design

Vugrin, Eric; Camphouse, Russell; Sunderland, Daniel

Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

Plasmonic devices and sensors built from ordered nanoporous materials

Allendorf, Mark; Houk, Ronald H.; Jacobs, Benjamin J.; El Gabaly, Farid

The objective of this project is to lay the foundation for using ordered nanoporous materials known as metal-organic frameworks (MOFs) to create devices and sensors whose properties are determined by the dimensions of the MOF lattice. Our hypothesis is that because of the very short (tens of angstroms) distances between pores within the unit cell of these materials, enhanced electro-optical properties will be obtained when the nanopores are infiltrated to create nanoclusters of metals and other materials. Synthetic methods used to produce metal nanoparticles in disordered templates or in solution typically lead to a distribution of particle sizes. In addition, creation of the smallest clusters, with sizes of a few to tens of atoms, remains very challenging. Nanoporous metal-organic frameworks (MOFs) are a promising solution to these problems, since their long-range crystalline order creates completely uniform pore sizes with potential for both steric and chemical stabilization. We report results of synthetic efforts. First, we describe a systematic investigation of silver nanocluster formation within MOFs using three representative MOF templates. The as-synthesized clusters are spectroscopically consistent with dimensions ≤ 1 nm, with a significant fraction existing as Ag₃ clusters, as shown by electron paramagnetic resonance. Importantly, we show conclusively that very rapid TEM-induced MOF degradation leads to agglomeration and stable, easily imaged particles, explaining prior reports of particles larger than MOF pores. These results solve an important riddle concerning MOF-based templates and suggest that heterostructures composed of highly uniform arrays of nanoparticles within MOFs are feasible. Second, a preliminary study of methods to incorporate fulleride (K₃C₆₀) guest molecules within MOF pores that will impart electrical conductivity is described.

Optimized nanoporous materials

Robinson, David; Jacobs, Benjamin J.; Ong, Markus D.; Tran, Kim L.; Langham, Mary E.; Ha, Cindy M.

Nanoporous materials have maximum practical surface areas for electrical charge storage; every point in an electrode is within a few atoms of an interface at which charge can be stored. Metal-electrolyte interfaces make the best use of surface area in porous materials. However, ion transport through long, narrow pores is slow. We seek to understand and optimize the tradeoff between capacity and transport. Modeling and measurements of nanoporous gold electrodes have allowed us to determine design principles, including the fact that these materials can deplete salt from the electrolyte, increasing resistance. We have developed fabrication techniques to demonstrate architectures inspired by these principles that may overcome identified obstacles. A key concept is that electrodes should be as close together as possible; this is likely to involve an interpenetrating pore structure. However, this may prove extremely challenging to fabricate at the finest scales; a hierarchically porous structure can be a worthy compromise.

Material compatibility and thermal aging of thermoelectric materials

Morales, Alfredo M.; Chames, Jeffery M.; Clift, W.M.; Gardea, Andrew D.; Whalen, Scott A.

In order to design a thermoelectric (TE) module suitable for long-term elevated-temperature use, Department 8651 has conducted parametric experiments to study the material compatibility and thermal aging of TE materials. In addition, a comprehensive material characterization has been performed to examine the thermal stability of P- and N-based alloys and their interaction with interconnect diffusion barrier(s) and solder. At present, we have completed the 7-day aging experiments for 36 tiles, from ambient temperature to 250 C. The thermal behavior of P- and N-based alloys and their thermal interaction with both Ni and Co diffusion barriers and Au-Sn solder were examined. The preliminary results show that the microstructure, texture, alloy composition, and hardness of P-(Bi,Sb)₂Te₃ and N-Bi₂(Te,Se)₃ alloys are thermally stable up to 7 days of annealing at 250 C. However, metallurgical reactions between the Ni-phosphor barriers and the P-type base alloy were evident at temperatures ≥ 175 C. At 250 C, the depth (or distance) of the metallurgical reaction and/or Ni diffusion into P-(Bi,Sb)₂Te₃ is approximately 10-15 µm. This thermal instability makes the Ni-phosphor barrier unsuitable for use at temperatures ≥ 175 C. The Co barrier appeared to be thermally stable and compatible with P-(Bi,Sb)₂Te₃ at all annealing temperatures, with the exception of minor Co diffusion into the Au-Sn solder at ≥ 175 C. The effects of Co diffusion on long-term system reliability and/or the thermal stability of the Co barrier are yet to be determined. Te evaporation and its subsequent reaction with the Au-Sn solder and the Ni and Co barriers on the ends of the tiles at temperatures ≥ 175 C were evident. The Te loss and its effect on the long-term required stoichiometry of P-(Bi,Sb)₂Te₃ are yet to be understood. The 90-day and 180-day aging experiments are ongoing and scheduled to be completed in 30 days and 150 days, respectively. Material characterization activities are continuing for the remaining tiles.

Compositional ordering and stability in nanostructured, bulk thermoelectric alloys

Medlin, Douglas L.

Thermoelectric materials have many applications in the conversion of thermal energy to electrical power and in solid-state cooling. One route to improving thermoelectric energy conversion efficiency in bulk material is to embed nanoscale inclusions. This report summarizes key results from a recently completed LDRD project exploring the science underpinning the formation and stability of nanostructures in bulk thermoelectric materials and the quantitative relationships between such structures and thermoelectric properties.
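
The conversion efficiency at issue here is governed by the dimensionless thermoelectric figure of merit (standard definition, included for context):

    ZT = \frac{S^2 \sigma T}{\kappa},

where S is the Seebeck coefficient, \sigma the electrical conductivity, \kappa the thermal conductivity, and T the absolute temperature. Nanoscale inclusions are attractive because they can scatter phonons and reduce \kappa without proportionally degrading the power factor S^2\sigma, thereby raising ZT.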

Extreme solid state refrigeration using nanostructured Bi-Te alloys

Sharma, Peter A.; Morales, Alfredo M.; Spataru, Catalin D.

Materials are desperately needed for cryogenic solid state refrigeration. We have investigated nanostructured Bi-Te alloys for their potential use in Ettingshausen refrigeration to liquid nitrogen temperatures. These alloys form alternating layers of Bi₂ and Bi₂Te₃ blocks in equilibrium. The composition Bi₄Te₃ was identified as having the greatest potential for having a high Ettingshausen figure of merit. Both single crystal and polycrystalline forms of this material were synthesized. After evaluating the Ettingshausen figure of merit for a large, high quality polycrystal, we simulated the limits of practical refrigeration in this material from 200 to 77 K using a simple device model. The band structure was also computed and compared to experiments. We discuss the crystal growth, transport physics, and practical refrigeration potential of Bi-Te alloys.

Plasmonic filters

Shaner, Eric A.; Passmore, Brandon S.; Barrick, T.A.

Metal films perforated with subwavelength hole arrays have been shown to exhibit an effect known as Extraordinary Transmission (EOT). In EOT devices, optical transmission passbands arise that can have up to 90% transmission and a bandwidth that is only a few percent of the designed center wavelength. By placing a tunable dielectric in proximity to the EOT mesh, one can tune the center frequency of the passband. We have demonstrated over 1 micron of passive tuning in structures designed for an 11 micron center wavelength. If a suitable midwave (3-5 micron) tunable dielectric (perhaps BaTiO₃) were integrated with an EOT mesh designed for midwave operation, it is possible that a fast, voltage-tunable, low-temperature filter solution could be demonstrated with a several-hundred-nanometer passband. Such an element could, for example, replace certain components in a filter wheel solution.
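
A commonly used estimate for where the EOT passbands of a square hole array fall is the surface-plasmon grating condition, an approximation drawn from the EOT literature rather than a design equation from this report:

    \lambda_{max}(i, j) \approx \frac{P}{\sqrt{i^2 + j^2}} \sqrt{\frac{\varepsilon_m \varepsilon_d}{\varepsilon_m + \varepsilon_d}},

where P is the array period, (i, j) are the grating orders, and \varepsilon_m and \varepsilon_d are the permittivities of the metal and the adjacent dielectric. Increasing \varepsilon_d of a tunable dielectric in contact with the mesh red-shifts the passband, which is the mechanism behind the tuning described above.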

Advanced optical measurements for characterizing photophysical properties of single nanoparticles

Davis, Ryan W.; Hayes, Dulce C.; Wheeler, David R.; Polsky, Ronen; Brozik, Susan M.

Formation of complex nanomaterials would ideally involve single-pot reaction conditions with one reactive site per nanoparticle, resulting in a high yield of incrementally modified or oriented structures. Many studies in nanoparticle functionalization have sought to generate highly uniform nanoparticles with tailorable surface chemistry necessary to produce such conjugates, with limited success. In order to overcome these limitations, we have modified commercially available nanoparticles with multiple potential reaction sites for conjugation with single ssDNAs, proteins, and small unilamellar vesicles. These approaches combined heterobifunctional and biochemical template chemistries with single molecule optical methods for improved control of nanomaterial functionalization. Several interesting analytical results have been achieved by leveraging techniques unique to SNL, and provide multiple paths for future improvements for multiplex nanoparticle synthesis and characterization. Hyperspectral imaging has proven especially useful for assaying substrate immobilized fluorescent particles. In dynamic environments, temporal correlation spectroscopies have been employed for tracking changes in diffusion/hydrodynamic radii, particle size distributions, and identifying mobile versus immobile sample fractions at unbounded dilution. Finally, Raman fingerprinting of biological conjugates has been enabled by resonant signal enhancement provided by intimate interactions with nanoparticles and composite nanoshells.
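
The connection between the measured diffusion and the hydrodynamic radius mentioned above is the Stokes-Einstein relation (a standard result, quoted for context):

    D = \frac{k_B T}{6 \pi \eta R_h},

so a correlation-spectroscopy measurement of the diffusion coefficient D in a solvent of known viscosity \eta yields the hydrodynamic radius R_h directly, and a lengthening of the correlation decay time after conjugation indicates growth of the effective particle size.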

Simulated, Emulated, and Physical Investigative Analysis (SEPIA) of networked systems

Burton, David; Tarman, Thomas D.; Van Leeuwen, Brian P.; Onunkwo, Uzoma; Urias, Vincent; Mcdonald, Michael J.

This report describes recent progress made in developing and utilizing hybrid Simulated, Emulated, and Physical Investigative Analysis (SEPIA) environments. Many organizations require advanced tools to analyze their information systems' security, reliability, and resilience against cyber attack. Today's security analyses utilize real systems such as computers, network routers, and other network equipment, computer emulations (e.g., virtual machines), and simulation models separately to analyze the interplay between threats and safeguards. In contrast, this work developed new methods to combine these three approaches into integrated hybrid SEPIA environments. Our SEPIA environments enable an analyst to rapidly configure hybrid environments that pass network traffic and perform, from the outside, like real networks. This provides higher-fidelity representations of key network nodes while still leveraging the scalability and cost advantages of simulation tools. The result is the ability to rapidly produce large yet relatively low-cost, multi-fidelity SEPIA networks of computers and routers that let analysts quickly investigate threats and test protection approaches.

Viscoelastic coupling of nanoelectromechanical resonators

Simonson, Robert J.; Staton, Alan W.

This report summarizes work to date on a new collaboration between Sandia National Laboratories and the California Institute of Technology (Caltech) to utilize nanoelectromechanical resonators designed at Caltech as platforms to measure the mechanical properties of polymeric materials at length scales on the order of 10-50 nm. Caltech has succeeded in reproducibly building cantilever resonators having major dimensions on the order of 2-5 microns. These devices are fabricated in pairs, with free ends separated by reproducible gaps having dimensions on the order of 10-50 nm. By controlled placement of materials that bridge the very small gap between resonators, the mechanical devices become coupled through the test material, and the transmission of energy between the devices can be monitored. This should allow for measurements of viscoelastic properties of polymeric materials at high frequency over short distances. Our work to date has been directed toward establishing this measurement capability at Sandia.

Parallel contingency statistics with Titan

Pebay, Philippe P.; Thompson, David

This report summarizes the existing statistical engines in VTK/Titan and presents the recently parallelized contingency statistics engine. It is a sequel to [PT08] and [BPRT09], which studied the parallel descriptive, correlative, multi-correlative, and principal component analysis engines. The ease of use of this new parallel engine is illustrated by means of C++ code snippets. Furthermore, this report justifies the design of these engines with parallel scalability in mind; however, the very nature of contingency tables prevents this new engine from exhibiting the optimal parallel speed-up that the aforementioned engines achieve. This report therefore discusses the design trade-offs we made and studies performance with up to 200 processors.
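
To make the notion of a contingency table concrete, the following is a minimal, self-contained C++ sketch of serial contingency-table assembly over two categorical columns. It is purely illustrative and does not use the actual VTK/Titan statistics API; a parallel engine would build per-process tables like this and then merge them across ranks, and it is that merge, whose cost grows with the number of distinct category pairs, that limits parallel speed-up.

```cpp
#include <cstddef>
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Count joint occurrences of (x, y) category pairs -- the core of a
// contingency table. Keys are category pairs; values are counts.
std::map<std::pair<std::string, std::string>, std::size_t>
contingencyTable(const std::vector<std::string>& x,
                 const std::vector<std::string>& y)
{
  std::map<std::pair<std::string, std::string>, std::size_t> counts;
  for (std::size_t i = 0; i < x.size() && i < y.size(); ++i)
  {
    ++counts[std::make_pair(x[i], y[i])];
  }
  return counts;
}

int main()
{
  std::vector<std::string> color = {"red", "red", "blue", "blue", "red"};
  std::vector<std::string> shape = {"cube", "ball", "cube", "cube", "ball"};

  for (const auto& cell : contingencyTable(color, shape))
  {
    std::cout << cell.first.first << ", " << cell.first.second
              << ": " << cell.second << "\n";
  }
  return 0;
}
```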

Toward improved branch prediction through data mining

Hemmert, Karl S.

Data mining and machine learning techniques can be applied to computer system design to aid in optimizing design decisions and improving system runtime performance. Data mining techniques have been investigated in the context of branch prediction. Specifically, the performance of traditional branch predictors has been compared to that of data mining algorithms. Additionally, we evaluated whether additional features available within the architectural state might serve to further improve branch prediction. Results show that data mining techniques offer potential for improved branch prediction, especially when register file contents are included as a feature set.
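
For readers unfamiliar with the traditional predictors used as the baseline, the following is a minimal C++ sketch of a classic bimodal predictor built from two-bit saturating counters indexed by the branch address. It is illustrative only and is not the specific predictor or feature set evaluated in this work; a data-mining approach would instead treat branch history and other architectural state (e.g., register file contents) as features for a learned classifier.

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// Bimodal predictor: one 2-bit saturating counter per table entry, indexed
// by the low-order bits of the branch address. Counters >= 2 predict
// "taken"; each counter is nudged toward the actual outcome after a branch.
class BimodalPredictor
{
public:
  explicit BimodalPredictor(std::size_t tableBits)
    : table_(std::size_t(1) << tableBits, 1),
      mask_((std::size_t(1) << tableBits) - 1) {}

  bool predict(std::uint64_t pc) const { return table_[pc & mask_] >= 2; }

  void update(std::uint64_t pc, bool taken)
  {
    std::uint8_t& ctr = table_[pc & mask_];
    if (taken && ctr < 3) ++ctr;
    if (!taken && ctr > 0) --ctr;
  }

private:
  std::vector<std::uint8_t> table_;
  std::size_t mask_;
};

int main()
{
  BimodalPredictor bp(12); // 4096-entry counter table
  const std::uint64_t pc = 0x400a10; // hypothetical branch address
  const bool outcomes[] = {true, true, true, false, true, true};

  int correct = 0;
  for (bool taken : outcomes)
  {
    correct += (bp.predict(pc) == taken);
    bp.update(pc, taken);
  }
  std::cout << "correct predictions: " << correct << "/6\n";
  return 0;
}
```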

Transmissive infrared frequency selective surfaces and infrared antennas : final report for LDRD 105749

Davids, Paul; Cruz-Cabrera, Alvaro A.; Basilio, Lorena I.; Wendt, Joel R.; Kemme, Shanalyn A.; Johnson, William A.; Loui, Hung

Plasmonic structures open up new opportunities in photonic devices, sometimes offering an alternate method to perform a function and sometimes offering capabilities not possible with standard optics. In this LDRD we successfully demonstrated metal coatings on optical surfaces that do not adversely affect the transmission of those surfaces at the design frequency. This technology could be applied as an RF noise blocking layer across an optical aperture or as a method to apply an electric field to an active electro-optic device without affecting optical performance. We also demonstrated thin optical absorbers using similar patterned surfaces. These infrared optical antennas show promise as a method to improve performance in mercury cadmium telluride detectors. Furthermore, these structures could be coupled with other components to lead to direct rectification of infrared radiation. This possibility leads to a new method for infrared detection and energy harvesting of infrared radiation.

More Details

Molecular fountain

Strecker, Kevin S.; Chandler, David

A molecular fountain directs slowly moving molecules against gravity to further slow them to translational energies at which they can be trapped and studied. If the molecules are initially slow enough, they will return some time later to the position from which they were launched. Because this round-trip time can be on the order of a second, a single molecule can be observed for times sufficient to perform Hz-level spectroscopy. The goal of this LDRD proposal was to construct a novel Molecular Fountain apparatus capable of producing dilute samples of molecules at near zero temperatures in well-defined, user-selectable quantum states. The slowly moving molecules used in this research are produced by the previously developed Kinematic Cooling technique, which uses a crossed atomic and molecular beam apparatus to generate single rotational level molecular samples moving slowly in the laboratory reference frame. The Kinematic Cooling technique produces cold molecules from a supersonic molecular beam via single collisions with a supersonic atomic beam. A single collision of an atom with a molecule occurring at the correct energy and relative velocity can cause a small fraction of the molecules to move very slowly vertically against gravity in the laboratory. These slowly moving molecules are captured by an electrostatic hexapole guiding field that both orients and focuses the molecules. The molecules are focused into the ionization region of a time-of-flight mass spectrometer and are ionized by laser radiation. The new molecular fountain apparatus was built using a new molecular beam apparatus design that allowed us to miniaturize the apparatus. This new design minimizes the volume and surface area of the machine, allowing smaller pumps to maintain the background pressures needed for these experiments.

More Details

Final LDRD report : enhanced spontaneous emission rate in visible III-nitride LEDs using 3D photonic crystal cavities

Fischer, Arthur J.; Subramania, Ganapathi S.; Lee, Yun-Ju; Koleske, Daniel; Li, Qiming L.; Wang, George T.; Luk, Ting S.; Fullmer, Kristine W.

The fundamental spontaneous emission rate for a photon source can be modified by placing the emitter inside a periodic dielectric structure allowing the emission to be dramatically enhanced or suppressed depending on the intended application. We have investigated the relatively unexplored realm of interaction between semiconductor emitters and three dimensional photonic crystals in the visible spectrum. Although this interaction has been investigated at longer wavelengths, very little work has been done in the visible spectrum. During the course of this LDRD, we have fabricated TiO{sub 2} logpile photonic crystal structures with the shortest wavelength band gap ever demonstrated. A variety of different emitters with emission between 365 nm and 700 nm were incorporated into photonic crystal structures. Time-integrated and time-resolved photoluminescence measurements were performed to measure changes to the spontaneous emission rate. Both enhanced and suppressed emission were demonstrated and attributed to changes to the photonic density of states.

More Details

Scalable analysis tools for sensitivity analysis and UQ (3160) results

Ice, Lisa; Fabian, Nathan; Moreland, Kenneth D.; Bennett, Janine C.; Thompson, David; Karelitz, David B.

The 9/30/2009 ASC Level 2 Scalable Analysis Tools for Sensitivity Analysis and UQ (Milestone 3160) contains feature recognition capability required by the user community for certain verification and validation tasks focused around sensitivity analysis and uncertainty quantification (UQ). These feature recognition capabilities include crater detection, characterization, and analysis from CTH simulation data; the ability to call fragment and crater identification code from within a CTH simulation; and the ability to output fragments in a geometric format that includes data values over the fragments. The feature recognition capabilities were tested extensively on sample and actual simulations. In addition, a number of stretch criteria were met including the ability to visualize CTH tracer particles and the ability to visualize output from within an S3D simulation.

More Details

Advanced fuel chemistry for advanced engines

Taatjes, Craig A.; Miller, James A.; Fernandes, Ravi X.; Zador, Judit; Jusinski, Leonard E.

Autoignition chemistry is central to predictive modeling of many advanced engine designs that combine high efficiency and low inherent pollutant emissions. This chemistry, and especially its pressure dependence, is poorly known for fuels derived from heavy petroleum and for biofuels, both of which are becoming increasingly prominent in the nation's fuel stream. We have investigated the pressure dependence of key ignition reactions for a series of molecules representative of non-traditional and alternative fuels. These investigations combined experimental characterization of hydroxyl radical production in well-controlled photolytically initiated oxidation with a hybrid modeling strategy that linked detailed quantum chemistry and computational kinetics of critical reactions with rate-equation models of the global chemical system. Comprehensive mechanisms for autoignition generally ignore the pressure dependence of branching fractions in the important alkyl + O{sub 2} reaction systems; however, we have demonstrated that pressure-dependent 'formally direct' pathways persist at in-cylinder pressures.
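
For orientation only (a textbook falloff form, not the potential-energy surfaces or rate calculations of this project), the pressure dependence of an elementary rate coefficient is often summarized by the Lindemann-Hinshelwood expression, which interpolates between the low-pressure limit k_0[M] and the high-pressure limit k_infinity:

    k(T,[M]) = k_\infty \, \frac{k_0 [M]/k_\infty}{1 + k_0 [M]/k_\infty}

The 'formally direct' pathways noted above are chemically activated channels that bypass the stabilized intermediate entirely, so they are not captured by such single-well forms, which is one reason the coupled quantum-chemistry and computational-kinetics treatment described here matters at in-cylinder pressures.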

More Details

Approaches for scalable modeling and emulation of cyber systems : LDRD final report

Mayo, Jackson R.; Minnich, Ronald G.; Rudish, Donald W.; Armstrong, Robert C.

The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of {approx} 10{sup 6} computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with {approx} 10{sup 6} virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

More Details

Practical reliability and uncertainty quantification in complex systems : final report

Grace, Matthew D.; Red-Horse, John R.; Pebay, Philippe P.; Ringland, James T.; Zurn, Rena M.; Diegert, Kathleen V.

The purpose of this project was to investigate the use of Bayesian methods for the estimation of the reliability of complex systems. The goals were to find methods for dealing with continuous data, rather than simple pass/fail data; to avoid assumptions of specific probability distributions, especially Gaussian, or normal, distributions; to compute not only an estimate of the reliability of the system, but also a measure of the confidence in that estimate; to develop procedures to address time-dependent or aging aspects in such systems; and to use these models and results to derive optimal testing strategies. The system is assumed to be a system of systems, i.e., a system with discrete components that are themselves systems. Furthermore, the system is 'engineered' in the sense that each node is designed to do something and that we have a mathematical description of that process. In the time-dependent case, the assumption is that we have a general, nonlinear, time-dependent function describing the process. The major results of the project are described in this report. In summary, we developed a sophisticated mathematical framework based on modern probability theory and Bayesian analysis. This framework encompasses all aspects of epistemic uncertainty and easily incorporates steady-state and time-dependent systems. Based on Markov chain Monte Carlo methods, we devised a computational strategy for general probability density estimation in the steady-state case. This enabled us to compute a distribution of the reliability from which many questions, including confidence, could be addressed. We then extended this to the time domain and implemented procedures to estimate the reliability over time, including the use of the method to predict the reliability at a future time. Finally, we used certain aspects of Bayesian decision analysis to create a novel method for determining an optimal testing strategy, e.g., we can estimate the 'best' location to take the next test to minimize the risk of making a wrong decision about the fitness of a system. We conclude this report by proposing additional fruitful areas of research.
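
Schematically (generic notation chosen here for illustration, not the report's formalism), the framework rests on Bayes' rule: for model parameters \theta describing the engineered nodes and continuous test data D,

    p(\theta \mid D) \propto p(D \mid \theta)\, p(\theta), \qquad R(\theta) = \Pr(\text{system functions} \mid \theta),

so the posterior over \theta induces a full distribution over the reliability R(\theta) rather than a single point estimate, and confidence statements follow from that distribution; the Markov chain Monte Carlo machinery mentioned above supplies posterior samples without assuming a Gaussian or other specific parametric form.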

More Details

Vibrational spectra of nanowires measured using laser doppler vibrometry and STM studies of epitaxial graphene : an LDRD fellowship report

Biedermann, Laura B.

A few of the many applications for nanowires are high-aspect ratio conductive atomic force microscope (AFM) cantilever tips, force and mass sensors, and high-frequency resonators. Reliable estimates for the elastic modulus of nanowires and the quality factor of their oscillations are of interest to help enable these applications. Furthermore, a real-time, non-destructive technique to measure the vibrational spectra of nanowires will help enable sensor applications based on nanowires and the use of nanowires as AFM cantilevers (rather than as tips for AFM cantilevers). Laser Doppler vibrometry is used to measure the vibration spectra of individual cantilevered nanowires, specifically multiwalled carbon nanotubes (MWNTs) and silver gallium nanoneedles. Since the entire vibration spectrum is measured with high frequency resolution (100 Hz for a 10 MHz frequency scan), the resonant frequencies and quality factors of the nanowires are accurately determined. Using Euler-Bernoulli beam theory, the elastic modulus and spring constant can be calculated from the resonance frequencies of the oscillation spectrum and the dimensions of the nanowires, which are obtained from parallel SEM studies. Because the diameters of the nanowires studied are smaller than the wavelength of the vibrometer's laser, Mie scattering is used to estimate the lower diameter limit for nanowires whose vibration can be measured in this way. The techniques developed in this thesis can be used to measure the vibrational spectra of any suspended nanowire with high frequency resolution. Two different nanowires were measured: MWNTs and Ag{sub 2}Ga nanoneedles. Measurements of the thermal vibration spectra of MWNTs under ambient conditions showed that the elastic modulus, E, of plasma-enhanced chemical vapor deposition (PECVD) MWNTs is 37 {+-} 26 GPa, well within the range of E previously reported for CVD-grown MWNTs. Since the Ag{sub 2}Ga nanoneedles have a greater optical scattering efficiency than MWNTs, their vibration spectra were more extensively studied. The thermal vibration spectra of Ag{sub 2}Ga nanoneedles were measured under both ambient and low-vacuum conditions. The operational deflection shapes of the vibrating Ag{sub 2}Ga nanoneedles were also measured, allowing confirmation of the eigenmodes of vibration. The modulus of the crystalline nanoneedles was 84.3 {+-} 1.0 GPa. Gas damping is the dominant mechanism of energy loss for nanowires oscillating under ambient conditions. The measured quality factors, Q, of oscillation are in line with theoretical predictions of air damping in the free molecular gas damping regime. In the free molecular regime, Q{sub gas} is linearly proportional to the density and diameter of the nanowire and inversely proportional to the air pressure. Since the density of the Ag{sub 2}Ga nanoneedles is three times that of the MWNTs, the Ag{sub 2}Ga nanoneedles have greater Q at atmospheric pressures. Our initial measurements of Q for Ag{sub 2}Ga nanoneedles in low-vacuum (10 Torr) suggest that the intrinsic Q of these nanoneedles may be on the order of 1000. The epitaxial carbon that grows after heating (000{bar 1}) silicon carbide (SiC) to high temperatures (1450-1600) in vacuum was also studied. At these high temperatures, the surface Si atoms sublime and the remaining C atoms reconstruct to form graphene. X-ray photoelectron spectroscopy (XPS) and scanning tunneling microscopy (STM) were used to characterize the quality of the few-layer graphene (FLG) surface.
The XPS studies were useful in confirming the graphitic composition and measuring the thickness of the FLG samples. STM studies revealed a wide variety of nanometer-scale features that include sharp carbon-rich ridges, moire superlattices, one-dimensional line defects, and grain boundaries. By imaging these features with atomic scale resolution, considerable insight into the growth mechanisms of FLG on the carbon-face of SiC is obtained.
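
For reference, the standard Euler-Bernoulli relations used in this kind of analysis (written here for a uniform cylindrical cantilever of length L, diameter d, and density \rho, with dimensions taken from the SEM measurements described above; notation chosen for illustration) are

    f_n = \frac{\beta_n^2}{2\pi L^2}\sqrt{\frac{EI}{\rho A}} = \frac{\beta_n^2\, d}{8\pi L^2}\sqrt{\frac{E}{\rho}}, \qquad \beta_1 \approx 1.875, \quad I = \frac{\pi d^4}{64}, \quad A = \frac{\pi d^2}{4},

so the elastic modulus E follows from a measured resonance frequency and the wire dimensions; and in the free molecular regime the gas-limited quality factor scales as Q{sub gas} \propto \rho d / P, which is why the denser Ag{sub 2}Ga nanoneedles show higher Q than the MWNTs at atmospheric pressure.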

More Details

A fully implicit method for 3D quasi-steady state magnetic advection-diffusion

Siefert, Christopher; Robinson, Allen C.

We describe the implementation of a prototype fully implicit method for solving three-dimensional quasi-steady state magnetic advection-diffusion problems. This method allows us to solve the magnetic advection-diffusion equations in an Eulerian frame with a fixed, user-prescribed velocity field. We have verified the correctness of the method and implementation on two standard verification problems, the Solberg-White magnetic shear problem and the Perry-Jones-White rotating cylinder problem.
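
For orientation (the standard resistive-MHD induction form consistent with the Eulerian, prescribed-velocity setting described above), the equations being advanced are

    \frac{\partial \mathbf{B}}{\partial t} = \nabla \times (\mathbf{u} \times \mathbf{B}) - \nabla \times (\eta\, \nabla \times \mathbf{B}), \qquad \nabla \cdot \mathbf{B} = 0,

with the quasi-steady state reached when the time-derivative term is negligible, so that advection balances resistive diffusion, \nabla \times (\mathbf{u} \times \mathbf{B}) \approx \nabla \times (\eta\, \nabla \times \mathbf{B}); a fully implicit discretization avoids the severe explicit stability limit that the diffusion term would otherwise impose.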

More Details

Parallelism of the SANDstorm hash algorithm

Schroeppel, Richard C.; Torgerson, Mark D.

Mainstream cryptographic hashing algorithms are not parallelizable. This limits their speed, prevents them from taking advantage of the current trend toward multi-core platforms, and in turn limits their usefulness as an authentication mechanism in secure communications. Sandia researchers have created a new cryptographic hashing algorithm, SANDstorm, which was specifically designed to take advantage of multi-core processing and be parallelizable on a wide range of platforms. This report describes a late-start LDRD effort to verify the parallelizability claims of the SANDstorm designers. We have shown, with operating code and bench testing, that the SANDstorm algorithm may be trivially parallelized on a wide range of hardware platforms. Implementations using OpenMP demonstrate a linear speedup with multiple cores. We have also shown significant performance gains with optimized C code and the use of assembly instructions to exploit particular platform capabilities.
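
As a generic illustration of the claim (independent message blocks hashed concurrently and then combined; the toy block hash and combining step below are placeholders, not SANDstorm or its actual combining mode), an OpenMP loop of the following shape is what produces near-linear scaling with cores:

    #include <cstdint>
    #include <iostream>
    #include <vector>

    // Placeholder per-block hash (NOT SANDstorm): a toy 64-bit mix used only
    // to show the parallel structure of hashing independent blocks.
    static std::uint64_t toyBlockHash(const std::vector<std::uint8_t>& block)
    {
      std::uint64_t h = 0x9e3779b97f4a7c15ULL;
      for (std::uint8_t b : block) {
        h ^= b;
        h *= 0x100000001b3ULL;
      }
      return h;
    }

    int main()
    {
      // Hypothetical message split into fixed-size blocks.
      const std::size_t nBlocks = 1 << 12, blockSize = 4096;
      std::vector<std::vector<std::uint8_t>> blocks(
          nBlocks, std::vector<std::uint8_t>(blockSize, 0xab));
      std::vector<std::uint64_t> digests(nBlocks);

      // Each block is independent, so the loop parallelizes trivially.
      #pragma omp parallel for schedule(static)
      for (long i = 0; i < static_cast<long>(nBlocks); ++i)
        digests[i] = toyBlockHash(blocks[i]);

      // A real tree-mode hash would combine per-block digests hierarchically;
      // here we just fold them together to produce a single value.
      std::uint64_t combined = 0;
      for (std::uint64_t d : digests) combined = combined * 1099511628211ULL ^ d;
      std::cout << std::hex << combined << "\n";
      return 0;
    }

Compile with OpenMP enabled (e.g., -fopenmp) to activate the parallel loop; without it the same code runs serially.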

More Details

Nanoengineering for solid-state lighting

Crawford, Mary H.; Fischer, Arthur J.; Koleske, Daniel; Lee, Stephen R.; Missert, Nancy

This report summarizes results from a 3-year Laboratory Directed Research and Development project performed in collaboration with researchers at Rensselaer Polytechnic Institute. Our collaborative effort was supported by Sandia's National Institute for Nanoengineering and focused on the study and application of nanoscience and nanoengineering concepts to improve the efficiency of semiconductor light-emitting diodes for solid-state lighting applications. The project explored LED efficiency advances with two primary thrusts: (1) the study of nanoscale InGaN materials properties, particularly nanoscale crystalline defects, and their impact on internal quantum efficiency, and (2) nanoscale engineering of dielectric and metal materials and integration with LED heterostructures for enhanced light extraction efficiency.

More Details

Low impedance z-pinch drivers without post-hole convolute current adders

Savage, Mark E.

Present-day pulsed-power systems operating in the terawatt regime typically use post-hole convolute current adders to operate at sufficiently low impedance. These adders necessarily involve magnetic nulls that connect the positive and negative electrodes. The resultant loss of magnetic insulation results in electron losses in the vicinity of the nulls that can severely limit the efficiency of the delivery of the system's energy to a load. In this report, we describe an alternate transformer-based approach to obtaining low impedance. The transformer consists of coils whose windings are in parallel rather than in series, and does not suffer from the presence of magnetic nulls. By varying the pitch of the coil windings, the current multiplication ratio can be varied, leading to a more versatile driver. The coupling efficiency of the transformer, its behavior in the presence of electron flow, and its mechanical strength are issues that need to be addressed to evaluate the potential of transformer-based current multiplication as a viable alternative to conventional current adder technology.

More Details

Low dislocation GaN via defect-filtering, self-assembled SiO2-sphere layers

Wang, George T.; Li, Qiming L.

The III-nitride (AlGaInN) materials system forms the foundation for white solid-state lighting, the adoption of which could significantly reduce U.S. energy needs. While the growth of GaN-based devices relies on heteroepitaxy on foreign substrates, the heteroepitaxial layers possess a high density of dislocations due to poor lattice and thermal expansion match. These high dislocation densities have been correlated with reduced internal quantum efficiency and lifetimes for GaN-based LEDs. Here, we demonstrate an inexpensive method for dislocation reduction in GaN grown on sapphire and silicon substrates. In this technique, which requires no lithographic patterning, GaN is selectively grown through self-assembled layers of silica microspheres that act to filter out dislocations. Using this method, the threading dislocation density for GaN on sapphire was reduced from 3.3 x 10{sup 9} cm{sup -2} to 4.0 x 10{sup 7} cm{sup -2}, and from the 10{sup 10} cm{sup -2} range to {approx}6.0 x 10{sup 7} cm{sup -2} for GaN on Si(111). This large reduction in dislocation density is attributed to dislocation blocking and bending by the unique interface between GaN and the silica microspheres.

More Details

Nano-engineering by optically directed self-assembly

Grillet, Anne M.; Koehler, Timothy P.; Brotherton, Christopher M.; Bell, Nelson S.; Gorby, Allen D.; Reichert, Matthew D.; Brinker, C.J.; Bogart, Katherine H.A.

A lack of robust manufacturing capabilities has limited our ability to make tailored materials with useful optical and thermal properties. For example, traditional methods such as spontaneous self-assembly of spheres cannot generate the complex structures required to produce full-bandgap photonic crystals. The goal of this work was to develop and demonstrate novel methods of directed self-assembly of nanomaterials using optical and electric fields. To achieve this aim, our work employed laser tweezers, a technology that enables non-invasive optical manipulation of particles, from glass microspheres to gold nanoparticles. Laser tweezers were used to create ordered materials with complex crystal structures or from aspherical building blocks.

More Details

Quantifying uncertainty from material inhomogeneity

Battaile, Corbett C.; Brewer, Luke N.; Emery, John M.; Boyce, Brad L.

Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 {micro}m in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 C, 600 C, and 800 C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the deformation behavior was found to depend strongly on the character of the nearby microstructure.

More Details

Antibacterial polymer coatings

Hibbs, Michael; Allen, Ashley N.; Wilson, Mollye; Tucker, Mark D.

A series of poly(sulfone)s with quaternary ammonium groups and another series with aldehyde groups are synthesized and tested for biocidal activity against vegetative bacteria and spores, respectively. The polymers are sprayed onto substrates as coatings which are then exposed to aqueous suspensions of organisms. The coatings are inherently biocidal and do not release any agents into the environment. The coatings adhere well to both glass and CARC-coated coupons and they exhibit significant biotoxicity. The most effective quaternary ammonium polymer kills 99.9% of both gram-negative and gram-positive bacteria, and the best aldehyde coating kills 81% of the spores on its surface.

More Details

Density-functional-theory results for Ga and As vacancies in GaAs obtained using the Socorro code

Wright, Alan F.

The Socorro code has been used to obtain density-functional theory results for the Ga vacancy (V{sub Ga}) and the As vacancy (V{sub As}) in GaAs. Calculations were performed in a nominal 216-atom simulation cell using the local-density approximation for exchange and correlation. The results from these calculations include: (1) the charge states and atomic configurations of stable and metastable states, (2) energy levels in the gap, and (3) activation energies for migration. Seven charge states were found for the Ga vacancy (-3, -2, -1, 0, +1, +2, +3). The stable structures of the -3, -2, -1, and 0 charge states consist of an empty Ga site with four As neighbors displaying T{sub d} symmetry. The stable structures of the +1, +2, and +3 charge states consist of an As antisite next to an As vacancy; As{sub Ga}-V{sub As}. Five charge states were found for the As vacancy (-3, -2, -1, 0, +1). The stable structures of the -1, 0, and +1 charge states consist of an empty As site with four Ga neighbors displaying C{sub 2v} symmetry. The stable structures of the -3 and -2 charge states consist of a Ga antisite next to a Ga vacancy; Ga{sub As}-V{sub Ga}. The energy levels of V{sub Ga} lie below mid-gap while the energy levels of As{sub Ga}-V{sub As} lie above and below mid-gap. All but one of the V{sub As} energy levels lie above mid-gap while the As{sub Ga}-V{sub As} energy level lies below mid-gap. The migration activation energies of the defect states were all found to be larger than 1.35 eV.

More Details

Simultaneous electronic and lattice characterization using coupled femtosecond spectroscopic techniques

Serrano, Justin R.; Hopkins, Patrick E.

High-power electronics are central in the development of radar, solid-state lighting, and laser systems. Large powers, however, necessitate improved heat dissipation as heightened temperatures deleteriously affect both performance and reliability. Heat dissipation, in turn, is determined by the cascade of energy from the electronic to lattice system. Full characterization of the transport then requires analysis of each. In response, this four-month late start effort has developed a transient thermoreflectance (TTR) capability that probes the thermal response of electronic carriers with 100 fs resolution. Simultaneous characterization of the lattice carriers with this electronic assessment was then investigated by equipping the optical arrangement to acquire a Raman signal from radiation discarded during the TTR experiment. Initial results show only tentative acquisition of a Raman response at these timescales. Using simulations of the response, challenges responsible for these difficulties are then examined and indicate that with outlined refinements simultaneous acquisition of TTR/Raman signals remains attainable in the near term.

More Details

Stress-induced chemical detection using flexible metal-organic frameworks

Allendorf, Mark; Houk, Ronald H.

In this work we demonstrate the concept of stress-induced chemical detection using metal-organic frameworks (MOFs) by integrating a thin film of the MOF HKUST-1 with a microcantilever surface. The results show that the energy of molecular adsorption, which causes slight distortions in the MOF crystal structure, can be efficiently converted to mechanical energy to create a highly responsive, reversible, and selective sensor. This sensor responds to water, methanol, and ethanol vapors, but yields no response to either N{sub 2} or O{sub 2}. The magnitude of the signal, which is measured by a built-in piezoresistor, is correlated with the concentration and can be fitted to a Langmuir isotherm. Furthermore, we show that the hydration state of the MOF layer can be used to impart selectivity for CO{sub 2}. We also report the first use of surface-enhanced Raman spectroscopy to characterize the structure of a MOF film. We conclude that the synthetic versatility of these nanoporous materials holds great promise for creating recognition chemistries to enable selective detection of a wide range of analytes. A force field model is described that successfully predicts changes in MOF properties and the uptake of gases. This model is used to predict adsorption isotherms for a number of representative compounds, including explosives, nerve agents, volatile organic compounds, and polyaromatic hydrocarbons. The results show that, as a result of relatively large heats of adsorption (> 20 kcal mol{sup -1}) in most cases, we expect an onset of adsorption by MOFs at pressures as low as 10{sup -6} kPa, suggesting the potential to detect compounds such as RDX at levels as low as 10 ppb at atmospheric pressure.
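
For reference, the Langmuir fit mentioned above takes the form

    S(P) = S_{\max}\, \frac{K P}{1 + K P},

with the signal S measured by the piezoresistor, P the analyte concentration or partial pressure, and S_max and K treated as fit parameters (notation chosen here for illustration); the response is linear at low concentration and saturates as adsorption sites fill.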

More Details

Intelligent front-end sample preparation tool using acoustic streaming

Vreeland, Erika; Smith, Gennifer; Edwards, Thayne L.; James, Conrad D.; Mcclain, Jaime; Murton, Jaclyn K.; Kotulski, Joseph D.; Clem, Paul

We have successfully developed a nucleic acid extraction system based on a microacoustic lysis array coupled to an integrated nucleic acid extraction system all on a single cartridge. The microacoustic lysing array is based on 36{sup o} Y cut lithium niobate, which couples bulk acoustic waves (BAW) into the microchannels. The microchannels were fabricated using Mylar laminates and fused silica to form acoustic-fluidic interface cartridges. The transducer array consists of four active elements directed for cell lysis and one optional BAW element for mixing on the cartridge. The lysis system was modeled using one dimensional (1D) transmission line and two dimensional (2D) FEM models. For input powers required to lyse cells, the flow rate dictated the temperature change across the lysing region. From the computational models, a flow rate of 10 {micro}L/min produced a temperature rise of 23.2 C and only 6.7 C when flowing at 60 {micro}L/min. The measured temperature changes were 5 C less than the model. The computational models also permitted optimization of the acoustic coupling to the microchannel region and revealed the potential impact of thermal effects if not controlled. Using E. coli, we achieved a lysing efficacy of 49.9 {+-} 29.92 % based on a cell viability assay with a 757.2 % increase in ATP release within 20 seconds of acoustic exposure. A bench-top lysing system required 15-20 minutes operating up to 58 Watts to achieve the same level of cell lysis. We demonstrate that active mixing on the cartridge was critical to maximize binding and release of nucleic acid to the magnetic beads. Using a sol-gel silica bead matrix filled microchannel the extraction efficacy was 40%. The cartridge based magnetic bead system had an extraction efficiency of 19.2%. For an electric field based method that used Nafion films, a nucleic acid extraction efficiency of 66.3 % was achieved at 6 volts DC. For the flow rates we tested (10-50 {micro}L/min), the nucleic acid extraction time was 5-10 minutes for a volume of 50 {micro}L. Moreover, a unique feature of this technology is the ability to replace the cartridges for subsequent nucleic acid extractions.

More Details

Feasibility of neuro-morphic computing to emulate error-conflict based decision making

James, Conrad D.

A key aspect of decision making is determining when errors or conflicts exist in information and knowing whether to continue or terminate an action. Understanding error-conflict processing is crucial in order to emulate higher brain functions in hardware and software systems. Specific brain regions, most notably the anterior cingulate cortex (ACC), are known to respond to the presence of conflicts in information by assigning a value to an action. Essentially, this conflict signal triggers strategic adjustments in cognitive control, which serve to prevent further conflict. The most probable mechanism is that the ACC reports and discriminates different types of feedback, both positive and negative, that relate to different adaptations. Unique cells called spindle neurons that are primarily found in the ACC (layer Vb) are known to be responsible for cognitive dissonance (disambiguation between alternatives). Thus, the ACC, through a specific set of cells, likely plays a central role in the ability of humans to make difficult decisions and solve challenging problems in the midst of conflicting information. In addition to dealing with cognitive dissonance, decision making in high consequence scenarios also relies on the integration of multiple sets of information (sensory, reward, emotion, etc.). Thus, a second area of interest for this proposal lies in the corticostriatal networks that serve as an integration region for multiple cognitive inputs. In order to engineer neurological decision making processes in silicon devices, we will determine the key cells, inputs, and outputs of conflict/error detection in the ACC region. The second goal is to understand in vitro models of corticostriatal networks and the impact of physical deficits on decision making, specifically in stressful scenarios with conflicting streams of data from multiple inputs. We will elucidate the mechanisms of cognitive data integration in order to implement a future corticostriatal-like network in silicon devices for improved decision processing.

More Details

Nanomechanics of hard films on compliant substrates

Moody, Neville R.; Reedy, Earl D.; Corona, Edmundo; Adams, David P.; Zhou, Xiaowang

Development of flexible thin film systems for biomedical, homeland security and environmental sensing applications has increased dramatically in recent years [1,2,3,4]. These systems typically combine traditional semiconductor technology with new flexible substrates, allowing for both the high electron mobility of semiconductors and the flexibility of polymers. The devices have the ability to be easily integrated into components and show promise for advanced design concepts, ranging from innovative microelectronics to MEMS and NEMS devices. These devices often contain layers of thin polymer, ceramic and metallic films where differing properties can lead to large residual stresses [5]. As long as the films remain substrate-bonded, they may deform far beyond their freestanding counterpart. Once debonded, substrate constraint disappears leading to film failure where compressive stresses can lead to wrinkling, delamination, and buckling [6,7,8] while tensile stresses can lead to film fracture and decohesion [9,10,11]. In all cases, performance depends on film adhesion. Experimentally it is difficult to measure adhesion. It is often studied using tape [12], pull off [13,14,15], and peel tests [16,17]. More recent techniques for measuring adhesion include scratch testing [18,19,20,21], four point bending [22,23,24], indentation [25,26,27], spontaneous blisters [28,29] and stressed overlayers [7,26,30,31,32,33]. Nevertheless, sample design and test techniques must be tailored for each system. There is a large body of elastic thin film fracture and elastic contact mechanics solutions for elastic films on rigid substrates in the published literature [5,7,34,35,36]. More recent work has extended these solutions to films on compliant substrates and shows that increasing compliance markedly changes fracture energies compared with rigid elastic solution results [37,38]. However, the introduction of inelastic substrate response significantly complicates the problem [10,39,40]. As a result, our understanding of the critical relationship between adhesion, properties, and fracture for hard films on compliant substrates is limited. To address this issue, we integrated nanomechanical testing and mechanics-based modeling in a program to define the critical relationship between deformation and fracture of nanoscale films on compliant substrates. The approach involved designing model film systems and employing nano-scale experimental characterization techniques to isolate effects of compliance, viscoelasticity, and plasticity on deformation and fracture of thin hard films on substrates that spanned more than two orders of magnitude in compliance, exhibited different interface structures, had different adhesion strengths, and functioned differently under stress. The results of this work are described in six chapters. Chapter 1 provides the motivation for this work. Chapter 2 presents experimental results covering film system design, sample preparation, indentation response, and fracture, including discussion of the effects of substrate compliance on fracture energies and buckle formation from existing models. Chapter 3 describes the use of analytical and finite element simulations to define the role of substrate compliance and film geometry on the indentation response of thin hard films on compliant substrates. Chapter 4 describes the development and application of cohesive zone model based finite element simulations to determine how substrate compliance affects debond growth.
Chapter 5 describes the use of molecular dynamics simulations to define the effects of substrate compliance on interfacial fracture of thin hard tungsten films on silicon substrates. Chapter 6 describes the Workshops sponsored through this program to advance understanding of material and system behavior.

More Details

Highly scalable linear solvers on thousands of processors

Siefert, Christopher; Tuminaro, Raymond S.; Domino, Stefan P.; Robinson, Allen C.

In this report we summarize research into new parallel algebraic multigrid (AMG) methods. We first provide an introduction to parallel AMG. We then discuss our research in parallel AMG algorithms for very large scale platforms. We detail significant improvements to a matrix-matrix multiplication kernel in the AMG setup phase. We present a smoothed aggregation AMG algorithm with fewer communication synchronization points, and discuss its links to domain decomposition methods. Finally, we discuss a multigrid smoothing technique that utilizes two message passing layers for use on multicore processors.
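
As a compact statement of the method named above (standard smoothed aggregation notation, not code or results from this report), the prolongator is built by applying one damped-Jacobi sweep to a piecewise-constant tentative prolongator assembled from the aggregates:

    P = \left(I - \omega D^{-1} A\right) P_{\mathrm{tent}}, \qquad A_c = P^{\mathsf{T}} A P, \qquad D = \mathrm{diag}(A),

and the Galerkin triple product A_c = P^T A P is typically the dominant matrix-matrix multiplication in the setup phase, i.e., the kind of kernel whose parallel performance the improvements above target.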

More Details

Neural assembly models derived through nano-scale measurements

Fan, Hongyou; Forsythe, James C.; Branda, Catherine; Warrender, Christina E.; Schiek, Richard

This report summarizes accomplishments of a three-year project focused on developing technical capabilities for measuring and modeling neuronal processes at the nanoscale. It was successfully demonstrated that nanoprobes could be engineered that were biocompatible, could be biofunctionalized, and responded within the range of voltages typically associated with a neuronal action potential. Furthermore, the Xyce parallel circuit simulator was employed, and models were incorporated for simulating the ion channel and cable properties of neuronal membranes. The ultimate objective of the project had been to employ nanoprobes in vivo, with the nematode C. elegans, and derive a simulation based on the resulting data. Techniques were developed allowing the nanoprobes to be injected into the nematode and the neuronal response recorded. To the authors' knowledge, this is the first occasion in which nanoparticles have been successfully employed as probes for recording neuronal response in an in vivo animal experimental protocol.

More Details

Automated Monte Carlo biasing for photon-generated electrons near surfaces

Franke, Brian C.; Kensek, Ronald P.

This report describes efforts to automate the biasing of coupled electron-photon Monte Carlo particle transport calculations. The approach was based on weight-windows biasing. Weight-window settings were determined using adjoint-flux Monte Carlo calculations. A variety of algorithms were investigated for adaptivity of the Monte Carlo tallies. Tree data structures were used to investigate spatial partitioning. Functional-expansion tallies were used to investigate higher-order spatial representations.
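
As an illustrative sketch of the weight-window mechanics that the automation tunes (generic splitting and Russian-roulette logic with hypothetical parameters, not the code evaluated in this report), each particle's weight is compared against the window assigned to its phase-space cell:

    #include <random>
    #include <vector>

    struct Particle { double weight; /* position, direction, energy omitted */ };

    // Apply a weight window [wLo, wHi] with survival weight wSurv (wLo < wSurv < wHi).
    // Heavy particles are split; light particles play Russian roulette.
    // Generic illustration only, not this report's implementation.
    std::vector<Particle> applyWeightWindow(const Particle& p, double wLo,
                                            double wHi, double wSurv,
                                            std::mt19937& rng)
    {
      std::uniform_real_distribution<double> u(0.0, 1.0);
      std::vector<Particle> out;

      if (p.weight > wHi) {
        // Split into n copies of equal weight (weight is conserved exactly).
        const int n = static_cast<int>(p.weight / wHi) + 1;
        for (int i = 0; i < n; ++i) out.push_back({p.weight / n});
      } else if (p.weight < wLo) {
        // Russian roulette: survive with probability weight / wSurv
        // (weight is conserved in expectation).
        if (u(rng) < p.weight / wSurv) out.push_back({wSurv});
      } else {
        out.push_back(p);  // inside the window: unchanged
      }
      return out;
    }

    int main()
    {
      std::mt19937 rng(12345);
      const Particle p{3.7};
      const auto daughters = applyWeightWindow(p, 0.5, 2.0, 1.0, rng);
      return daughters.empty() ? 1 : 0;
    }

In practice the window bounds are typically set inversely proportional to the adjoint (importance) flux, which is the role of the adjoint-flux Monte Carlo calculations mentioned above.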

More Details

Building more powerful less expensive supercomputers using Processing-In-Memory (PIM) LDRD final report

Murphy, Richard C.

This report details the accomplishments of the 'Building More Powerful Less Expensive Supercomputers Using Processing-In-Memory (PIM)' LDRD ('PIM LDRD', number 105809) for FY07-FY09. Latency dominates all levels of supercomputer design. Within a node, increasing memory latency, relative to processor cycle time, limits CPU performance. Between nodes, the same increase in relative latency impacts scalability. Processing-In-Memory (PIM) is an architecture that directly addresses this problem using enhanced chip fabrication technology and machine organization. PIMs combine high-speed logic and dense, low-latency, high-bandwidth DRAM, and lightweight threads that tolerate latency by performing useful work during memory transactions. This work examines the potential of PIM-based architectures to support mission critical Sandia applications and an emerging class of more data intensive informatics applications. This work has resulted in a stronger architecture/implementation collaboration between 1400 and 1700. Additionally, key technology components have impacted vendor roadmaps, and we are in the process of pursuing these new collaborations. This work has the potential to impact future supercomputer design and construction, reducing power and increasing performance. This final report is organized as follows: this summary chapter discusses the impact of the project (Section 1), provides an enumeration of publications and other public discussion of the work (Section 1), and concludes with a discussion of future work and impact from the project (Section 1). The appendix contains reprints of the refereed publications resulting from this work.

More Details

Richtmyer-Meshkov instability on a low atwood number interface after reshock

Weber, Chris

The Richtmyer-Meshkov instability after reshock is investigated in shock tube experiments at the Wisconsin Shock Tube Laboratory using planar laser imaging and a new high speed interface tracking technique. The interface is a 50-50% volume fraction mixture of helium and argon stratified over pure argon. This interface has an Atwood number of 0.29 and a near-single-mode, two-dimensional, standing-wave perturbation with an average amplitude of 0.35 cm and a wavelength of 19.4 cm. The incident shock wave of Mach number 1.92 accelerates the interface before it is reshocked by a reflected Mach 1.70 shock wave. The amplitude growth after reshock is reported for variations in this initial amplitude, and several amplitude growth rate models are compared to the experimental growth rate after reshock. A new growth model is introduced, based on a model of circulation deposition calculated from one-dimensional gas dynamics parameters. This model is shown to compare well with the amplitude growth rate after reshock and the circulation over a half-wavelength of the interface after the first shock wave and after reshock.
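
The quoted Atwood number follows directly from the gas molar masses (ideal-gas densities at a common temperature and pressure are proportional to molar mass): the 50-50% He-Ar mixture has a mean molar mass of (4 + 40)/2 = 22 g/mol against 40 g/mol for pure argon, so

    A = \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1} = \frac{40 - 22}{40 + 22} \approx 0.29.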

More Details

Plasmonic enhanced ultrafast switch

Shaner, Eric A.; Passmore, Brandon S.; Barrick, T.A.; Subramania, Ganapathi S.; Reno, John L.

Ultrafast electronic switches fabricated from defective material have been used for several decades in order to produce picosecond electrical transients and terahertz radiation. Due to the ultrashort recombination time in the photoconductor materials used, these switches are inefficient and are ultimately limited by the amount of optical power that can be applied to the switch before self-destruction. The goal of this work is to create ultrafast (sub-picosecond response) photoconductive switches on GaAs that are enhanced through plasmonic coupling structures. Here, the plasmonic coupler primarily acts as a radiation condenser, causing carriers to be generated adjacent to metallic electrodes where they can be collected more efficiently.

More Details

HOPSPACK 2.0 user manual

Plantenga, Todd

HOPSPACK (Hybrid Optimization Parallel Search PACKage) solves derivative-free optimization problems using an open source, C++ software framework. The framework enables parallel operation using MPI or multithreading, and allows multiple solvers to run simultaneously and interact to find solution points. HOPSPACK comes with an asynchronous pattern search solver that handles general optimization problems with linear and nonlinear constraints, and continuous and integer-valued variables. This user manual explains how to install and use HOPSPACK to solve problems, and how to create custom solvers within the framework.
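
As a minimal sketch of the solver family HOPSPACK implements (a serial, synchronous, unconstrained coordinate pattern search written here for illustration; the package itself is asynchronous, parallel, handles linear and nonlinear constraints and integer variables, and exposes a different interface):

    #include <functional>
    #include <iostream>
    #include <vector>

    // Basic generating-set / pattern search: poll +/- each coordinate
    // direction, accept any improving point, otherwise contract the step.
    // Derivative-free: only objective values are used.
    std::vector<double> patternSearch(
        const std::function<double(const std::vector<double>&)>& f,
        std::vector<double> x, double step, double tol)
    {
      double fx = f(x);
      while (step > tol) {
        bool improved = false;
        for (std::size_t i = 0; i < x.size(); ++i) {
          for (double s : {+step, -step}) {
            std::vector<double> y = x;
            y[i] += s;
            const double fy = f(y);
            if (fy < fx) { x = y; fx = fy; improved = true; }
          }
        }
        if (!improved) step *= 0.5;   // no poll point improved: contract the mesh
      }
      return x;
    }

    int main()
    {
      // Hypothetical objective: a shifted quadratic with minimum at (1, -2).
      auto f = [](const std::vector<double>& v) {
        return (v[0] - 1.0) * (v[0] - 1.0) + (v[1] + 2.0) * (v[1] + 2.0);
      };
      const std::vector<double> x = patternSearch(f, {0.0, 0.0}, 1.0, 1e-6);
      std::cout << x[0] << " " << x[1] << "\n";
      return 0;
    }

The derivative-free character is what matters: only objective evaluations are needed, so expensive or black-box evaluations can be distributed, which is what the framework's MPI and multithreaded execution modes provide.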

More Details

Final Report on LDRD project 130784 : functional brain imaging by tunable multi-spectral Event-Related Optical Signal (EROS)

Hsu, Alan Y.; Speed, Ann E.

Functional brain imaging is of great interest for understanding correlations between specific cognitive processes and underlying neural activity. This understanding can provide the foundation for developing enhanced human-machine interfaces, decision aides, and enhanced cognition at the physiological level. The functional near infrared spectroscopy (fNIRS) based event-related optical signal (EROS) technique can provide direct, high-fidelity measures of temporal and spatial characteristics of neural networks underlying cognitive behavior. However, current EROS systems are hampered by poor signal-to-noise-ratio (SNR) and depth of measure, limiting areas of the brain and associated cognitive processes that can be investigated. We propose to investigate a flexible, tunable, multi-spectral fNIRS EROS system which will provide up to 10x greater SNR as well as improved spatial and temporal resolution through significant improvements in electronics, optoelectronics and optics, as well as contribute to the physiological foundation of higher-order cognitive processes and provide the technical foundation for miniaturized portable neuroimaging systems.

More Details

LDRD final report : massive multithreading applied to national infrastructure and informatics

Barrett, Brian; Hendrickson, Bruce A.; Laviolette, Randall A.; Leung, Vitus J.; Mackey, Greg E.; Murphy, Richard C.; Phillips, Cynthia A.; Pinar, Ali P.

Large relational datasets such as national-scale social networks and power grids present different computational challenges than do physical simulations. Sandia's distributed-memory supercomputers are well suited for solving problems concerning the latter, but not the former. The reason is that problems such as pattern recognition and knowledge discovery on large networks are dominated by memory latency and not by computation. Furthermore, most memory requests in these applications are very small, and when the datasets are large, most requests miss the cache. The result is extremely low utilization. We are unlikely to be able to grow out of this problem with conventional architectures. As the power density of microprocessors has approached that of a nuclear reactor in the past two years, we have seen a leveling of Moore's Law. Building larger and larger microprocessor-based supercomputers is not a solution for informatics and network infrastructure problems since the additional processors are utilized to only a tiny fraction of their capacity. An alternative solution is to use the paradigm of massive multithreading with a large shared memory. There is only one instance of this paradigm today: the Cray MTA-2. The proposal team has unique experience with and access to this machine. The XMT, which is now being delivered, is a Red Storm machine with up to 8192 multithreaded 'Threadstorm' processors and 128 TB of shared memory. For many years, the XMT will be the only way to address very large graph problems efficiently, and future generations of supercomputers will include multithreaded processors. Roughly 10 MTA processors can solve a simple shortest-paths problem in the time taken by the Gordon Bell Prize-nominated distributed-memory code on 32,000 processors of Blue Gene/L. We have developed algorithms and open-source software for the XMT, and have modified that software to run some of these algorithms on other multithreaded platforms such as the Sun Niagara and Opteron multi-core chips.

More Details

Palacios and Kitten : high performance operating systems for scalable virtualized and native supercomputing

Pedretti, Kevin T.T.; Levenhagen, Michael; Brightwell, Ronald B.

Palacios and Kitten are new open source tools that enable applications, whether ported or not, to achieve scalable high performance on large machines. They provide a thin layer over the hardware to support both full-featured virtualized environments and native code bases. Kitten is an OS under development at Sandia that implements a lightweight kernel architecture to provide predictable behavior and increased flexibility on large machines, while also providing Linux binary compatibility. Palacios is a VMM that is under development at Northwestern University and the University of New Mexico. Palacios, which can be embedded into Kitten and other OSes, supports existing, unmodified applications and operating systems by using virtualization that leverages hardware technologies. We describe the design and implementation of both Kitten and Palacios. Our benchmarks show that they provide near native, scalable performance. Palacios and Kitten provide an incremental path to using supercomputer resources that is not performance-compromised.

More Details

A Life Cycle Cost Analysis Framework for Geologic Storage of Hydrogen

Lord, Anna S.; Kobos, Peter; Borns, David J.

Large scale geostorage options for fuels including natural gas and petroleum offer substantial buffer capacity to meet or hedge against supply disruptions. This same notion may be applied to large scale hydrogen storage to meet industrial or transportation sector needs. This study develops an assessment tool to calculate the potential 'gate-to-gate' life cycle costs for large scale hydrogen geostorage options in salt caverns, and continues to develop modules for depleted oil/gas reservoirs and aquifers. The U.S. Department of Energy has an interest in assessing the geological, geomechanical, and economic viability of these types of hydrogen storage. Understanding and quantifying the value of large-scale storage within a broader hydrogen supply and demand infrastructure may prove extremely beneficial for infrastructure modeling efforts that seek to identify the most efficient means of meeting a hydrogen demand (e.g., industrial or transportation-centric demand). Drawing from the knowledge gained from underground large-scale storage of natural gas and petroleum in the U.S. and from assessments of storing relatively large volumes of CO2 in geological formations, the hydrogen storage assessment modeling will continue to build on these strengths while maintaining modeling transparency so that other modeling efforts may draw from this project.
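
As a hedged sketch of the kind of roll-up such a tool performs (generic levelized-cost arithmetic with illustrative symbols, not the project's cost modules or data), annualized capital is typically spread with a capital recovery factor and combined with annual operating costs per unit of hydrogen cycled:

    \mathrm{CRF} = \frac{i\,(1+i)^n}{(1+i)^n - 1}, \qquad \text{cost per kg } \mathrm{H_2} \approx \frac{\mathrm{CRF}\cdot C_{\mathrm{capital}} + C_{\mathrm{O\&M}}}{m_{\mathrm{H_2\ cycled\ per\ year}}},

where i is the discount rate and n is the facility life in years.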

More Details

Land-surface studies with a directional neutron detector

Desilets, Darin M.; Marleau, P.; Brennan, J.

Direct measurements of cosmic-ray neutron intensity were recorded with a neutron scatter camera developed at SNL. The instrument used in this work is a prototype originally designed for nuclear non-proliferation work, but in this project it was used to characterize the response of ambient neutrons in the 0.5-10 MeV range to water located on or above the land surface. Ambient neutron intensity near the land surface responds strongly to the presence of water, suggesting the possibility of an indirect method for monitoring soil water content, snow water equivalent depth, or canopy intercepted water. For environmental measurements, the major advantage of measuring neutrons with the scatter camera is the limited (60{sup o}) field of view that can be obtained, which allows observations to be conducted at previously unattainable spatial scales. This work is intended to provide new measurements of directional fluxes that can be used in the design of new instruments for passively and noninvasively observing land-surface water. Through measurements and neutron transport modeling we have demonstrated that such a technique is feasible.

More Details

Electrostatic microvalves utilizing conductive nanoparticles for improved speed, lower power, and higher force actuation

Ten Eyck, Gregory A.; Branson, Eric D.; Cook, Adam; Collord, Andrew D.; Givler, Richard C.

We have designed and built electrostatically actuated microvalves compatible with integration into a PDMS-based microfluidic system. The key innovation for electrostatic actuation was the incorporation of carbon nanotubes into the PDMS valve membrane, allowing electrostatic charging and subsequent discharging of the PDMS layer while still permitting significant distention of the valve seat for low-voltage control of the system. Nanoparticles were applied to semi-cured PDMS using a stamp transfer method, and then cured fully to make the valve seats. DC actuation of these valves in air yielded operational voltages as low as 15 V by using a supporting structure above the valve seat that allowed sufficient restoring forces to be applied without raising the valve actuation potential. Both actuate-to-open and actuate-to-close valves have been demonstrated and integrated into a microfluidic platform, where fluidic control using the electrostatic valves was demonstrated.

More Details

Radiation microscope for SEE testing using GeV ions

Vizkelethy, Gyorgy; Villone, Janelle; Hattar, Khalid M.; Doyle, B.L.; Knapp, J.A.

Radiation Effects Microscopy (REM) is an extremely useful technique in failure analysis of electronic parts used in radiation environments. It also provides much needed support for the development of radiation-hard components used in spacecraft and nuclear weapons. As IC manufacturing technology progresses, more and more overlayers are used; therefore, the sensitive region of the part is getting farther and farther from the surface. The thickness of these overlayers is so large today that the traditional microbeams used for REM are unable to reach the sensitive regions. As a result, higher ion beam energies have to be used (> GeV), which are available only at cyclotrons. Since it is extremely complicated to focus these GeV ion beams, a new method has to be developed to perform REM at cyclotrons. We developed a new technique, Ion Photon Emission Microscopy, where instead of focusing the ion beam we use secondary photons emitted from a fluorescence layer on top of the devices being tested to determine the position of the ion hit. By recording this position information in coincidence with an SEE signal we will be able to identify radiation-sensitive regions of modern electronic parts, which will improve the efficiency of developing radiation-hard circuits.

More Details

Final LDRD report : the physics of 1D and 2D electron gases in III-nitride heterostructure NWs

Wang, George T.; Armstrong, Andrew A.; Li, Qiming L.; Lin, Yong

The proposed work seeks to demonstrate and understand new phenomena in novel, freestanding III-nitride core-shell nanowires, including 1D and 2D electron gas formation and properties, and to investigate the role of surfaces and heterointerfaces on the transport and optical properties of nanowires, using a combined experimental and theoretical approach. Obtaining an understanding of these phenomena will be a critical step that will allow development of novel, ultrafast and ultraefficient nanowire-based electronic and photonic devices.

More Details

ALEGRA-HEDP simulations of the dense plasma focus

Flicker, Dawn

We have carried out 2D simulations of three dense plasma focus (DPF) devices using the ALEGRA-HEDP code and validated the results against experiments. The three devices included two Mather-type machines described by Bernard et al. and the Tallboy device currently in operation at NSTec in North Las Vegas. We present simulation results and compare to detailed plasma measurements for one Bernard device and to current and neutron yields for all three. We also describe a new ALEGRA capability to import data from particle-in-cell calculations of initial gas breakdown, which will allow the first-ever simulations of DPF operation from the beginning of the voltage discharge to the pinch phase for arbitrary operating conditions and without assumptions about the early sheath structure. The next step in understanding DPF pinch physics must be three-dimensional modeling of conditions going into the pinch, and we have just launched our first 3D simulation of the best-diagnosed Bernard device.

More Details

Ultrathin Optics for Low-Profile Innocuous Imager

Boye, Robert; Nelson, Cynthia L.; Brady, Gregory R.; Briggs, Ronald D.; Jared, Bradley H.; Warren, Mial E.

This project demonstrates the feasibility of a novel imager with a thickness measured in microns rather than inches. Traditional imaging systems, i.e. cameras, cannot provide both the necessary resolution and innocuous form factor required in many data acquisition applications. Designing an imaging system with an extremely thin form factor (less than 1 mm) immediately presents several technical challenges. For instance, the thickness of the optical lens must be reduced drastically from currently available lenses. Additionally, the image circle is reduced by a factor equal to the reduction in focal length, which translates to fewer detector pixels across the image. Reducing the optical total track requires specialized micro-optics, and the required resolution necessitates a new imaging modality. While a single thin imager will not produce the desired output, several thin imagers can be multiplexed and their low resolution (LR) outputs used together in post-processing to produce a high resolution (HR) image. The utility of an Iterative Back Projection (IBP) algorithm has been successfully demonstrated for performing the required post-processing. Advanced fabrication of a thin lens was also demonstrated, and experimental results using this lens as well as commercially available lenses are presented.
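
For reference (the standard iterative back projection update written in generic notation; the report's specific imaging operators and weights are not reproduced here), the HR estimate is refined by simulating each LR capture from the current estimate and back-projecting the residuals:

    \hat{X}^{(k+1)} = \hat{X}^{(k)} + \sum_i H_i^{\mathrm{BP}}\!\left(Y_i - H_i \hat{X}^{(k)}\right),

where Y_i are the multiplexed LR images, H_i models the i-th imager's blur, shift, and downsampling, and H_i^{BP} is a back-projection (upsample-and-spread) operator; iteration stops when the simulated LR images agree with the measured ones.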

More Details

Evaluation of the Geotech SMART24BH 20Vpp/5Vpp data acquisition system with active Fortezza crypto card data signing and authentication

Hart, Darren M.; Rembold, Randy K.

Sandia National Laboratories has tested and evaluated the Geotech SMART24BH borehole data acquisition system with an active Fortezza crypto card performing data signing and authentication. The test results included in this report were in response to static and tonal-dynamic input signals. Most test methodologies used were based on IEEE Standard 1057 for Digitizing Waveform Recorders and IEEE Standard 1241 for Analog-to-Digital Converters; others were designed by Sandia specifically for infrasound application evaluation and for supplementary criteria not addressed in the IEEE standards. The objective of this work was to evaluate the overall technical performance of two Geotech SMART24BH digitizers with a Fortezza PCMCIA crypto card actively signing data packets. The results of this evaluation were compared to the relevant specifications provided in the manufacturer's documentation. The tests performed were chosen to demonstrate different performance aspects of the digitizer under test, including noise floor, least significant bit (LSB), dynamic range, cross-talk, relative channel-to-channel timing, time-tag accuracy/statistics/drift, and analog bandwidth.
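
One tonal-dynamic check of this kind can be sketched as follows. This is not the Sandia evaluation code; it only illustrates a standard IEEE Std 1057-style three-parameter sine fit at a known tone frequency and the SINAD/ENOB figures that can be derived from the fit residual.

```python
# Fit y ~ A*cos(2*pi*f0*t) + B*sin(2*pi*f0*t) + C by linear least squares,
# then treat the residual as noise-plus-distortion.
import numpy as np

def sine_fit_metrics(samples, fs, f0):
    """samples: digitizer record (counts or volts); fs: sample rate (Hz); f0: tone frequency (Hz)."""
    t = np.arange(len(samples)) / fs
    M = np.column_stack([np.cos(2 * np.pi * f0 * t),
                         np.sin(2 * np.pi * f0 * t),
                         np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(M, np.asarray(samples, float), rcond=None)
    residual = np.asarray(samples, float) - M @ coef
    signal_rms = np.hypot(coef[0], coef[1]) / np.sqrt(2)
    noise_rms = residual.std()
    sinad_db = 20 * np.log10(signal_rms / noise_rms)
    enob = (sinad_db - 1.76) / 6.02      # effective number of bits
    return sinad_db, enob
```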

More Details

Tuned cavity magnetometer sensitivity

Okandan, Murat; Schwindt, Peter D.

We have developed a high-sensitivity (<1 pT/√Hz), non-cryogenic magnetometer that utilizes a novel optical (interferometric) detection technique. Further miniaturization and low-power operation are key advantages of this magnetometer compared to SQUID-based systems, which require liquid-helium temperatures and the associated overhead to achieve similar sensitivity levels.

More Details

Room temperature synthesis of Ni-based alloy nanoparticles by radiolysis

Leung, Kevin; Hanson, Donald J.; Stumpf, Roland R.; Huang, Jian Y.; Robinson, David; Lu, Ping; Provencio, P.N.; Jacobs, Benjamin J.

Room-temperature radiolysis, density functional theory, and various nanoscale characterization methods were used to synthesize and fully describe Ni-based alloy nanoparticles (NPs). These complementary methods provide a strong basis for understanding and describing metastable phase regimes of alloy NPs whose formation is determined by kinetic rather than thermodynamic reaction processes. Four series of NPs (Ag-Ni, Pd-Ni, Co-Ni, and W-Ni) were analyzed and characterized by a variety of methods, including UV-vis, TEM/HRTEM, HAADF-STEM, and EFTEM mapping. In the first focus of research, Ag-Ni and Pd-Ni were studied. Different ratios of Ag(x)-Ni(1-x) alloy NPs and Pd(0.5)-Ni(0.5) alloy NPs were prepared using a high dose rate from gamma irradiation. High-angle annular dark-field (HAADF) images show that the Ag-Ni NPs are not core-shell structures but are homogeneous alloys in composition. Energy-filtered transmission electron microscopy (EFTEM) maps show the homogeneity of the metals in each alloy NP. Of particular interest are the normally immiscible Ag-Ni NPs. All evidence confirmed that the homogeneous Ag-Ni and Pd-Ni alloy NPs presented here were successfully synthesized by the high-dose-rate radiolytic methodology. A mechanism is provided to explain the homogeneous formation of the alloy NPs. Furthermore, in situ TEM studies of Pd-Ni NPs (with a heated stage) show that these NPs can be sintered at temperatures below 800 °C. In the second set of work, Co-Ni and W-Ni superalloy NPs were attempted at 50/50 concentration ratios using high dose rates from gamma irradiation. Preliminary results on synthesis and characterization have been completed and are presented. As with the earlier alloy NPs, no evidence of core-shell NP formation occurs. Microscopy results seem to indicate that alloying occurred in the Co-Ni samples. However, reduction of the Na2WO4 to form the W2+ ion in solution appears to have been incomplete; the predominance of WO+ appears to have resulted in a W-O-Ni complex that has not yet been fully characterized.

More Details

Quantum Cascade Lasers (QCLs) for standoff explosives detection : LDRD 138733 final report

Theisen, Lisa A.; Linker, Kevin L.

Continued acts of terrorism using explosive materials throughout the world have led to great interest in explosives detection technology, especially technologies with potential for remote or standoff detection. This LDRD was undertaken to investigate the potential benefits of using quantum cascade lasers (QCLs) in standoff explosives detection equipment. Standoff detection of explosives is currently one of the most difficult problems facing the explosives detection community. Increased domestic and troop security could be achieved through the remote detection of explosives: an effective remote or standoff detection capability would save lives and prevent losses of mission-critical resources by increasing the distance between the explosives and the intended targets and/or security forces. Many sectors of the US government are urgently attempting to obtain useful equipment to deploy to our troops currently serving in hostile environments. This report documents the opportunities where Sandia National Laboratories (SNL) can contribute to the field of QCL development: (1) determine optimal wavelengths for standoff explosives detection utilizing QCLs; (2) optimize the photon collection and detection efficiency of a detection system for optical spectroscopy; (3) develop QCLs with broader wavelength tunability (current technology offers roughly a 10% change in wavelength) while maintaining high efficiency; (4) perform system engineering in the design of a complete detection system, not just the laser head; and (5) perform real-world testing with explosive materials on commercial prototype detection systems.

More Details

Radiation effects from first principles : the role of excitons in electronic-excited processes

Wong, Bryan M.

Electron-hole pairs, or excitons, are created within materials upon optical excitation or irradiation with X-rays or charged particles. The ability to control and predict the role of excitons in these energetically induced processes would have a tremendous impact on understanding the effects of radiation on materials. In this report, the excitonic effects in large cycloparaphenylene carbon structures are investigated using various first-principles methods. These structures are particularly interesting since they allow a study of the size-scaling properties of excitons in a prototypical semiconducting material. To understand these properties, electron-hole transition density matrices and exciton binding energies were analyzed as a function of size. The transition density matrices allow a global view of electronic coherence during an electronic excitation, and the exciton binding energies give a quantitative measure of electron-hole interaction energies in these structures. Based on overall trends in the exciton binding energies and their spatial delocalization, we find that excitonic effects play a vital role in understanding the unique photoinduced dynamics in these systems.
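
For readers unfamiliar with the quantity, the working definition of the exciton binding energy commonly used in first-principles studies (stated here as background; the abstract does not give the report's exact convention) is the difference between the fundamental (quasiparticle) gap and the optical gap:

```latex
% Common convention, given as background rather than quoted from the report:
E_b = E_{\mathrm{gap}}^{\mathrm{fund}} - E_{\mathrm{gap}}^{\mathrm{opt}},
\qquad E_{\mathrm{gap}}^{\mathrm{fund}} = I - A ,
```

where I and A are the ionization potential and electron affinity, and the optical gap is the lowest dipole-allowed excitation energy.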

More Details

Polymer/inorganic superhydrophobic surfaces

Branson, Eric D.; Collord, Andrew D.; Apblett, Christopher A.; Brinker, C.J.

We have designed and built electrostatically actuated microvalves compatible with integration into a PDMS-based microfluidic system. The key innovation for electrostatic actuation was the incorporation of carbon nanotubes into the PDMS valve membrane, allowing the PDMS layer to be electrostatically charged and subsequently discharged while still permitting significant distention of the valve seat for low-voltage control of the system. Nanoparticles were applied to semi-cured PDMS using a stamp-transfer method and then cured fully to make the valve seats. DC actuation of these valves in air yielded operating voltages as low as 15 V, achieved by using a supporting structure above the valve seat that supplies sufficient restoring force without raising the valve actuation voltage. Both actuate-to-open and actuate-to-close valves have been demonstrated and integrated into a microfluidic platform, where fluidic control using the electrostatic valves was demonstrated.
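
The abstract does not state the actuation model, but as a rough point of reference the textbook parallel-plate pull-in estimate (an assumption for illustration, not a result from the report) relates the required voltage to membrane stiffness, gap, and electrode area:

```latex
% Textbook parallel-plate pull-in voltage (background assumption, not from the report):
V_{\mathrm{pi}} = \sqrt{\frac{8\,k\,g_0^{3}}{27\,\varepsilon_0 A}}
```

where k is the effective membrane stiffness, g_0 the initial gap, A the electrode area, and \varepsilon_0 the vacuum permittivity; lowering the stiffness or the gap lowers the required voltage, consistent with the low operating voltages reported above.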

More Details

Benchmarks for GADRAS performance validation

Mattingly, John K.; Mitchell, Dean J.; Rhykerd Jr., Charles L.

The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.
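
A minimal sketch of one way such model-to-measurement comparisons can be quantified (illustrative only; this is not GADRAS or the benchmark analysis code) is a channel-wise reduced chi-square under the assumption of Poisson counting statistics:

```python
# Compare a measured spectrum to a forward-calculated spectrum, channel by channel.
import numpy as np

def reduced_chi_square(measured_counts, predicted_counts, n_fit_params=0):
    measured = np.asarray(measured_counts, dtype=float)
    predicted = np.asarray(predicted_counts, dtype=float)
    variance = np.maximum(measured, 1.0)          # Poisson variance, floored at one count
    chi2 = np.sum((measured - predicted) ** 2 / variance)
    dof = measured.size - n_fit_params            # degrees of freedom
    return chi2 / dof                             # values near 1 indicate good agreement
```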

More Details

Diagnostic development for determining the joint temperature/soot statistics in hydrocarbon-fueled pool fires : LDRD final report

Frederickson, Kraig; Grasser, Thomas; Castañeda, Jaime N.; Hewson, John C.; Luketa, Anay

A joint temperature/soot laser-based optical diagnostic was developed for the determination of the joint temperature/soot probability density function (PDF) for hydrocarbon-fueled, meter-scale turbulent pool fires. This Laboratory Directed Research and Development (LDRD) effort was in support of the Advanced Simulation and Computing (ASC) program, which seeks to produce computational models for the simulation of fire environments for risk assessment and analysis. The development of this laser-based optical diagnostic is motivated by the need for highly resolved spatio-temporal information, for which traditional diagnostic probes, such as thermocouples, are ill-suited. The in-flame gas temperature is determined from the shape of the nitrogen Coherent Anti-Stokes Raman Scattering (CARS) signature, and the soot volume fraction is extracted from the intensity of the Laser-Induced Incandescence (LII) image of the CARS-probed region. The current state of the diagnostic is discussed, including the uncertainty and physical limits of the measurements as well as future applications of this probe.
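
On the soot side, a commonly used simplification (assumed here for context; the abstract does not describe the calibration procedure) is that the prompt LII signal, after calibration, is proportional to the local soot volume fraction in the probed region:

```latex
% Simplified working relation, assumed for illustration only:
f_v(\mathbf{x}) \approx C_{\mathrm{cal}}\, S_{\mathrm{LII}}(\mathbf{x})
```

with the calibration constant C_cal typically fixed against an independent measurement such as line-of-sight laser extinction.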

More Details

Microstructure-based approach for predicting crack initiation and early growth in metals

Battaile, Corbett C.; Bartel, Timothy J.; Reedy, Earl D.; Cox, James; Foulk, James W.; Puskar, J.D.; Boyce, Brad L.; Emery, John M.

Fatigue cracking in metals has long been an area of great importance to the science and technology of structural materials. The earliest stages of fatigue crack nucleation and growth are dominated by the microstructure, and yet few models are able to predict fatigue behavior during these stages because they lack microstructural physics. This program has developed several new simulation tools to increase the microstructural physics available for fatigue prediction. In addition, this program has extended and developed microscale experimental methods to allow the validation of new microstructural models for deformation in metals. We have applied these developments to fatigue experiments in metals in which the microstructure was intentionally varied.

More Details

High fidelity nuclear energy system optimization towards an environmentally benign, sustainable, and secure energy source

Rochau, Gary E.; Rodriguez, Salvador B.

The impact associated with energy generation and utilization is immeasurable due to the immense, widespread, and myriad effects it has on the world and its inhabitants. The polar extremes are demonstrated, on the one hand, by the high quality of life enjoyed by individuals with access to abundant, reliable energy sources and, on the other hand, by the global-scale environmental degradation attributed to the effects of energy production and use. Thus, nations strive to increase their energy generation but are faced with the challenge of doing so with minimal impact on the environment and in a manner that is self-reliant. Consequently, a revival of interest in nuclear energy has followed, with much focus placed on technologies for transmuting nuclear spent fuel. This research investigates nuclear energy systems that optimize the destruction of nuclear waste. In the context of this effort, a nuclear energy system is defined as a configuration of nuclear reactors and corresponding fuel cycle components. The proposed system has unique characteristics that set it apart from other systems, most notably the dedicated High-Energy External Source Transmuter (HEST), which is envisioned as an advanced incinerator used in combination with thermal reactors. The system is configured for examining environmentally benign fuel cycle options by focusing on minimization or elimination of high-level waste inventories. Detailed high-fidelity, exact-geometry models were developed for representative reactor configurations. They were used in preliminary calculations with the Monte Carlo N-Particle eXtended (MCNPX) and Standardized Computer Analysis for Licensing Evaluation (SCALE) code systems. The reactor models have been benchmarked against existing experimental data and design data. Simulink®, an extension of MATLAB®, is envisioned as the interface environment for constructing the nuclear energy system model by linking the individual reactor and fuel-component sub-models for overall analysis of the system. It also provides control over key user input parameters and the ability to effectively consolidate vital output results for uncertainty/sensitivity analysis and optimization procedures. The preliminary analysis has shown promising advanced fuel cycle scenarios that include Pressurized Water Reactors (PWRs), Very High Temperature Reactors (VHTRs), and dedicated HEST waste incineration facilities. If deployed, these scenarios may substantially reduce nuclear waste inventories, approaching environmentally benign nuclear energy system characteristics. Additionally, a spent fuel database of the isotopic compositions for multiple design and control parameters has been created for the VHTR-HEST input fuel streams. Computational approaches, analysis metrics, and benchmark strategies have been established for future detailed studies.

More Details

A smoothed two-and three-dimensional interface reconstruction method

Computing and Visualization in Science

Mosso, Stewart; Garasi, Christopher J.; Drake, Richard R.

The Patterned Interface Reconstruction algorithm reduces the discontinuity between material interfaces in neighboring computational elements. This smoothing improves the accuracy of the reconstruction for smooth bodies. The method can be used in two- and three-dimensional Cartesian and unstructured meshes. Planar interfaces will be returned for planar volume fraction distributions. The algorithm is second-order accurate for smooth volume fraction distributions. © 2008 Springer-Verlag.
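
For orientation, the usual starting point of volume-of-fluid interface reconstruction can be sketched as below. This is generic Youngs-type background, not the Patterned Interface Reconstruction algorithm itself: the interface normal in each mixed cell is estimated from the gradient of the volume-fraction field, and a plane with that normal is then positioned to match the cell's volume fraction.

```python
# Estimate interface normals on a uniform 2D Cartesian mesh from volume fractions.
import numpy as np

def interface_normals(vof):
    """vof: 2D array of volume fractions in [0, 1]."""
    gy, gx = np.gradient(vof)               # central differences along rows (y) and columns (x)
    n = np.stack([-gx, -gy], axis=-1)       # normal points from material toward void
    mag = np.linalg.norm(n, axis=-1, keepdims=True)
    return np.divide(n, mag, out=np.zeros_like(n), where=mag > 0)
```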

More Details

Experience with approximations in the trust-region parallel direct search algorithm

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Shontz, S.M.; Howle, V.E.; Hough, Patricia D.

Recent years have seen growth in the number of algorithms designed to solve challenging simulation-based nonlinear optimization problems. One such algorithm is the Trust-Region Parallel Direct Search (TRPDS) method developed by Hough and Meza. In this paper, we take advantage of the theoretical properties of TRPDS to make use of approximation models in order to reduce the computational cost of simulation-based optimization. We describe the extension, which we call mTRPDS, and present the results of a case study for two earth penetrator design problems. In the case study, we conduct computational experiments with an array of approximations within the mTRPDS algorithm and compare the numerical results to the original TRPDS algorithm and a trust-region method implemented using the speculative gradient approach described by Byrd, Schnabel, and Shultz. The results suggest new ways to improve the algorithm. © 2009 Springer Berlin Heidelberg.
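
The general idea of letting a cheap approximation screen candidate points inside a trust-region loop can be illustrated schematically. This sketch is not the TRPDS or mTRPDS algorithm; the pattern of candidate points, the acceptance test, and the radius update rules are simplified assumptions for illustration only.

```python
# Schematic trust-region loop: a cheap model ranks pattern-search candidates so
# that only the most promising point is sent to the expensive simulation.
import numpy as np

def trust_region_with_surrogate(f_expensive, f_cheap, x0, radius=1.0, iters=30,
                                eta=0.1, shrink=0.5, grow=2.0):
    x = np.asarray(x0, dtype=float)
    fx = f_expensive(x)
    for _ in range(iters):
        directions = np.vstack([np.eye(x.size), -np.eye(x.size)])
        candidates = [x + radius * d for d in directions]
        x_new = min(candidates, key=f_cheap)          # cheap model picks the trial point
        f_new = f_expensive(x_new)                    # one expensive evaluation per iteration
        predicted = f_cheap(x) - f_cheap(x_new)       # decrease predicted by the model
        actual = fx - f_new                           # decrease actually achieved
        if predicted > 0 and actual >= eta * predicted:
            x, fx = x_new, f_new                      # accept the step and expand the region
            radius *= grow
        else:
            radius *= shrink                          # reject the step and contract the region
    return x, fx
```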

More Details

Hardness assurance test guideline for qualifying devices for use in proton environments

IEEE Transactions on Nuclear Science

Schwank, James R.; Shaneyfelt, Marty R.; Dodd, Paul E.; Felix, James A.; Baggio, J.; Ferlet-Cavrois, V.; Paillet, P.; Label, K.A.; Pease, R.L.; Simons, M.; Cohn, L.M.

Proton-induced single-event effects hardness assurance guidelines are developed to address issues raised by recent test results in advanced IC technologies for use in space environments. Specifically, guidelines are developed that address the effects of proton energy and angle of incidence on single-event latchup and the effects of total dose on single-event upset. The guidelines address single-event upset (SEU), single-event latchup (SEL), and combined SEU and total ionizing dose (TID) effects. © 2006 IEEE.

More Details