Overview of the Prioritization Analysis Tool for All-Hazards / Analyzer for Wide-Area Restoration Effectiveness (PATH/AWARE)
We have developed a novel experimental technique for the direct production of cold molecules using a combination of techniques from atomic, molecular, and optical physics and physical chemistry. The ability to produce samples of cold molecules has applications in a broad spectrum of technical fields, including high-resolution spectroscopy, remote sensing, quantum computing, materials simulation, and the understanding of fundamental chemical dynamics. Researchers around the world are currently exploring many techniques for producing samples of cold molecules, but to date these attempts have offered only limited success, achieving milli-Kelvin temperatures at low densities. This Laboratory Directed Research and Development project develops a new experimental technique for producing micro-Kelvin temperature molecules via collisions with laser-cooled samples of trapped atoms. The technique relies on near-mass-degenerate collisions between the molecule of interest and a laser-cooled (micro-Kelvin) atom. A subset of collisions will transfer all (or nearly all) of the kinetic energy from the 'hot' molecule, cooling the molecule at the expense of heating the atom. Further collisions with the remaining laser-cooled atoms will thermally equilibrate the molecules to the micro-Kelvin temperature of the laser-cooled atoms.
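The importance of near mass degeneracy can be seen from elementary two-body kinematics; the sketch below is an illustration of that point only (the species pairings are hypothetical, not the project's choices), computing the head-on elastic energy-transfer fraction 4*m1*m2/(m1+m2)^2.

```python
# Illustrative sketch: fraction of kinetic energy transferred in a single
# head-on elastic collision, 4*m1*m2 / (m1 + m2)**2. Near mass degeneracy
# (m1 ~ m2) gives near-complete transfer in one collision.
# The mass pairings below are hypothetical examples only.

def energy_transfer_fraction(m_molecule, m_atom):
    """Head-on elastic kinetic-energy transfer fraction between two masses (amu)."""
    return 4.0 * m_molecule * m_atom / (m_molecule + m_atom) ** 2

pairs = {
    "mass-matched (m1 = m2)": (40.0, 40.0),
    "2:1 mass ratio": (80.0, 40.0),
    "10:1 mass ratio": (400.0, 40.0),
}

for label, (m_mol, m_atom) in pairs.items():
    f = energy_transfer_fraction(m_mol, m_atom)
    print(f"{label:25s} -> {f:.2%} of kinetic energy removed per head-on collision")
```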
This report describes our efforts to quantify the behavior of micro-fabricated THz rectangular waveguides on a configurable, robust semiconductor-based platform. These waveguides are an enabling technology for coupling THz radiation directly to or from lasers, mixers, detectors, antennas, and other devices. Traditional waveguides fabricated on semiconductor platforms, such as dielectric guides in the infrared or co-planar waveguides in the microwave region, suffer high absorption and radiative losses in the THz regime. The former leads to very short propagation lengths, while the latter leads to unwanted radiation modes and/or crosstalk in integrated devices. This project exploited the initial development of THz micro-machined rectangular waveguides under the THz Grand Challenge Program, but instead of focusing on THz transceiver integration, it focused on exploring the propagation loss and far-field radiation patterns of the waveguides. During the 9-month duration of this project we were able to reproduce the waveguide loss per unit length and began to explore how the loss depends on wavelength. We also explored the far-field beam patterns emitted by H-plane horn antennas attached to the waveguides. In the process we learned that the method of measuring the beam patterns has a significant impact on what is actually measured, which may affect most of the THz beam patterns reported to date. The beam-pattern measurements improved significantly throughout the project, but further refinements of the measurement are required before a definitive determination of the beam pattern can be made.
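Loss per unit length is commonly extracted by comparing guides of different lengths so that coupling losses cancel; the sketch below is a generic cut-back estimate with invented numbers, not the project's actual analysis.

```python
# Generic cut-back estimate of waveguide loss per unit length (not the
# project's measurement code): comparing transmitted power through two
# waveguide sections of different length cancels the (unknown) coupling loss.
import math

def loss_per_unit_length_db(p_short_w, p_long_w, length_short_mm, length_long_mm):
    """Return attenuation in dB/mm from transmitted powers (watts) of two lengths."""
    delta_db = 10.0 * math.log10(p_short_w / p_long_w)   # extra loss of the longer guide
    return delta_db / (length_long_mm - length_short_mm)

# Hypothetical numbers purely for illustration.
alpha = loss_per_unit_length_db(p_short_w=1.0e-6, p_long_w=0.5e-6,
                                length_short_mm=5.0, length_long_mm=15.0)
print(f"estimated loss: {alpha:.2f} dB/mm")
```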
Cambio is an application intended to automatically read and display any spectrum file, in any format, that the nuclear emergency response community might encounter. Cambio also provides an analysis capability suitable for HPGe spectra when the detector response and scattering environment are not well known. Why is Cambio needed? (1) Cambio solves the following problem: with over 50 types of formats from instruments used in the field and new format variations appearing frequently, it is impractical for every responder to have current versions of the manufacturer's software for every instrument used in the field. (2) Cambio converts field spectra to any one of several common formats that are used for analysis, saving valuable time in an emergency situation. (3) Cambio provides basic tools for comparing spectra, calibrating spectra, and isotope identification, with analysis suited especially for HPGe spectra. (4) Cambio has a batch-processing capability to automatically translate a large number of archival spectral files of any format to one of several common formats, such as the IAEA SPE or the DHS N42. Currently over 540 analysts and members of the nuclear emergency response community worldwide are on the distribution list for updates to Cambio. Cambio users come from all levels of government, universities, and commercial partners around the world that support efforts to counter terrorist nuclear activities. Cambio is Unclassified Unlimited Release (UUR) and is distributed by internet download with email notifications whenever a new build of Cambio provides new formats, bug fixes, or new or improved capabilities. Cambio is also provided as a DLL to the Karlsruhe Institute for Transuranium Elements so that Cambio's automatic file-reading capability can be included at the Nucleonica web site.
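The batch-processing workflow amounts to a walk over an archive with a format translation per file; the sketch below only illustrates that shape, and convert_spectrum() is a hypothetical placeholder, not Cambio's API.

```python
# Hypothetical batch-conversion sketch; convert_spectrum() is a placeholder
# standing in for a real converter (e.g., a tool like Cambio), not its API.
from pathlib import Path

def convert_spectrum(in_path: Path, out_path: Path, out_format: str = "N42") -> None:
    # Placeholder: a real implementation would parse the vendor format and
    # re-serialize it; here we only record what would be done.
    print(f"would convert {in_path.name} -> {out_path.name} ({out_format})")

archive = Path("field_spectra")        # hypothetical input directory
output = Path("converted_n42")
output.mkdir(exist_ok=True)

for spec_file in sorted(archive.glob("*")):
    if spec_file.is_file():
        convert_spectrum(spec_file, output / (spec_file.stem + ".n42"))
```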
This dissertation describes research on the dynamic crack initiation toughness of a 4340 steel. Researchers have measured dynamic crack initiation toughness, K{sub Ic}, for many years using a variety of experimental techniques, with vastly different trends in the results when K{sub Ic} is reported as a function of loading rate. The dissertation describes a novel experimental technique for measuring K{sub Ic} in metals using the Kolsky bar. The method borrows from improvements made in recent years in traditional Kolsky bar testing by using pulse-shaping techniques to ensure a constant loading rate applied to the sample before crack initiation. Dynamic crack initiation measurements were reported on a 4340 steel at two different loading rates. The steel was shown to exhibit a rate dependence, with the recorded values of K{sub Ic} being much higher at the higher loading rate. Using this rate dependence as motivation for modeling the fracture events, a viscoplastic constitutive model was implemented into a peridynamic computational mechanics code. Peridynamics is a newly developed theory in solid mechanics that replaces the classical partial differential equations of motion with integro-differential equations that do not require the existence of spatial derivatives in the displacement field. This allows for the straightforward modeling of unguided crack initiation and growth. To date, peridynamic implementations have used severely restricted constitutive models. This research represents the first implementation of a complex material model and its validation. After comparing simulated deformations with experimental Taylor anvil impact results for the viscoplastic material model, a novel failure criterion is introduced to model the dynamic crack initiation toughness experiments. The failure model is based on an energy criterion and uses the K{sub Ic} values recorded experimentally as an input. The failure model is then validated against one class of problems, showing good agreement with experimental results.
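Peridynamics replaces the divergence of stress with an integral of pairwise bond forces over a finite horizon; the minimal 1D bond-based sketch below (a generic linear microelastic material with an assumed micromodulus c, not the dissertation's viscoplastic model or energy-based failure criterion) shows how internal forces are assembled without spatial derivatives of the displacement field.

```python
# Minimal 1D bond-based peridynamic internal-force sketch. Each node interacts
# with all neighbors within a horizon; no displacement gradients are needed.
import numpy as np

def internal_force(x, u, horizon, c, dV):
    """Pairwise bond forces integrated over each node's horizon."""
    n = len(x)
    force = np.zeros(n)
    for i in range(n):
        for j in range(n):
            xi = x[j] - x[i]                      # reference bond vector
            if i == j or abs(xi) > horizon:
                continue
            eta = u[j] - u[i]                     # relative displacement
            stretch = (abs(xi + eta) - abs(xi)) / abs(xi)
            force[i] += c * stretch * np.sign(xi + eta) * dV
    return force

# Toy setup: 101 nodes, uniform grid, small uniform stretch.
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
u = 1.0e-3 * x                                    # uniform 0.1% stretch
f = internal_force(x, u, horizon=3.015 * dx, c=1.0, dV=dx)
print("max interior net force (should be ~0 for uniform stretch):",
      np.abs(f[5:-5]).max())
```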
Presto is a three-dimensional transient dynamics code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. It is built on the SIERRA Framework [1, 2]. SIERRA provides a data management framework in a parallel computing environment that allows the addition of capabilities in a modular fashion. Contact capabilities are parallel and scalable. The Presto 4.14 User's Guide provides information about the functionality in Presto and the command structure required to access this functionality in a user input file. This document is divided into chapters based primarily on functionality. For example, the command structure related to the use of various element types is grouped in one chapter; descriptions of material models are grouped in another chapter. The input and usage of Presto are similar to those of the code Adagio [3]. Adagio is a three-dimensional quasi-static code with a versatile element library, nonlinear material models, large deformation capabilities, and contact. Adagio, like Presto, is built on the SIERRA Framework [1]. Contact capabilities for Adagio are also parallel and scalable. A significant feature of Adagio is that it offers a multilevel, nonlinear iterative solver. Because of the similarities in input and usage between Presto and Adagio, the user's guides for the two codes are structured in the same manner and share common material. (Once you have mastered the input structure for one code, it will be easy to master the syntax structure for the other code.) To maintain the commonality between the two user's guides, we have used a variety of techniques. For example, references to Adagio may be found in the Presto user's guide and vice versa, and the chapter order across the two guides is the same. On the other hand, each of the two user's guides is expressly tailored to the features of the specific code and documents the particular functionality for that code. For example, though both Presto and Adagio have contact functionality, the content of the chapter on contact in the two guides differs. Important references for both Adagio and Presto are given in the references section at the end of this chapter. Adagio was preceded by the codes JAC and JAS3D; JAC is described in Reference 4, and JAS3D is described in Reference 5. Presto was preceded by the code Pronto3D, which is described in References 6 and 7. Some of the fundamental nonlinear technology used by both Presto and Adagio is described in References 8, 9, and 10. Currently, both Presto and Adagio use the Exodus II database and the XDMF database; Exodus II is more commonly used than XDMF. (Other options may be added in the future.) The Exodus II database format is described in Reference 11, and the XDMF database format is described in Reference 12. Important information about contact is provided in the reference document for ACME [13], a third-party library for contact. One of the key concepts for the command structure in the input file is the concept of scope. A detailed explanation of scope is provided in Section 1.2. Most of the command lines in Chapter 2 are related to a certain scope rather than to some particular functionality.
This document is a user's guide for the code Adagio. Adagio is a three-dimensional, implicit solid mechanics code with a versatile element library, nonlinear material models, and capabilities for modeling large deformation and contact. Adagio is a parallel code, and its nonlinear solver and contact capabilities enable scalable solutions of large problems. It is built on the SIERRA Framework [1, 2]. SIERRA provides a data management framework in a parallel computing environment that allows the addition of capabilities in a modular fashion. The Adagio 4.14 User's Guide provides information about the functionality in Adagio and the command structure required to access this functionality in a user input file. This document is divided into chapters based primarily on functionality. For example, the command structure related to the use of various element types is grouped in one chapter; descriptions of material models are grouped in another chapter. The input and usage of Adagio are similar to those of the code Presto [3]. Presto, like Adagio, is a solid mechanics code built on the SIERRA Framework. The primary difference between the two codes is that Presto uses explicit time integration for transient dynamics analysis, whereas Adagio is an implicit code. Because of the similarities in input and usage between Adagio and Presto, the user's guides for the two codes are structured in the same manner and share common material. (Once you have mastered the input structure for one code, it will be easy to master the syntax structure for the other code.) To maintain the commonality between the two user's guides, we have used a variety of techniques. For example, references to Presto may be found in the Adagio user's guide and vice versa, and the chapter order across the two guides is the same. On the other hand, each of the two user's guides is expressly tailored to the features of the specific code and documents the particular functionality for that code. For example, though both Presto and Adagio have contact functionality, the content of the chapter on contact in the two guides differs. Important references for both Adagio and Presto are given in the references section at the end of this chapter. Adagio was preceded by the codes JAC and JAS3D; JAC is described in Reference 4, and JAS3D is described in Reference 5. Presto was preceded by the code Pronto3D, which is described in References 6 and 7. Some of the fundamental nonlinear technology used by both Presto and Adagio is described in References 8, 9, and 10. Currently, both Presto and Adagio use the Exodus II database and the XDMF database; Exodus II is more commonly used than XDMF. (Other options may be added in the future.) The Exodus II database format is described in Reference 11, and the XDMF database format is described in Reference 12. Important information about contact is provided in the reference document for ACME [13], a third-party library for contact. One of the key concepts for the command structure in the input file is the concept of scope. A detailed explanation of scope is provided in Section 1.2. Most of the command lines in Chapter 2 are related to a certain scope rather than to some particular functionality.
The Systems Engineering Management Plan (SEMP) is a comprehensive and effective tool used to assist in the management of systems engineering efforts. It is intended to guide the work of all those involved in the project. The SEMP comprises three main sections: technical project planning and control, systems engineering process, and engineering specialty integration. The contents of each section must be tailored to the specific effort. A model outline and an example SEMP are provided. The target audience is those who are familiar with the systems engineering approach and who have an interest in employing the SEMP as a tool for systems management. The goal of this document is to provide the reader with an appreciation for the use and importance of the SEMP, as well as a framework that can be used to create the management plan.
2008 Proceedings of the ASME Summer Heat Transfer Conference, HT 2008
The PUFF code was originally written and designed to calculate the rise of a non-continuous plume (puff) from a large detonation or deflagration in the atmosphere. It is based on a buoyant spherical control-volume approximation. The theory for the model is updated and presented. The model has been observed to produce what are believed to be unrealistic plume-elevation oscillations as the plume approaches its terminal elevation. Recognizing a similarity between the equations for a classical damped spring oscillator and the present model, the plume rise model can be analyzed by evaluating equivalent spring constants and damping functions. Such an analysis suggests a buoyant plume in the atmosphere is significantly under-damped, explaining the occurrence of the oscillations in the model. Based on lessons learned from the analogy evaluations and guided by comparisons with early plume rise data, a set of assumptions is proposed to address the excessive oscillations in the predicted plume near the terminal elevation and to improve the robustness of the predictions. This is done while retaining the basic context of the present model formulation. The propriety of the present formulation is evaluated. The revised model fits the vast majority of the existing data to within +/- 25%, which is considered reasonable given the present model form. Further validation efforts would be advisable, but are impeded by a lack of quality existing datasets. Copyright © 2008 by ASME.
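The damped-oscillator analogy can be pictured with a generic second-order system; the sketch below (an illustration with invented parameters, not the PUFF model) shows how an under-damped system overshoots and oscillates about its terminal value, the behavior analogous to the spurious plume-elevation oscillations.

```python
# Illustration of the damped-oscillator analogy (not the PUFF model itself):
# an under-damped second-order system overshoots and oscillates about its
# terminal value, mimicking the spurious plume-elevation oscillations.
import numpy as np

def simulate(zeta, omega=0.05, z_eq=1000.0, dt=1.0, t_end=600.0):
    z, v = 0.0, 0.0
    history = []
    for _ in np.arange(0.0, t_end, dt):
        a = omega**2 * (z_eq - z) - 2.0 * zeta * omega * v   # restoring + damping
        v += a * dt
        z += v * dt
        history.append(z)
    return np.array(history)

for zeta in (0.2, 1.0):          # under-damped vs. critically damped
    z = simulate(zeta)
    overshoot = 100.0 * (z.max() - 1000.0) / 1000.0
    print(f"zeta = {zeta:3.1f}: peak overshoot above terminal elevation = {overshoot:5.1f}%")
```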
Proceedings of SPIE - The International Society for Optical Engineering
Iris recognition uses the distinct patterns found in the human iris to perform identification. Image acquisition is a critical first step toward successful operation of iris recognition systems. However, the image quality required by standard iris recognition algorithms places hard constraints on the imaging optics; as a result, systems demonstrated to date have required relatively short subject stand-off distances. In this paper, we study long-range iris recognition at distances as large as 200 meters and determine the conditions the imaging system must satisfy for identification at longer stand-off distances. © 2009 SPIE.
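One way to see why long stand-off distances are demanding is the diffraction limit on aperture size; the sketch below is a back-of-the-envelope Rayleigh-criterion estimate with hypothetical feature size and wavelength, not the paper's analysis.

```python
# Back-of-the-envelope diffraction limit: the aperture needed to resolve a
# feature of size x at range L is roughly D ~ 1.22 * wavelength * L / x.
def required_aperture_m(stand_off_m, feature_m, wavelength_m=850e-9):
    return 1.22 * wavelength_m * stand_off_m / feature_m

# Hypothetical numbers: 0.2 mm iris texture detail, 850 nm NIR illumination.
for stand_off in (3.0, 30.0, 200.0):
    d = required_aperture_m(stand_off, feature_m=0.2e-3)
    print(f"{stand_off:6.0f} m stand-off -> aperture of roughly {d*100:.1f} cm")
```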
Proceedings of SPIE - The International Society for Optical Engineering
Effective defense against chemical and biological threats requires an "end-to-end" strategy that encompasses the entire problem space, from threat assessment and target hardening to response planning and recovery. A key element of the strategy is the definition of appropriate system requirements for surveillance and detection of threat agents. Our end-to-end approach to venue chem/bio defense is captured in the Facilities Weapons of Mass Destruction Decision Analysis Capability (FacDAC), an integrated system-of-systems toolset that can be used to generate requirements across all stages of detector development. For example, in the early stage of detector development the approach can be used to develop performance targets (e.g., sensitivity, selectivity, false positive rate) to provide guidance on which technologies to pursue. In the development phase, after a detector technology has been selected, the approach can aid in determining performance trade-offs and the down-selection of competing technologies. During the application stage, the approach can be employed to design optimal defensive architectures that make the best use of available technology to maximize system performance. This presentation will discuss the end-to-end approach to defining detector requirements and demonstrate the capabilities of the FacDAC toolset using examples from a number of studies for the Department of Homeland Security. © 2009 SPIE.
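A performance target such as a false positive rate only becomes meaningful once it is converted to an operational burden; the sketch below is a generic calculation with hypothetical sampling rates, not FacDAC itself.

```python
# Illustration of how a false-positive-rate target translates into operational
# burden (a generic calculation, not FacDAC).
def false_alarms_per_year(false_positive_rate, samples_per_day):
    return false_positive_rate * samples_per_day * 365.0

samples_per_day = 24 * 60        # hypothetical: one analysis per minute
for fpr in (1e-3, 1e-5, 1e-7):
    fa = false_alarms_per_year(fpr, samples_per_day)
    print(f"FPR = {fpr:.0e} -> about {fa:8.2f} false alarms per year per detector")
```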
The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others who contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment in which analysis is used to guide design decisions. Computer-aided design (CAD) models built with PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model and carries all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort, and the turnaround time for analysis results needs to be decreased for analysis to have an impact on the overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that, in essence, come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.
This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. It is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems to support reliability analysis. The report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written by Sandia National Laboratories for reliability analysis of fielded wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.
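The kind of automated analysis a CMMS should support can be illustrated with basic reliability metrics; the sketch below uses invented downtime records and is not taken from the report.

```python
# Generic reliability/availability metrics computable from CMMS-style records.
# Hypothetical downtime events for one turbine-year.
HOURS_PER_YEAR = 8760.0

downtime_events_h = [12.0, 48.0, 6.0, 30.0]       # repair durations, hours
n_failures = len(downtime_events_h)
downtime_h = sum(downtime_events_h)
uptime_h = HOURS_PER_YEAR - downtime_h

mtbf_h = uptime_h / n_failures                    # mean time between failures
mttr_h = downtime_h / n_failures                  # mean time to repair
availability = uptime_h / HOURS_PER_YEAR

print(f"MTBF = {mtbf_h:.0f} h, MTTR = {mttr_h:.1f} h, availability = {availability:.2%}")
```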
The modeling of solids is most naturally placed within a Lagrangian framework because it requires constitutive models that depend on knowledge of the original material orientations and subsequent deformations. Detailed kinematic information is needed to ensure material frame indifference; this information is captured through the deformation gradient F. Such information can be tracked easily in a Lagrangian code. Unfortunately, not all problems can be easily modeled using Lagrangian concepts, due to severe distortions in the underlying motion, and either a Lagrangian/Eulerian or a pure Eulerian modeling framework must be introduced. We discuss and contrast several Lagrangian/Eulerian approaches for keeping track of the details of material kinematics.
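A small kinematics sketch helps fix ideas about the role of F (a generic textbook example, not drawn from the report): for a homogeneous motion, the polar decomposition F = R U separates the rotation, which must not produce stress under frame indifference, from the material stretch.

```python
# Generic kinematics sketch: deformation gradient and polar decomposition.
import numpy as np

def polar_decomposition(F):
    """Return rotation R and right stretch U with F = R @ U."""
    W, sigma, Vt = np.linalg.svd(F)
    R = W @ Vt
    U = Vt.T @ np.diag(sigma) @ Vt
    return R, U

theta = np.deg2rad(30.0)
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
stretch = np.diag([1.10, 0.95])          # 10% stretch in X1, 5% compression in X2
F = rotation @ stretch                   # simple homogeneous deformation gradient

R, U = polar_decomposition(F)
print("recovered stretch U:\n", np.round(U, 6))
print("F reconstructed from R @ U matches:", np.allclose(F, R @ U))
```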
Shared libraries have become ubiquitous and are used to achieve great resource efficiencies on many platforms. The same properties that enable efficiencies on time-shared computers and convenience on small clusters prove to be great obstacles to scalability on large clusters and High Performance Computing platforms. In addition, lightweight operating systems such as Catamount have historically not supported the use of shared libraries specifically because they hinder scalability. In this report we outline the methods we investigated for supporting shared libraries on High Performance Computing platforms that use lightweight kernels. The considerations necessary to evaluate utility in this area are many and sometimes conflicting. While our initial path forward has been determined based on this evaluation, we consider this effort ongoing and remain prepared to re-evaluate any technology that might provide a scalable solution. This report is an evaluation of a range of possible methods of supporting dynamically linked executables on capability-class High Performance Computing platforms. Efforts are ongoing, and extensive testing at scale is necessary to evaluate performance. While performance is a critical driving factor, supporting whatever method is used in a production environment is an equally important and challenging task.
Development continues on Finite State Abstraction (FSA) methods to enable Impacts Analysis (IA) for cyber attacks against power grid control systems. Building upon previous work, we successfully demonstrated the addition of Bounded Model Checking (BMC) to the FSA method, which constrains grid conditions to reasonable behavior. The new FSA feature was successfully implemented and tested. FSA is an important part of IA for the power grid, complementing steady-state approaches. It enables the simultaneous evaluation of myriad dynamic trajectories for the system, which in turn facilitates IA for whole ranges of system conditions at once. Given the potentially wide range and subtle nature of potential control system attacks, this is a promising research approach. In this report, we explain the addition of BMC to the previous FSA work and present testing and simulation of the implemented code using a two-bus test system. The current FSA approach and code allow the calculation of the acceptability of power grid conditions following a cyber attack (over a given time horizon and for a specific grid topology). Future work will enable analysis spanning various topologies (to account for switching events), as well as an understanding of the cyber attack stimuli that can lead to undesirable grid conditions.
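The flavor of bounded model checking can be conveyed with a toy explicit-state example; the abstraction, transition relation, and acceptability predicate below are invented for illustration and are not the project's FSA or BMC implementation.

```python
# Toy bounded model check: explore all trajectories of a small discrete
# abstraction up to a bound k and report a disturbance sequence (if any)
# that drives the system into an unacceptable state.
from itertools import product

def step(state, disturbance):
    """Hypothetical abstract transition relation for a two-bus toy system."""
    if state == "nominal":
        return "stressed" if disturbance else "nominal"
    if state == "stressed":
        return "voltage_violation" if disturbance else "nominal"
    return "voltage_violation"                 # violation is absorbing

def acceptable(state):
    return state != "voltage_violation"

def bounded_check(initial, k):
    """Return the shortest disturbance sequence (<= k steps) reaching a bad state."""
    for length in range(1, k + 1):
        for seq in product([False, True], repeat=length):
            state = initial
            for d in seq:
                state = step(state, d)
            if not acceptable(state):
                return seq
    return None

witness = bounded_check("nominal", k=4)
print("counterexample disturbance sequence:", witness)
```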
Application performance is determined by a combination of many choices: hardware platform, runtime environment, languages and compilers used, algorithm choice and implementation, and more. In this complicated environment, we find that the use of mini-applications - small self-contained proxies for real applications - is an excellent approach for rapidly exploring the parameter space of all these choices. Furthermore, use of mini-applications enriches the interaction between application, library and computer system developers by providing explicit functioning software and concrete performance results that lead to detailed, focused discussions of design trade-offs, algorithm choices and runtime performance issues. In this paper we discuss a collection of mini-applications and demonstrate how we use them to analyze and improve application performance on new and future computer platforms.
This report summarizes the work completed under the Laboratory Directed Research and Development (LDRD) project 09-1351, 'Computational Investigation of Thermal Gas Separation for CO{sub 2} Capture'. Thermal gas separation for a binary mixture of carbon dioxide and nitrogen is investigated using the Direct Simulation Monte Carlo (DSMC) method of molecular gas dynamics. Molecular models for nitrogen and carbon dioxide are developed, implemented, compared to theoretical results, and compared to several experimental thermophysical properties. The molecular models include three translational modes, two fully excited rotational modes, and vibrational modes, whose degree of excitation depends on the temperature. Nitrogen has one vibrational mode, and carbon dioxide has four vibrational modes (two of which are degenerate). These models are used to perform a parameter study for mixtures of carbon dioxide and nitrogen confined between parallel walls over realistic ranges of gas temperatures and nominal concentrations of carbon dioxide. The degree of thermal separation predicted by DSMC is slightly higher than experimental values and is sensitive to the details of the molecular models.
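The temperature dependence of vibrational excitation mentioned above follows the standard harmonic-oscillator result; the sketch below uses approximate literature values for the characteristic vibrational temperatures of N2 and CO2 purely for illustration, not parameters taken from the report.

```python
# Temperature dependence of vibrational excitation for a simple harmonic
# oscillator mode: mean energy per mode is k*theta_v / (exp(theta_v/T) - 1).
# Characteristic temperatures below are approximate literature values.
import math

MODES_K = {
    "N2 stretch": [3371.0],
    "CO2 (2 bends, sym. stretch, asym. stretch)": [960.0, 960.0, 1920.0, 3380.0],
}

def mean_vibrational_energy_over_k(theta_v, T):
    """Mean vibrational energy of one mode divided by Boltzmann's constant (K)."""
    return theta_v / (math.exp(theta_v / T) - 1.0)

for T in (300.0, 600.0, 1000.0):
    for species, modes in MODES_K.items():
        e_over_k = sum(mean_vibrational_energy_over_k(th, T) for th in modes)
        print(f"T = {T:6.0f} K  {species:45s} E_vib/k = {e_over_k:7.1f} K")
```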
Microelectronics Reliability
This paper reviews the significant successes in MEMS products from a reliability perspective. MEMS reliability is challenging and can be device- and process-dependent, but exercising the proper reliability techniques very early in product development has yielded success for many manufacturers. The reliability concerns of various devices are discussed, including ink jet printheads, inertial sensors, pressure sensors, micro-mirror arrays, and the emerging applications of RF switches and resonators. Metal-contacting RF switches are susceptible to hydrocarbon contamination, which can increase the contact resistance as the cycle count grows. Packaging techniques are described in the context of the whole reliability program. © 2009 Elsevier Ltd.
Interfaces are a critical determinant of the full range of materials properties, especially at the nanoscale. We combined computational and experimental methods to develop a comprehensive understanding of nanograin evolution based on a fundamental understanding of internal interfaces in nanocrystalline nickel. It has recently been shown that nanocrystals with a bi-modal grain-size distribution possess a unique combination of high strength, ductility, and wear resistance. We performed a combined experimental and theoretical investigation of the structure and motion of internal interfaces in nanograined metal and the resulting grain evolution. The properties of grain boundaries were computed for an unprecedented range of boundaries. The presence of roughening transitions in grain boundaries is explored and related to dramatic changes in boundary mobility. Experimental observations show that abnormal grain growth in nanograined materials is unlike that in conventional-scale material in both the level of defects and the formation of unfavored phases. Molecular dynamics simulations address the origins of some of these phenomena.
Engineers and designers are constantly searching for test methods to qualify or 'prove in' new designs. In the high-reliability world of military parts, design tests, qualification tests, in-process tests, and product characteristic tests become even more important. The use of in-process and function tests has been adopted as a way of demonstrating that parts will operate correctly and survive their use environments. This paper discusses various types of tests used to qualify the magnetic components - the current-carrying capability of coils, a next-assembly 'as used' test, a corona test, and an inductance-at-temperature test. Each of these tests addresses a different potential failure of a component. The entire process from design to implementation is described.
This study investigates a pathway to nanoporous structures created by hydrogen implantation in aluminum. Previous experiments for fusion applications have indicated that hydrogen and helium ion implantations are capable of producing bicontinuous nanoporous structures in a variety of metals. This study focuses specifically on hydrogen and helium implantations of aluminum, including complementary experimental results and computational modeling of this system. Experimental results show the evolution of the surface morphology as the hydrogen ion fluence increases from 10{sup 17} cm{sup -2} to 10{sup 18} cm{sup -2}. Implantations of helium at a fluence of 10{sup 18} cm{sup -2} produce porosity on the order of 10 nm. Computational modeling demonstrates the formation of alanes, their desorption, and the resulting etching of aluminum surfaces that likely drives the nanostructures that form in the presence of hydrogen.
This report documents the results of an FY09 ASC V&V Methods level 2 milestone demonstrating new algorithmic capabilities for mixed aleatory-epistemic uncertainty quantification. Through the combination of stochastic expansions for computing aleatory statistics and interval optimization for computing epistemic bounds, mixed uncertainty analysis studies are shown to be more accurate and efficient than previously achievable. Part I of the report describes the algorithms and presents benchmark performance results. Part II applies these new algorithms to UQ analysis of radiation effects in electronic devices and circuits for the QASPR program.
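The structure of a mixed aleatory-epistemic analysis is a nested loop; the sketch below is schematic only, with Monte Carlo sampling standing in for the report's stochastic expansions and a grid search standing in for its interval optimization, applied to an invented toy model.

```python
# Schematic mixed aleatory-epistemic UQ loop (illustrative substitutes for the
# report's stochastic expansions and interval optimization).
import numpy as np

rng = np.random.default_rng(0)

def model(aleatory_x, epistemic_theta):
    """Toy response: aleatory x ~ N(0,1), epistemic theta known only to lie in [0.5, 2.0]."""
    return epistemic_theta * aleatory_x**2 + 0.1 * aleatory_x

def aleatory_mean(theta, n_samples=20000):
    x = rng.standard_normal(n_samples)            # inner loop: aleatory statistics
    return model(x, theta).mean()

thetas = np.linspace(0.5, 2.0, 16)                # outer loop: epistemic interval
means = [aleatory_mean(t) for t in thetas]
print(f"interval on the mean response: [{min(means):.3f}, {max(means):.3f}]")
```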
LDRD Project 105876 was a research project whose primary goal was to discover the currently unknown science underlying the basic linear and nonlinear electrodynamic response of nanotubes and nanowires, in a manner that will support future efforts aimed at converting forefront nanoscience into innovative new high-frequency nanodevices. The project involved experimental and theoretical efforts to discover and understand high-frequency (MHz through tens of GHz) electrodynamic response properties of nanomaterials, emphasizing nanowires of silicon and zinc oxide and carbon nanotubes. While there is much research on the DC electrical properties of nanowires, electrodynamic characteristics still represent a major new frontier in nanotechnology. We generated world-leading insight into how the low dimensionality of these nanomaterials yields sometimes desirable and sometimes problematic high-frequency properties that lie outside standard models of electron dynamics. In the cases of silicon nanowires and carbon nanotubes, evidence of strong disorder or glass-like charge dynamics was measured, indicating that these materials still suffer from serious inhomogeneities that limit their high-frequency performance. Zinc oxide nanowires were found to obey conventional Drude dynamics. In all cases, a significant practical problem had to be overcome: the large impedance mismatch between the high intrinsic impedance of all nanowires and nanotubes and the impedance of high-frequency test equipment.
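The severity of the impedance-mismatch problem follows from the standard transmission-line reflection coefficient; the sketch below is a generic estimate (device impedances are representative values, not measurements from the project).

```python
# Impedance-mismatch illustration: a device with intrinsic impedance far above
# a 50-ohm test system reflects most of the incident power.
import math

def mismatch_loss_db(z_device_ohm, z_system_ohm=50.0):
    gamma = (z_device_ohm - z_system_ohm) / (z_device_ohm + z_system_ohm)
    transmitted_fraction = 1.0 - gamma**2
    return -10.0 * math.log10(transmitted_fraction)

# The quantum resistance h/(4e^2) ~ 6.45 kOhm sets a rough lower bound for a
# single ballistic nanotube channel; larger values are typical in practice.
for z in (6.45e3, 50e3, 500e3):
    print(f"Z_device = {z/1e3:6.1f} kOhm -> mismatch loss ~ {mismatch_loss_db(z):5.1f} dB")
```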
LDRD Project 139363 supported experiments to quantify the performance characteristics of monolithically integrated Schottky diode + quantum cascade laser (QCL) heterodyne mixers at terahertz (THz) frequencies. These integrated mixers are the first all-semiconductor THz devices to successfully incorporate a rectifying diode directly into the optical waveguide of a QCL, obviating the conventional optical coupling between a THz local oscillator and rectifier in a heterodyne mixer system. This integrated mixer was shown to function as a true heterodyne receiver of an externally received THz signal, a breakthrough which may lead to more widespread acceptance of this new THz technology paradigm. In addition, questions about QCL mode shifting in response to temperature, bias, and external feedback, and to what extent internal frequency locking can improve stability have been answered under this project.
In many regions across the nation, geologic formations are currently being used to store natural gas underground. Storage options are dictated by the regional geology and the operational need. The U.S. Department of Energy (DOE) has an interest in understanding these various geologic storage options and their advantages and disadvantages, in the hope of developing an underground facility for the storage of hydrogen as a low-cost storage option, as part of the hydrogen delivery infrastructure. Currently, depleted gas/oil reservoirs, aquifers, and salt caverns are the three main types of underground natural gas storage in use today. Other storage options available currently and in the near future, such as abandoned coal mines, lined hard rock caverns, and refrigerated mined caverns, will become more popular as the demand for natural gas storage grows, especially in regions where depleted reservoirs, aquifers, and salt deposits are not available. The storage of hydrogen in the same types of facilities currently used for natural gas may add new operational challenges for the existing cavern storage industry, such as the loss of hydrogen through chemical reactions and the occurrence of hydrogen embrittlement. Currently there are only three locations worldwide, two of which are in the United States, that store hydrogen. All three sites store hydrogen within salt caverns.
This LDRD was a Sandia Fellowship that supported Andrea Hsu's PhD research at Texas A&M University and her work as a visitor at Sandia's Combustion Research Facility. The research project at Texas A&M University concerns the experimental characterization of hypersonic (Mach > 5) flowfields using advanced diagnostics. This effort is part of a Multidisciplinary University Research Initiative (MURI) and is a collaboration between the Chemistry and Aerospace Engineering departments. Hypersonic flight conditions often lead to a state of non-thermochemical equilibrium (NTE) in air, where the timescale for reaching a single (equilibrium) Boltzmann temperature is much longer than the timescale of the flow. Certain molecular modes, such as vibrational modes, may be much more excited than the translational or rotational modes of the molecule, leading to thermal nonequilibrium. A nontrivial amount of energy is therefore contained within the vibrational mode, and this energy cascades into the flow as thermal energy, affecting flow properties through vibrational-vibrational (V-V) and vibrational-translational (V-T) energy exchanges between the flow species. The research is a fundamental experimental study of these NTE systems and involves the application of advanced laser and optical diagnostics to hypersonic flowfields. The research is broken down into two main categories: the application and adaptation of existing laser and optical techniques toward characterization of NTE, and the development of new molecular tagging velocimetry techniques, which have been demonstrated in an underexpanded jet flowfield but may be extended to a variety of flowfields. In addition, Andrea's work at Sandia National Laboratories involved the application of advanced laser diagnostics to flames and turbulent non-reacting jets. These studies included quench-free planar laser-induced fluorescence measurements of nitric oxide (NO) and mixture-fraction measurements via Rayleigh scattering.
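The lag of the vibrational mode behind the translational temperature is often introduced with the textbook Landau-Teller relaxation model; the toy integration below uses an approximate N2 characteristic temperature and invented post-shock conditions, and is not the project's analysis.

```python
# Toy Landau-Teller relaxation: vibrational energy relaxes toward its
# equilibrium value at the translational temperature with time constant tau,
# so vibration lags the flow when tau is comparable to flow timescales.
import math

THETA_V = 3371.0          # approximate N2 characteristic vibrational temperature, K

def e_vib(T):
    """Harmonic-oscillator vibrational energy per molecule, divided by k (units of K)."""
    return THETA_V / (math.exp(THETA_V / T) - 1.0)

T_trans = 2000.0          # hypothetical post-shock translational temperature, K
tau = 1.0e-4              # hypothetical relaxation time, s
dt, t_end = 1.0e-6, 5.0e-4

e = e_vib(300.0)          # vibration initially frozen at the free-stream value
t = 0.0
while t < t_end:
    e += dt * (e_vib(T_trans) - e) / tau          # Landau-Teller rate equation
    t += dt

frac = e / e_vib(T_trans)
print(f"after {t_end*1e3:.2f} ms the vibrational energy has reached {frac:.0%} of equilibrium")
```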
Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and the existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience analysis through the application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.
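One control-design approach of the kind surveyed is the linear-quadratic regulator; the sketch below applies it to a hypothetical two-state service-restoration model whose matrices are invented for illustration, not one of the report's representative infrastructure systems.

```python
# LQR sketch for a hypothetical infrastructure-recovery model:
# state = deviation of two service levels from normal, input = restoration effort.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[-0.1, 0.05],        # slow natural recovery, weak coupling
              [0.02, -0.2]])
B = np.array([[1.0],
              [0.5]])              # a single restoration resource affects both services
Q = np.eye(2)                      # penalize service shortfall
R = np.array([[0.5]])              # penalize restoration effort

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)    # optimal state-feedback gain, u = -K x
closed_loop_poles = np.linalg.eigvals(A - B @ K)
print("feedback gain K:", np.round(K, 3))
print("closed-loop eigenvalues (more negative = faster recovery):",
      np.round(closed_loop_poles, 3))
```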
The objective of this project is to lay the foundation for using ordered nanoporous materials known as metal-organic frameworks (MOFs) to create devices and sensors whose properties are determined by the dimensions of the MOF lattice. Our hypothesis is that because of the very short (tens of angstroms) distances between pores within the unit cell of these materials, enhanced electro-optical properties will be obtained when the nanopores are infiltrated to create nanoclusters of metals and other materials. Synthetic methods used to produce metal nanoparticles in disordered templates or in solution typically lead to a distribution of particle sizes. In addition, creation of the smallest clusters, with sizes of a few to tens of atoms, remains very challenging. Nanoporous metal-organic frameworks (MOFs) are a promising solution to these problems, since their long-range crystalline order creates completely uniform pore sizes with potential for both steric and chemical stabilization. We report results of synthetic efforts. First, we describe a systematic investigation of silver nanocluster formation within MOFs using three representative MOF templates. The as-synthesized clusters are spectroscopically consistent with dimensions {le} 1 nm, with a significant fraction existing as Ag{sub 3} clusters, as shown by electron paramagnetic resonance. Importantly, we show conclusively that very rapid TEM-induced MOF degradation leads to agglomeration and stable, easily imaged particles, explaining prior reports of particles larger than MOF pores. These results solve an important riddle concerning MOF-based templates and suggest that heterostructures composed of highly uniform arrays of nanoparticles within MOFs are feasible. Second, a preliminary study of methods to incorporate fulleride (K{sub 3}C{sub 60}) guest molecules within MOF pores that will impart electrical conductivity is described.
Nanoporous materials have the maximum practical surface areas for electrical charge storage; every point in an electrode is within a few atoms of an interface at which charge can be stored. Metal-electrolyte interfaces make the best use of surface area in porous materials. However, ion transport through long, narrow pores is slow. We seek to understand and optimize the tradeoff between capacity and transport. Modeling and measurements of nanoporous gold electrodes have allowed us to determine design principles, including the fact that these materials can deplete salt from the electrolyte, increasing resistance. We have developed fabrication techniques to demonstrate architectures inspired by these principles that may overcome the identified obstacles. A key concept is that electrodes should be as close together as possible; this is likely to involve an interpenetrating pore structure. However, this may prove extremely challenging to fabricate at the finest scales; a hierarchically porous structure can be a worthy compromise.
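The capacity-versus-transport tradeoff can be seen in a rough single-pore scaling argument; the sketch below uses a lumped RC estimate with hypothetical pore dimensions and electrolyte properties, not the project's model or measured data.

```python
# Rough single-pore scaling: ionic resistance grows linearly with pore length
# and so does double-layer capacitance, so the charging time constant scales
# roughly as length squared.
import math

def pore_time_constant_s(length_m, radius_m, conductivity_S_per_m, c_dl_F_per_m2):
    area = math.pi * radius_m**2
    resistance = length_m / (conductivity_S_per_m * area)              # electrolyte path
    capacitance = c_dl_F_per_m2 * 2.0 * math.pi * radius_m * length_m  # pore wall
    return resistance * capacitance

# Hypothetical parameters: 1 S/m electrolyte, 0.2 F/m^2 double layer, 20 nm pores.
for L_um in (1.0, 10.0, 100.0):
    tau = pore_time_constant_s(L_um * 1e-6, 10e-9, 1.0, 0.2)
    print(f"pore length {L_um:6.1f} um -> charging time constant ~ {tau*1e3:.3f} ms")
```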
In order to design a thermoelectric (TE) module suitable for long-term elevated-temperature use, Department 8651 has conducted parametric experiments to study material compatibility and thermal aging of TE materials. In addition, a comprehensive material characterization has been performed to examine the thermal stability of P- and N-based alloys and their interaction with interconnect diffusion barrier(s) and solder. At present, we have completed the 7-day aging experiments for 36 tiles, from ambient to 250 C. The thermal behavior of P- and N-based alloys and their thermal interaction with both Ni and Co diffusion barriers and Au-Sn solder were examined. The preliminary results show that the microstructure, texture, alloy composition, and hardness of P-(Bi,Sb){sub 2}Te{sub 3} and N-Bi{sub 2}(Te,Se){sub 3} alloys are thermally stable up to 7 days of annealing at 250 C. However, metallurgical reactions between the Ni-phosphor barriers and the P-type base alloy were evident at temperatures {ge} 175 C. At 250 C, the depth (or distance) of the metallurgical reaction and/or Ni diffusion into P-(Bi,Sb){sub 2}Te{sub 3} is approximately 10-15 {micro}m. This thermal instability makes the Ni-phosphor barrier unsuitable for use at temperatures {ge} 175 C. The Co barrier appeared to be thermally stable and compatible with P-(Bi,Sb){sub 2}Te{sub 3} at all annealing temperatures, with the exception of minor Co diffusion into the Au-Sn solder at {ge} 175 C. The effects of Co diffusion on long-term system reliability and/or the thermal stability of the Co barrier are yet to be determined. Te evaporation and its subsequent reaction with the Au-Sn solder and the Ni and Co barriers on the ends of the tiles at temperatures {ge} 175 C were evident. The Te loss and its effect on the long-term required stoichiometry of P-(Bi,Sb){sub 2}Te{sub 3} are yet to be understood. The 90-day and 180-day aging experiments are ongoing and are scheduled to be completed in 30 days and 150 days, respectively. Material characterization activities are continuing for the remaining tiles.
Thermoelectric materials have many applications in the conversion of thermal energy to electrical power and in solid-state cooling. One route to improving thermoelectric energy conversion efficiency in bulk material is to embed nanoscale inclusions. This report summarizes key results from a recently completed LDRD project exploring the science underpinning the formation and stability of nanostructures in bulk thermoelectrics and the quantitative relationships between such structures and thermoelectric properties.
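The quantitative link between material properties and conversion efficiency is usually summarized by the dimensionless figure of merit ZT = S^2*sigma*T/kappa; the sketch below evaluates it with generic order-of-magnitude values for a Bi2Te3-class alloy, used only to illustrate the calculation.

```python
# Dimensionless thermoelectric figure of merit ZT = S^2 * sigma * T / kappa.
# Property values are generic order-of-magnitude numbers for illustration.
def figure_of_merit(seebeck_V_per_K, conductivity_S_per_m, thermal_cond_W_per_mK, T_K):
    return seebeck_V_per_K**2 * conductivity_S_per_m * T_K / thermal_cond_W_per_mK

zt = figure_of_merit(seebeck_V_per_K=200e-6,      # 200 uV/K
                     conductivity_S_per_m=1.0e5,  # 1000 S/cm
                     thermal_cond_W_per_mK=1.5,
                     T_K=300.0)
print(f"ZT ~ {zt:.2f}")
```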
Materials are desperately needed for cryogenic solid state refrigeration. We have investigated nanostructured Bi-Te alloys for their potential use in Ettingshausen refrigeration to liquid nitrogen temperatures. These alloys form alternating layers of Bi{sub 2} and Bi{sub 2}Te{sub 3} blocks in equilibrium. The composition Bi{sub 4}Te{sub 3} was identified as having the greatest potential for having a high Ettingshausen figure of merit. Both single crystal and polycrystalline forms of this material were synthesized. After evaluating the Ettingshausen figure of merit for a large, high quality polycrystal, we simulated the limits of practical refrigeration in this material from 200 to 77 K using a simple device model. The band structure was also computed and compared to experiments. We discuss the crystal growth, transport physics, and practical refrigeration potential of Bi-Te alloys.
Metal films perforated with subwavelength hole arrays have been shown to demonstrate an effect known as Extraordinary Transmission (EOT). In EOT devices, optical transmission passbands arise that can have up to 90% transmission and a bandwidth that is only a few percent of the designed center wavelength. By placing a tunable dielectric in proximity to the EOT mesh, one can tune the center frequency of the passband. We have demonstrated over 1 micron of passive tuning in structures designed for an 11 micron center wavelength. If a suitable midwave (3-5 micron) tunable dielectric (perhaps BaTiO{sub 3}) were integrated with an EOT mesh designed for midwave operation, it is possible that a fast, voltage-tunable, low-temperature filter solution could be demonstrated with a several-hundred-nanometer passband. Such an element could, for example, replace certain components in a filter-wheel solution.
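The sensitivity of the passband center to the adjacent dielectric can be estimated with the standard surface-plasmon momentum-matching formula in its ideal-metal limit, lambda(1,0) ~ period x refractive index; the sketch below uses a hypothetical mesh period and index range purely for illustration, and hole-shape and finite-metal corrections would shift the result in practice.

```python
# First-order estimate of the EOT passband center for a square hole array,
# ideal-metal limit of the momentum-matching condition. Numbers are illustrative.
def eot_center_wavelength_um(period_um, n_dielectric, order=(1, 0)):
    i, j = order
    return period_um * n_dielectric / (i**2 + j**2) ** 0.5

period_um = 3.0                         # hypothetical mesh period for a midwave design
for n in (1.4, 1.5, 1.6):               # a modest index change in the adjacent dielectric
    lam = eot_center_wavelength_um(period_um, n)
    print(f"n_dielectric = {n:.2f} -> passband center ~ {lam:.2f} um")
```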