Friction and wear are major concerns in the performance and reliability of micromechanical (MEMS) devices. While a variety of lubricious and wear-resistant coatings might be considered for application to MEMS devices, the severe geometric constraints of many micromechanical systems (high aspect ratios, shadowed surfaces) render most deposition methods for friction- and wear-resistant coatings unusable. In this program we have produced and evaluated highly conformal tribological coatings, deposited by atomic layer deposition (ALD), for use on surface micromachined (SMM) and LIGA structures. ALD is a chemical vapor deposition process that uses sequential exposure of reagents and self-limiting surface chemistry, saturating at a maximum of one monolayer per exposure cycle. The self-limiting chemistry results in conformal coating of high-aspect-ratio structures with monolayer precision. ALD of a wide variety of materials is possible, but there had been no studies of the structural, mechanical, and tribological properties of these films. We have developed processes for depositing thin (<100 nm) conformal coatings of selected hard and lubricious films (Al2O3, ZnO, WS2, W, and W/Al2O3 nanolaminates), and measured their chemical, physical, mechanical, and tribological properties. A significant challenge in this program was to develop instrumentation and quantitative test procedures, which did not previously exist, for friction, wear, film/substrate adhesion, elastic properties, stress, etc., of extremely thin films and nanolaminates. New scanning probe and nanoindentation techniques have been employed, along with detailed mechanics-based models, to evaluate these properties at the small loads characteristic of microsystem operation. We emphasize deposition processes and fundamental properties of ALD materials; however, we have also evaluated applications and film performance for model SMM and LIGA devices.
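For context, the standard Oliver-Pharr nanoindentation relations from which hardness and modulus are typically extracted are sketched below; these are the textbook forms, not necessarily the detailed mechanics-based models used in the report, which must additionally account for substrate effects in sub-100 nm films. Here P_max is the peak load, h_c the contact depth, A(h_c) the projected contact area, S the initial unloading stiffness, and E_i, nu_i the indenter properties:

    H = \frac{P_{\max}}{A(h_c)}, \qquad
    S = \frac{2}{\sqrt{\pi}}\, E_r \sqrt{A(h_c)}, \qquad
    \frac{1}{E_r} = \frac{1-\nu^2}{E} + \frac{1-\nu_i^2}{E_i}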
This multinational test program is quantifying the aerosol particulates produced when a high energy density device (HEDD) impacts surrogate materials and actual spent fuel test rodlets. The experimental work, performed in four consecutive test phases, has been in progress for several years. The overall program provides needed data relevant to certain sabotage scenarios involving spent fuel transport and storage casks and the associated risk assessments. The program also provides significant political benefits through international cooperation on nuclear security evaluations. The spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of Transport and Storage Casks (WGSTSC) and supported by both the U.S. Department of Energy and the Nuclear Regulatory Commission. This report summarizes the preliminary Phase 1 work performed in 2001 and 2002 at Sandia National Laboratories and the Fraunhofer Institute, Germany, and documents the experimental results, observations, and preliminary interpretations. Phase 1 testing included: performance quantification of the HEDDs; characterization of the HEDD, or conical shaped charge (CSC), jet properties over multiple tests; refinement of the aerosol particle collection apparatus; and CSC jet-aerosol tests using leaded glass plates and glass pellets as representative brittle materials. Phase 1 testing strongly informed the design and execution of the subsequent Phase 2 test program and test apparatus.
Due to the nature of many infectious agents, such as anthrax, symptoms may take several days to manifest or may resemble those of less serious illnesses, leading to misdiagnosis. Bioterrorism attacks that release such agents are therefore particularly dangerous and potentially deadly, and a system is needed for the quick and correct identification of disease outbreaks. The Real-time Outbreak Disease Surveillance System (RODS), initially developed by Carnegie Mellon University and the University of Pittsburgh, was created to meet this need. The RODS software applies different classifiers to pertinent health surveillance data in order to determine whether or not an outbreak has occurred. In an effort to improve the ability of RODS to detect outbreaks, we incorporate a data fusion method. Data fusion improves on the results of any single classifier by combining the outputs of multiple classifiers. This paper documents the first stages of the development of a data fusion system that can combine the outputs of the classifiers included in RODS.
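To illustrate decision-level fusion of this kind, the sketch below combines binary outbreak/no-outbreak outputs with a weighted majority vote, one of the simplest fusion rules. The classifier names and weights are hypothetical placeholders, not the actual RODS classifiers or the fusion rule ultimately adopted:

    # Minimal sketch of decision-level data fusion: a weighted majority vote
    # over binary outbreak flags. Names and weights are hypothetical, not
    # those of the actual RODS classifiers.
    def fuse(votes, weights):
        """votes and weights: dicts keyed by classifier name."""
        score = sum(weights[name] * vote for name, vote in votes.items())
        return 1 if score >= 0.5 * sum(weights.values()) else 0

    votes = {"cusum": 1, "moving_average": 0, "regression": 1}
    weights = {"cusum": 0.5, "moving_average": 0.2, "regression": 0.3}
    print(fuse(votes, weights))  # -> 1: the fused decision flags an outbreak

More sophisticated fusion rules (e.g., Bayesian combination or stacking) weight each classifier by its estimated reliability rather than by fixed constants.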
Large, complex organizations (e.g., DOE labs) must achieve sustained productivity in critical operations (e.g., weapons and reactor development) while maintaining safety for involved personnel, the public, and physical assets, as well as security for property and information. This requires informed management decisions that depend on tradeoffs among factors such as the mode and extent of personnel protection, potential accident consequences, the extent of information and physical asset protection, and communication with and motivation of involved personnel. All of these interact (and potentially interfere) with each other and must be weighed against financial resources and implementation time. Existing risk analysis tools can successfully treat physical response, component failure, and routine human actions. However, many ''soft'' factors involving human motivation, and interactions among weakly related factors, have proved analytically problematic, and an effective software tool has been needed to quantify these tradeoffs and help make rational choices. The tool developed during this project facilitates improvements in safety, security, and productivity, and enables measurement of improvements as a function of resources expended. Operational safety, security, and motivation are significantly influenced by ''latent effects''--influences that are established before the outcomes they shape. One example is that an atmosphere of excessive fear can suppress open and frank disclosure, which can in turn hide problems, impede correction, and prevent lessons from being learned. Another is that a cultural mind-set of commitment, self-responsibility, and passion for an activity contributes significantly to the activity's success. This project pursued an innovative approach to quantitatively analyzing latent effects: linking the above types of factors, aggregating available information into quantitative metrics that can inform strategic management decisions, and measuring the results. The approach also evaluates the inherent uncertainties and tracks dynamics, enabling early response and assessment of developing trends. The model development is based on how factors combine and influence other factors in real time and over extended time periods. Potential strategies for improvement can be simulated and measured, and input information can be derived by quantifying qualitative information in a structured derivation process. This has proved to be a promising new approach for research and development applied to personnel performance and risk management.
Saliency detection in images is an important outstanding problem both in machine vision design and in the understanding of human vision mechanisms. Seminal work by Itti and Koch resulted in an effective saliency-detection algorithm. We reproduce the original algorithm in a software application, Vision, and explore its limitations. We propose extensions to the algorithm that promise to improve performance on difficult-to-detect objects.
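As context for the algorithm, the sketch below implements the center-surround mechanism at the core of the Itti-Koch model for the intensity channel only; the full algorithm also builds color and orientation channels from Gaussian pyramids and applies an iterative normalization operator, so this is a simplified illustration rather than the complete published method:

    # Simplified center-surround saliency (intensity channel only),
    # in the spirit of the Itti-Koch model.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def saliency(intensity, center_sigmas=(2, 4), surround_ratio=4):
        sal = np.zeros_like(intensity, dtype=float)
        for sigma in center_sigmas:
            center = gaussian_filter(intensity, sigma)
            surround = gaussian_filter(intensity, sigma * surround_ratio)
            fmap = np.abs(center - surround)   # center-surround contrast
            if fmap.max() > 0:
                fmap /= fmap.max()             # crude map normalization
            sal += fmap
        return sal / len(center_sigmas)

    img = np.zeros((128, 128))
    img[60:68, 60:68] = 1.0                    # a lone bright square pops out
    print(np.unravel_index(saliency(img).argmax(), img.shape))

Low-contrast targets in cluttered scenes, which produce weak feature maps that are suppressed by the normalization step, are one example of the difficult-to-detect objects for which the original algorithm struggles.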
A previously developed experimental facility has been used to determine gas-surface thermal accommodation coefficients from the pressure dependence of the heat flux between parallel plates of similar material but different surface finish. Heat flux between the plates is inferred from measurements of the temperature drop between each plate surface and an adjacent temperature-controlled water bath. Thermal accommodation coefficients were determined from the pressure dependence of the heat flux at a fixed plate separation. Measurements for argon and nitrogen in contact with standard machined (lathed) and polished 304 stainless steel plates are indistinguishable within experimental uncertainty. Thus, the accommodation coefficients of 304 stainless steel with nitrogen and argon are estimated to be 0.80 ± 0.02 and 0.87 ± 0.02, respectively, independent of surface roughness within the range likely to be encountered in engineering practice. Measurements of the accommodation of helium showed a slight variation with 304 stainless steel surface roughness: 0.36 ± 0.02 for a standard machine finish and 0.40 ± 0.02 for a polished finish. Planned tests with carbon-nanotube-coated plates will be performed when 304 stainless steel blanks have been successfully coated.
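For context, a minimal sketch of the standard reduction, assuming a Sherman-type interpolation between the continuum and free-molecular limits (the symbols and the free-molecular prefactor are generic placeholders here, not necessarily the working equations used in the analysis):

    \frac{1}{q(p)} \approx \frac{1}{q_c} + \frac{1}{q_{fm}(p)}, \qquad
    q_c = \frac{\kappa\,\Delta T}{d}, \qquad
    q_{fm}(p) = \frac{\alpha}{2-\alpha}\,\Lambda\,p\,\Delta T,

where κ is the continuum thermal conductivity, d the plate separation, and Λ a known gas-dependent kinetic factor. Because only the free-molecular term depends on pressure, fitting the measured q(p) isolates the accommodation coefficient α.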
Two different Sandia MEMS devices have been tested in a high-g environment to determine their performance and survivability. The first test used a drop table to produce a peak acceleration of 1792 g over a period of 1.5 ms. For the second test, the MEMS devices were assembled into a gun-fired penetrator and shot into a cement target at the Army Waterways Experiment Station in Vicksburg, Mississippi. This test produced a peak acceleration of 7191 g for a duration of 5.5 ms. The MEMS devices were instrumented with the MEMS Diagnostic Extraction System (MDES), which drives the devices and records their output during the high-g event, providing in-flight data for assessing device performance. A total of six devices were monitored during the experiments: four mechanical non-volatile memory devices (MNVM) and two Silicon Reentry Switches (SiRES). All six devices functioned properly before, during, and after each high-g test without a single failure. This is the first known test under flight conditions of an active, powered MEMS device at Sandia.
Optoelectronic microsystems are increasingly prevalent as researchers seek to increase transmission bandwidths, implement electrical isolation, enhance security, or take advantage of sensitive optical sensing methods. Board-level photonic integration techniques continue to improve, but photonic microsystems and fiber interfaces remain problematic, especially as size is reduced. Optical fiber is unmatched as a transmission medium for distances ranging from tens of centimeters to kilometers. The difficulty with optical fiber is the small size of the core (approximately 9 µm for single-mode telecommunications fiber) and the tight requirements on spot size and input numerical aperture (NA). Coupling to devices such as vertical cavity surface emitting lasers (VCSELs) and photodetectors presents further difficulties, since these elements work in a plane orthogonal to the electronics board and typically require additional optics. This leads to the need for a packaging solution that can incorporate dissimilar materials while maintaining the tight alignment tolerances required by the optics. Over the course of this LDRD project, we examined the capabilities of components such as VCSELs and photodetectors for high-speed operation and investigated the alignment tolerances required by the optical system. A solder reflow process has been developed to help fulfill these packaging requirements, and the results of that work are presented here.
This report examines a number of hardware circuit design issues associated with implementing certain functions in FPGA and ASIC technologies. We show circuit designs for AES and SHA-1 that have an extremely small hardware footprint yet exhibit reasonably good performance compared to state-of-the-art designs in the literature. Our AES performance numbers are driven by an optimized composite-field S-box design for the Stratix FPGA family. Our SHA-1 designs use the register packing and feedback capabilities of the Stratix logic element (LE), which reduce logic element usage by as much as 72% compared to other SHA-1 designs.
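For context on what is being mapped into hardware, the sketch below is the standard published SHA-1 compression function for one 512-bit block; the per-round Boolean functions, rotations, and working-variable shift register are the structures that a compact design packs into logic elements. This is a software reference of the algorithm itself, not the report's circuit:

    # Standard SHA-1 compression of a single 64-byte block (for context on
    # the round structure mapped to hardware; not the report's design).
    import struct

    def rol(x, n):
        return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

    def sha1_block(state, block):
        w = list(struct.unpack(">16I", block))
        for t in range(16, 80):                  # message schedule expansion
            w.append(rol(w[t-3] ^ w[t-8] ^ w[t-14] ^ w[t-16], 1))
        a, b, c, d, e = state
        for t in range(80):                      # four 20-round stages
            if t < 20:   f, k = (b & c) | (~b & d),          0x5A827999
            elif t < 40: f, k = b ^ c ^ d,                   0x6ED9EBA1
            elif t < 60: f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
            else:        f, k = b ^ c ^ d,                   0xCA62C1D6
            a, b, c, d, e = (rol(a, 5) + f + e + k + w[t]) & 0xFFFFFFFF, \
                            a, rol(b, 30), c, d
        return [(s + v) & 0xFFFFFFFF for s, v in zip(state, [a, b, c, d, e])]

    iv = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
    msg = b"abc" + b"\x80" + b"\x00" * 52 + struct.pack(">Q", 24)  # padded "abc"
    print("".join(f"{v:08x}" for v in sha1_block(iv, msg)))
    # -> a9993e364706816aba3e25717850c26c9cd0d89d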
While isentropic compression experiment (ICE) techniques have proved useful in deducing the high-pressure compressibility of a wide range of materials, they have encountered difficulties where large-volume phase transitions exist. The present study sought to apply graded-density impactor methods, which produce isentropic loading in planar impact experiments, to selected problems of this kind. Cerium was chosen because it exhibits a 20% compression between 0.7 and 1.0 GPa. A model was constructed based on limited earlier dynamic data and applied to the design of a suite of experiments, and a capability for handling this material was installed. Two experiments were executed using shock/reload techniques with available samples, loading initially to near the gamma-alpha transition and then reloading. In addition, two graded-density impactor experiments were conducted with alumina. A method for interpreting ICE data was developed and validated; it uses a wavelet construction for the ramp wave and includes corrections for the ''diffraction'' of wavelets by releases or reloads reflected from the sample/window interface. Alternate methods for constructing graded-density impactors are discussed.
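For reference, these are the incremental relations that underlie ramp-wave (Lagrangian) analyses of this kind, shown in their textbook simple-wave form rather than the report's wavelet formulation; c_L(u) is the Lagrangian wave speed at particle velocity u and ρ0 the initial density:

    c_L(u) = \left.\frac{\Delta x}{\Delta t}\right|_{u}, \qquad
    \sigma(u) = \rho_0 \int_0^{u} c_L(u')\,du', \qquad
    \frac{\rho_0}{\rho(u)} = 1 - \int_0^{u} \frac{du'}{c_L(u')}.

Waves reflected from the sample/window interface violate the simple-wave assumption and perturb the apparent wave speeds; the wavelet ''diffraction'' corrections described above account for exactly these release and reload interactions.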
Water is the critical natural resource of the new century. Significant improvements in traditional water treatment processes require novel approaches based on a fundamental understanding of nanoscale and atomic interactions at interfaces between aqueous solutions and materials. To better understand these critical issues and to promote an open dialog among leading international experts in water-related specialties, Sandia National Laboratories sponsored a workshop on April 24-26, 2005, in Santa Fe, New Mexico. The ''Frontiers of Interfacial Water Research Workshop'' provided attendees with a critical review of water technologies and emphasized the new advances in surface and interfacial microscopy, spectroscopy, diffraction, and computer simulation needed to develop new materials for water treatment.
Recent interest in reprocessing nuclear fuel in the U.S. has led to advanced separations processes that employ continuous processing and multiple extraction steps. These advanced plants will need to be designed with state-of-the-art instrumentation for materials accountancy and control. This research surveys current and upcoming instrumentation for nuclear materials accountancy to identify the instruments most suited to the reprocessing environment. Though this topic has received attention time and again in the past, new technologies and changing world conditions require a renewed look at this subject. The needs of the advanced UREX+ separations concept are first identified, and then a literature review of current and upcoming measurement techniques is presented. The report concludes with a preliminary list of recommended instruments and measurement locations.
This report summarizes LDRD project 91312, titled ''Binary Electrokinetic Separation of Target DNA from Background DNA Primers''. This work is the first product of a collaboration with Columbia University and the Northeast BioDefense Center of Excellence. In conjunction with Ian Lipkin's lab, we are developing a technique to reduce false positive events, caused by the detection of unhybridized reporter molecules, in a sensitive and multiplexed nucleic acid detection scheme developed by the Lipkin lab; such false positives are the most significant problem in the operation of their capability. As the Lipkin lab is developing tools for rapidly detecting the entire panel of hemorrhagic fevers, this technology will immediately serve an important national need. The goal of this work was to separate nucleic acid from a preprocessed sample. We demonstrated preconcentration of kilobase-pair-length double-stranded DNA targets and observed little preconcentration of 60-base-pair-length single-stranded DNA probes. These objectives were accomplished in microdevice formats compatible with larger detection systems for sample pre-processing. Combined with Columbia's expertise, this technology would enable a unique, fast, and potentially compact method for detecting and identifying genetically modified organisms and for multiplexed rapid nucleic acid identification. A competing approach, the DARPA-funded IRIS Pharmaceutical TIGER platform, requires many hours to run on an $800k instrument that fills a room. The Columbia/SNL system could provide a result in 30 minutes, at a platform cost of a few thousand dollars, in a package the size of a shoebox or smaller.
This report documents the investigation regarding the failure of CPVC piping that was used to connect a solar hot water system to standard plumbing in a home. Details of the failure are described along with numerous pictures and diagrams. A potential failure mechanism is described and recommendations are outlined to prevent such a failure.
Political borders are controversial and contested spaces. In an attempt to better understand movement along and through political borders, this project applied the metaphor of a membrane to examine how people, ideas, and things ''move'' through a border. More specifically, the research team employed this metaphor within a system dynamics framework to construct a computer model assessing legal and illegal migration on the US-Mexico border. Employing a metaphor can be helpful, as it was in this project, for gaining different perspectives on a complex system. In addition to the metaphor, the multidisciplinary team used an array of methods to gather data, including traditional literature searches, an experts workshop, a focus group, interviews, and the expertise of individuals on the research team. Results from the qualitative efforts revealed strong social as well as economic drivers that motivate individuals to cross the border legally. Based on the information gathered, the team concluded that legal migration dynamics were beyond the scope we wished to consider; hence, available demographic models sufficiently capture migration at the local level. Results from both the quantitative and qualitative data searches were used to modify a 1977 border model to demonstrate the dynamic nature of illegal migration. Model runs reveal that current U.S. policies based on neoclassical economic theory have proven ineffective in curbing illegal migration, and that proposed enforcement policies are also likely to be ineffective. We suggest, based on model results, that improvement in economic conditions within Mexico may have the biggest impact on illegal migration to the U.S. The modeling also supports views expressed in the current literature suggesting that demographic and economic changes within Mexico are likely to slow illegal migration by 2060 without special interventions by either government.
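To make the system dynamics approach concrete, the sketch below shows the kind of stock-and-flow structure such a model uses, integrated with a simple Euler scheme. The stocks, flow equations, and parameter values are illustrative placeholders only, not those of the 1977 model or the team's modification:

    # Illustrative stock-and-flow migration sketch (system dynamics style).
    # All stocks, flows, and parameters are hypothetical placeholders.
    def simulate(years=50, dt=0.25):
        migrants = 1.0e6      # stock: migrant population (persons)
        wage_gap = 4.0        # U.S./Mexico wage ratio (dimensionless)
        for _ in range(int(years / dt)):
            inflow = 1.0e5 * (wage_gap - 1.0)          # economic pull flow
            outflow = 0.05 * migrants                  # return-migration flow
            migrants += (inflow - outflow) * dt
            wage_gap = max(1.0, wage_gap - 0.02 * dt)  # slow wage convergence
        return migrants

    print(f"migrant stock after 50 years: {simulate():.3e}")

Even this toy structure shows how the equilibrium stock tracks the economic pull term, consistent with the model result that improving economic conditions within Mexico has the biggest impact on illegal migration.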
Current Joint Test Assembly (JTA) neutron monitors rely on knock-on proton detectors that are susceptible to X-rays and low-energy gamma rays. We investigated two novel plastic scintillating fiber directional neutron detector prototypes. The first prototype used a fiber whose width was less than 2.1 mm, the range of a proton in the plastic; the difference in the distribution of recoil proton energy deposited in the fiber was used to determine the incident neutron direction. The second prototype measured both the recoil proton energy and direction, determining the neutron direction from the kinematics of single neutron-proton scatters. This report describes the development and performance of these detectors.
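For context, the single-scatter kinematics the second prototype exploits: in non-relativistic elastic neutron-proton scattering (essentially equal masses), the recoil proton energy is

    E_p = E_n \cos^2\theta,

where E_n is the incident neutron energy and θ is the angle between the incident neutron direction and the recoil proton track. Measuring the recoil proton's energy and direction therefore constrains the incident neutron's direction, and with a known source energy the measurement is over-constrained, providing a consistency check.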
A turbulence model for buoyant flows has been developed in the context of a k-ε turbulence modeling approach. A production term is added to the turbulent kinetic energy equation, based on dimensional reasoning, using an appropriate time scale for buoyancy-induced turbulence taken from the vorticity conservation equation. The resulting turbulence model is calibrated against far-field helium-air spread-rate data and validated with near-source, strongly buoyant helium plume data sets. For these strongly buoyant flows, the model is more numerically stable and gives better predictions over a much broader range of mesh densities than the standard k-ε model.
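For comparison, the conventional buoyancy-modified k equation uses a gradient-diffusion production term; in standard notation,

    \frac{Dk}{Dt} = P_k + G_b - \varepsilon + \nabla\cdot\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\nabla k\right], \qquad
    G_b = -\frac{\nu_t}{\rho\,\mathrm{Pr}_t}\, g_i \frac{\partial \rho}{\partial x_i},

where P_k is shear production and Pr_t a turbulent Prandtl number. The model described here differs in that its added production term is constructed, by dimensional reasoning, from a buoyancy time scale taken from the vorticity conservation equation rather than from this gradient-diffusion form.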
Because of the inevitable depletion of fossil fuels and the corresponding release of carbon to the environment, the global energy future is complex. Some of the consequences may be politically and economically disruptive, and expensive to remedy. For the next several centuries, fuel requirements will increase with population, land use, and ecosystem degradation. Current or projected levels of aggregate energy resource use will not sustain civilization as we know it beyond a few more generations. At the same time, issues of energy security, reliability, sustainability, recoverability, and safety need attention. We supply a top-down, qualitative model--the surety model--for balancing expenditures of limited resources to assure success while avoiding catastrophic failure. Looking at U.S. energy challenges from a surety perspective offers new insights into possible strategies for developing solutions. The energy surety model, with its focus on the attributes of security and sustainability, could be extrapolated to a global energy system using a more comprehensive model than that used here; in fact, the success of the energy surety strategy ultimately requires a more global perspective. We use a 200-year time frame for sustainability because projecting farther into the future would almost certainly miss the advent and maturation of new technologies or the changing needs of society.
Uniprocessor Performance Analysis of a Representative Workload of Sandia National Laboratories' Scientific Applications. Master of Science in Electrical Engineering, New Mexico State University, Las Cruces, New Mexico, 2005. Dr. Jeanine Cook, Chair. Throughout the last decade, computer performance analysis has become essential to maximizing the performance of some workloads. Sandia National Laboratories (SNL), located in Albuquerque, New Mexico, is no exception: achieving maximum performance of its large, parallel scientific workloads requires performance analysis at the uniprocessor level. A representative workload has been chosen as the basis of a computer performance study to determine optimal processor characteristics and thereby better specify the next generation of supercomputers. Cube3, a finite element test problem developed at SNL, is representative of SNL's scientific workloads. This workload has been studied at the uniprocessor level to identify microarchitectural characteristics that will lead to overall performance improvement at the multiprocessor level. The goal of studying this workload at the uniprocessor level is to build a performance prediction model that will be integrated into a multiprocessor performance model currently under development at SNL. Using performance counters on the Itanium 2 microarchitecture, performance statistics are studied to determine bottlenecks in the microarchitecture and/or changes in the application code that will maximize performance. Source code analysis identified a performance-degrading loop kernel, and through the use of compiler optimizations a performance gain of around 20% was achieved.