Potential Site #6 Flow Model - FEHM Software
Abstract not provided.
Abstract not provided.
Optical Engineering
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Proposed for publication in Physical Review Letters.
Abstract not provided.
Abstract not provided.
This report summarizes research performed at Sandia National Laboratories (SNL) in collaboration with the Environmental Protection Agency (EPA) to assess microarray quality on arrays from two platforms of interest to the EPA. Custom microarrays from two novel, commercially produced array platforms were imaged with SNL's unique hyperspectral imaging technology, and multivariate data analysis was performed to investigate sources of emission on the arrays. No extraneous sources of emission were evident in any of the array areas scanned. This led to the conclusion that either of these array platforms could produce high-quality, reliable microarray data for the EPA toxicology programs. Hyperspectral imaging results are presented, and recommendations for microarray analyses using these platforms are detailed within the report.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Raman spectroscopic imaging is a powerful technique for visualizing chemical differences within a variety of samples based on the interaction of a substance's molecular vibrations with laser light. While Raman imaging can provide a unique view of samples such as residual stress within silicon devices, chemical degradation, material aging, and sample heterogeneity, the Raman scattering process is often weak and thus requires very sensitive collection optics and detectors. Many commercial instruments (including ones owned here at Sandia National Laboratories) generate Raman images by raster scanning a point-focused laser beam across a sample--a process which can expose a sample to extreme levels of laser light and requires lengthy acquisition times. Our previous research efforts have led to the development of a state-of-the-art two-dimensional hyperspectral imager for fluorescence imaging applications such as microarray scanning. This report details the design, integration, and characterization of a line-scan Raman imaging module added to this efficient hyperspectral fluorescence microscope. The original hyperspectral fluorescence instrument serves as the framework for excitation and sample manipulation for the Raman imaging system, while a more appropriate axial transmissive Raman imaging spectrometer and detector are utilized for collection of the Raman scatter. The result is a unique and flexible dual-modality fluorescence and Raman imaging system capable of high-speed imaging at high spatial and spectral resolutions. Care was taken throughout the design and integration process not to hinder any of the fluorescence imaging capabilities. For example, an operator can switch between the fluorescence and Raman modalities without need for extensive optical realignment. The instrument performance has been characterized and sample data are presented.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Proposed for publication in the Journal of The Electrochemical Society.
Abstract not provided.
Abstract not provided.
Advances in the Astronautical Sciences
Physical mechanisms responsible for single-event effects are reviewed, concentrating on silicon MOS devices and digital integrated circuits. A brief historical overview of single-event effects in space and terrestrial systems is given. Single-event upset mechanisms in SRAMs are briefly described, as is the initiation of single-event latchup in CMOS structures. Techniques for mitigating single-event effects are described, including the impact of technology trends on mitigation efficacy. Future challenges are briefly explored.
Proceedings of SPIE - The International Society for Optical Engineering
Relatively small motion measurement errors manifest themselves principally as a phase error in Synthetic Aperture Radar (SAR) complex data samples, and if large enough become observable as a smearing, blurring, or other degradation in the image. The phase error function can be measured and then deconvolved from the original data to compensate for the presumed motion error, ultimately resulting in a well-focused image. Techniques that do this are termed "autofocus" algorithms. A very popular autofocus algorithm is the Phase Gradient Autofocus (PGA) algorithm. The nearly universal, and typically reasonable, assumption is that the motion errors are less than the range resolution of the radar, allowing solely a phase correction to suffice. Very large relative motion measurement errors manifest themselves as an unexpected additional shifting or migration of target locations beyond any deterministic migration during the course of the synthetic aperture. Degradation in images from data exhibiting errors of this magnitude is substantial, often rendering the image completely useless. When residual range migration due to either real or apparent motion errors exceeds the range resolution, conventional autofocus algorithms fail. Excessive residual migration is increasingly encountered as resolutions become finer, less expensive inertial sensors are used, and operating ranges become longer (due to atmospheric phenomena). A new migration-correction autofocus algorithm has been developed that estimates the excessive residual migration and applies phase and frequency corrections to properly focus the image. This overcomes the conventional constraint that motion errors not exceed the SAR range resolution.
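Since PGA is named as the canonical approach, a minimal sketch of its core iteration may help fix ideas. This is a generic textbook-style implementation, not Sandia's algorithm or the new migration-correction method; the simplified kernel (no windowing step) and all names are illustrative assumptions.

    import numpy as np

    def pga_autofocus(img, iters=10):
        """Minimal Phase Gradient Autofocus sketch.
        img: 2-D complex SAR image, rows = range bins, cols = azimuth samples.
        Returns a refocused image and the estimated phase error (radians)."""
        g = img.copy()
        n_az = g.shape[1]
        phi = np.zeros(n_az)
        for _ in range(iters):
            # 1. Circularly shift the brightest scatterer in each range bin to center.
            centered = np.empty_like(g)
            for r in range(g.shape[0]):
                k = np.argmax(np.abs(g[r]))
                centered[r] = np.roll(g[r], n_az // 2 - k)
            # 2. Transform to the azimuth phase-history domain.
            G = np.fft.fftshift(np.fft.fft(centered, axis=1), axes=1)
            # 3. Estimate the phase-error gradient, averaged over range bins.
            num = np.sum(np.imag(np.conj(G[:, :-1]) * G[:, 1:]), axis=0)
            den = np.sum(np.abs(G[:, :-1]) ** 2, axis=0)
            dphi = num / np.maximum(den, 1e-12)
            est = np.concatenate(([0.0], np.cumsum(dphi)))
            # Remove the linear trend so the correction does not shift the image.
            est -= np.polyval(np.polyfit(np.arange(n_az), est, 1), np.arange(n_az))
            phi += est
            # 4. Remove the estimated phase error and return to the image domain.
            H = np.fft.fftshift(np.fft.fft(g, axis=1), axes=1)
            g = np.fft.ifft(np.fft.ifftshift(H * np.exp(-1j * est), axes=1), axis=1)
        return g, phi

Note that the correction is purely a phase multiply in the aperture domain, which is exactly why PGA breaks down once residual migration exceeds a range bin.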
Proceedings of SPIE - The International Society for Optical Engineering
Sandia National Laboratories designs and builds Synthetic Aperture Radar (SAR) systems capable of forming high-quality, exceptionally fine-resolution images. During the spring of 2004 a series of test flights was completed with a Ka-band testbed SAR on Sandia's DeHavilland DHC-6 Twin Otter aircraft. A large data set was collected, including real-time fine-resolution images of a variety of target scenes. This paper offers a sampling of high-quality images representative of the output of Sandia's Ka-band testbed radar with resolutions as fine as 4 inches. Images are annotated with descriptions of collection geometries and other relevant image parameters.
Proceedings of SPIE - The International Society for Optical Engineering
Airborne synthetic aperture radar (SAR) imaging systems have reached a degree of accuracy and sophistication that requires the validity of the free-space approximation for radio-wave propagation to be questioned. Based on the thin-lens approximation, a closed-form model for the focal length of a gravity wave-modulated refractive-index interface in the lower troposphere is developed. The model corroborates the suggestion that mesoscale, quasi-deterministic variations of the clear-air radio refractive-index field can cause diffraction patterns on the ground that are consistent with reflectivity artifacts occasionally seen in SAR images, particularly in those collected at long ranges, short wavelengths, and small grazing angles.
Proceedings of SPIE - The International Society for Optical Engineering
An unattended ground sensor (UGS) that attempts to perform target identification without providing some corresponding estimate of confidence level is of limited utility. In this context, a confidence level is a measure of probability that the detected vehicle is of a particular target class. Many identification methods attempt to match features of a detected vehicle to each of a set of target templates. Each template is formed empirically from features collected from vehicles known to be members of the particular target class. The nontarget class is inherent in this formulation and must be addressed in providing a confidence level. Often, it is difficult to adequately characterize the nontarget class empirically by feature collection, so assumptions must be made about the nontarget class. An analyst tasked with deciding how to use the confidence level of the classifier decision should have an accurate understanding of the meaning of the confidence level given. This paper compares several definitions of confidence level by considering the assumptions that are made in each, how these assumptions affect the meaning, and giving examples of implementing them in a practical acoustic UGS.
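One concrete definition of confidence level, a Bayesian posterior over the target classes plus an explicitly assumed nontarget likelihood, can be written down directly. The sketch below (Python; Gaussian templates and a user-supplied nontarget density; all names hypothetical, not the paper's implementation) shows where the nontarget-class assumption enters the number an analyst sees.

    import numpy as np

    def confidence_levels(x, templates, nontarget_density, priors):
        """Posterior confidence that feature vector x belongs to each class.
        templates: dict class -> (mean, cov) Gaussian fit from training features.
        nontarget_density: assumed likelihood of x under the nontarget class
          (a modeling assumption, since nontarget data is hard to collect).
        priors: dict class -> prior probability (must include 'nontarget')."""
        likelihoods = {}
        for cls, (mu, cov) in templates.items():
            d = x - mu
            inv = np.linalg.inv(cov)
            norm = np.sqrt((2 * np.pi) ** len(x) * np.linalg.det(cov))
            likelihoods[cls] = np.exp(-0.5 * d @ inv @ d) / norm
        likelihoods['nontarget'] = nontarget_density
        total = sum(priors[c] * likelihoods[c] for c in likelihoods)
        return {c: priors[c] * likelihoods[c] / total for c in likelihoods}

Different choices for nontarget_density (uniform over the feature space, a diffuse Gaussian, etc.) change the reported confidence substantially, which is the kind of comparison the paper describes.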
Physical Review B - Condensed Matter and Materials Physics
Heteroepitaxial growth of GeSi alloys on Si (001) under deposition conditions that partially limit surface mobility leads to an unusual form of strain-induced surface morphological evolution. We discuss a kinetic growth regime wherein pits form in a thick metastable wetting layer and, with additional deposition, evolve to a quantum dot molecule, a symmetric assembly of four quantum dots bound by the central pit. We discuss the size selection and scaling of quantum dot molecules. We then examine the key mechanism, preferred pit formation, in detail, using ex situ atomic force microscopy, in situ scanning tunneling microscopy, and kinetic Monte Carlo simulations. A picture emerges wherein localized pits appear to arise from a damped instability. When pits are annealed, they extend into an array of highly anisotropic surface grooves via a one-dimensional growth instability. Subsequent deposition on this grooved film results in a fascinating structure where compact quantum dots and molecules, as well as highly ramified quantum wires, are all simultaneously self-assembled. © 2005 The American Physical Society.
Physical Review Letters
We track individual twin boundaries in Ag films on Ru(0001) using low-energy electron microscopy. The twin boundaries, which separate film regions whose close-packed planes are stacked differently, move readily during film growth but relatively little during annealing. The growth-driven motion of twin boundaries occurs as film steps advance across the surface: as a new atomic Ag layer reaches an fcc twin boundary, the advancing step edge carries along the boundary. This coupling of the microstructural defect (twin boundary) and the surface step during growth can produce film regions over 10 μm wide that are twin free. © 2005 The American Physical Society.
Proceedings of SPIE - The International Society for Optical Engineering
The shape control of thin, flexible structures has been studied primarily for edge-supported thin plates. For applications involving reconfigurable apertures such as membrane optics and active RF surfaces, corner-supported configurations may prove more applicable. Corner-supported adaptive structures allow for parabolic geometries, greater flexibility, and larger achievable deflections when compared to edge-supported geometries under similar actuation conditions. Preliminary models have been developed for corner-supported thin plates actuated by isotropic piezoelectric actuators. However, typical piezoelectric materials are known to be orthotropic. This paper extends a previously-developed isotropic model for a corner-supported, thin, rectangular bimorph to a more general orthotropic model for a bimorph actuated by a two-dimensional array of segmented PVDF laminates. First, a model determining the deflected shape of an orthotropic laminate for a given distribution of voltages over the actuator array is derived. Second, symmetric actuation of a bimorph consisting of orthotropic material is simulated using orthogonally-oriented laminae. Finally, the results of the model are shown to agree well with layered-shell finite element simulations for simple and complex voltage distributions.
Proposed for publication in Nano Letters.
Abstract not provided.
Journal of the American Ceramic Society
The ability to predict and control the organic decomposition of a material under arbitrary thermal treatments is one of the main objectives of thermogravimetric studies. This ability offers significant potential for ensuring the reliability and reproducibility of a given processing method and can be used in planning optimized thermal treatment strategies. In this report, the master sintering curve theory is extended to similar kinetically controlled phenomena: it is applied to organic decomposition reaction kinetics to develop a master organic decomposition curve. The fundamental kinetics are assumed to be governed by an Arrhenius-type reaction rate, making master sintering and decomposition curves analogous to one another. The formulation and construction of a master decomposition curve are given in this paper. Simultaneous thermogravimetric and differential thermal analysis of a low-temperature co-fire glass/ceramic dielectric tape (DuPont 951 Green Tape™) is analyzed and used to demonstrate this new concept. The results reveal two independent organic decomposition reactions, the first occurring at ≈245°C and the second at ≈365°C. The analysis is used to produce a master decomposition curve and to calculate the activation energies for these reactions, 86 ± 6 and 142 ± 4 kJ/mol, respectively. In addition, the weight loss of product and the rate of decomposition can be predicted under varying thermal paths (time-temperature trajectories) following a minimal set of preliminary experiments. © 2005 The American Ceramic Society.
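A hedged sketch of the governing relation, following the master-sintering-curve convention and assuming a single Arrhenius-controlled reaction (the symbols below are standard notation, not taken from the paper):

    \alpha(t) = F\big(\Theta\big), \qquad
    \Theta(t, T(t)) = \int_0^{t} \frac{1}{T(t')}\,
        \exp\!\left(-\frac{Q}{R\,T(t')}\right)\, dt'

Here α is the decomposed (weight-loss) fraction, F is the master curve constructed from a minimal set of runs at different heating rates, Q is the apparent activation energy (86 ± 6 and 142 ± 4 kJ/mol for the two reactions observed here), R is the gas constant, and T(t') is the time-temperature trajectory. Retaining the 1/T prefactor follows the master-sintering-curve analogy; a bare Arrhenius integral is an equally common convention.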
This report describes the test and evaluation methods by which the Teraflops Operating System, or TOS, that resides on Sandia's massively-parallel computer Janus is verified for production release. Also discussed are methods used to build TOS before testing and evaluating, miscellaneous utility scripts, a sample test plan, and a proposed post-test method for quickly examining the large number of test results. The purpose of the report is threefold: (1) to provide a guide to T&E procedures, (2) to aid and guide others who will run T&E procedures on the new ASCI Red Storm machine, and (3) to document some of the history of evaluation and testing of TOS. This report is not intended to serve as an exhaustive manual for testers to conduct T&E procedures.
Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
Proposed for publication in Applied Geochemistry.
Abstract not provided.
Tensile and compressive stress-strain experiments on metals at strain rates in the range of 1-1000 s⁻¹ are relevant to many applications such as gravity-dropped munitions and airplane accidents. While conventional test methods cover strain rates up to approximately 10 s⁻¹ and split-Hopkinson and other techniques cover strain rates in excess of approximately 1000 s⁻¹, there are no well-defined techniques for the intermediate or "Sub-Hopkinson" strain-rate regime. The current work outlines many of the challenges of testing in the Sub-Hopkinson regime and establishes methods for addressing these challenges. The resulting technique for obtaining intermediate-rate stress-strain data is demonstrated in tension on a high-strength, high-toughness steel alloy (Hytuf) that could be a candidate alloy for earth-penetrating munitions, and in compression on a Au-Cu braze alloy.
We conducted broadband absorption measurements of atmospheric water vapor in the ground state, X ¹A₁ (000), from 0.4 to 2.7 THz with a pressure-broadening-limited resolution of 6.2 GHz using pulsed terahertz time-domain spectroscopy (THz-TDS). We measured a total of seventy-two absorption lines, and forty-nine lines were identified as H₂¹⁶O resonances. All the H₂¹⁶O lines identified were confirmed by comparing their center frequencies to experimental values available in the literature.
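As a sketch of how an absorption spectrum is extracted from THz-TDS traces (a standard thin-absorber reduction that neglects Fresnel and etalon effects; function and variable names are illustrative, not the authors' code):

    import numpy as np

    def absorption_spectrum(t, e_ref, e_sam, path_m):
        """Power absorption coefficient from THz-TDS time traces.
        t: sample times (s); e_ref/e_sam: reference and sample fields;
        path_m: absorption path length (m). Returns (frequency Hz, alpha 1/m)."""
        f = np.fft.rfftfreq(len(t), d=t[1] - t[0])   # frequency axis
        E_ref = np.fft.rfft(e_ref)                    # reference spectrum
        E_sam = np.fft.rfft(e_sam)                    # sample spectrum
        alpha = -(2.0 / path_m) * np.log(np.abs(E_sam) / np.abs(E_ref))
        return f, alpha

    # Line centers would then be picked from local maxima of alpha(f) and
    # compared against tabulated H2O resonance frequencies.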
Friction and wear are major concerns in the performance and reliability of micromechanical (MEMS) devices. While a variety of lubricant and wear-resistant coatings are known which we might consider for application to MEMS devices, the severe geometric constraints of many micromechanical systems (high aspect ratios, shadowed surfaces) make most deposition methods for friction- and wear-resistant coatings impossible. In this program we have produced and evaluated highly conformal tribological coatings, deposited by atomic layer deposition (ALD), for use on surface micromachined (SMM) and LIGA structures. ALD is a chemical vapor deposition process using sequential exposure of reagents and self-limiting surface chemistry, saturating at a maximum of one monolayer per exposure cycle. The self-limiting chemistry results in conformal coating of high-aspect-ratio structures with monolayer precision. ALD of a wide variety of materials is possible, but there have been no studies of the structural, mechanical, and tribological properties of these films. We have developed processes for depositing thin (<100 nm) conformal coatings of selected hard and lubricious films (Al₂O₃, ZnO, WS₂, W, and W/Al₂O₃ nanolaminates), and measured their chemical, physical, mechanical, and tribological properties. A significant challenge in this program was to develop instrumentation and quantitative test procedures, which did not exist, for friction, wear, film/substrate adhesion, elastic properties, stress, etc., of extremely thin films and nanolaminates. New scanning probe and nanoindentation techniques have been employed along with detailed mechanics-based models to evaluate these properties at the small loads characteristic of microsystem operation. We emphasize deposition processes and fundamental properties of ALD materials; however, we have also evaluated applications and film performance for model SMM and LIGA devices.
This multinational test program is quantifying the aerosol particulates produced when a high energy density device (HEDD) impacts surrogate material and actual spent fuel test rodlets. The experimental work, performed in four consecutive test phases, has been in progress for several years. The overall program provides needed data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments. This program also provides significant political benefits in international cooperation for nuclear security related evaluations. The spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of Transport and Storage Casks (WGSTSC), and supported by both the U.S. Department of Energy and Nuclear Regulatory Commission. This report summarizes the preliminary, Phase 1 work performed in 2001 and 2002 at Sandia National Laboratories and the Fraunhofer Institute, Germany, and documents the experimental results obtained, observations, and preliminary interpretations. Phase 1 testing included: performance quantification of the HEDDs; characterization of the HEDD or conical shaped charge (CSC) jet properties with multiple tests; refinement of the aerosol particle collection apparatus being used; and CSC jet-aerosol tests using leaded glass plates and glass pellets, serving as representative brittle materials. Phase 1 testing was quite important for the design and performance of the following Phase 2 test program and test apparatus.
Due to the nature of many infectious agents, such as anthrax, symptoms may either take several days to manifest or resemble those of less serious illnesses leading to misdiagnosis. Thus, bioterrorism attacks that include the release of such agents are particularly dangerous and potentially deadly. For this reason, a system is needed for the quick and correct identification of disease outbreaks. The Real-time Outbreak Disease Surveillance System (RODS), initially developed by Carnegie Mellon University and the University of Pittsburgh, was created to meet this need. The RODS software implements different classifiers for pertinent health surveillance data in order to determine whether or not an outbreak has occurred. In an effort to improve the capability of RODS at detecting outbreaks, we incorporate a data fusion method. Data fusion is used to improve the results of a single classification by combining the output of multiple classifiers. This paper documents the first stages of the development of a data fusion system that can combine the output of the classifiers included in RODS.
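The abstract does not specify the fusion rule; a common, simple choice that an early development stage might resemble is weighted log-odds pooling of the individual classifiers' outbreak probabilities. The Python sketch below is illustrative only (names hypothetical, not the RODS interface).

    import numpy as np

    def fuse_probabilities(classifier_outputs, weights=None):
        """Combine per-classifier outbreak probabilities into one decision score.
        classifier_outputs: sequence of P(outbreak | data) values, one per classifier.
        weights: optional reliability weights (e.g., from validation accuracy)."""
        p = np.asarray(classifier_outputs, dtype=float)
        w = np.ones_like(p) if weights is None else np.asarray(weights, float)
        w = w / w.sum()
        eps = 1e-9                                   # guard against log(0)
        log_odds = np.log((p + eps) / (1 - p + eps)) # per-classifier evidence
        fused = np.sum(w * log_odds)                 # weighted pooling
        return 1.0 / (1.0 + np.exp(-fused))          # back to a probability

    # Example: three classifiers, the third judged twice as reliable.
    # fuse_probabilities([0.6, 0.7, 0.9], weights=[1, 1, 2])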
Large complex teams (e.g., DOE labs) must achieve sustained productivity in critical operations (e.g., weapons and reactor development) while maintaining safety for involved personnel, the public, and physical assets, as well as security for property and information. This requires informed management decisions that depend on tradeoffs of factors such as the mode and extent of personnel protection, potential accident consequences, the extent of information and physical asset protection, and communication with and motivation of involved personnel. All of these interact (and potentially interfere) with each other and must be weighed against financial resources and implementation time. Existing risk analysis tools can successfully treat physical response, component failure, and routine human actions. However, many "soft" factors involving human motivation and interaction among weakly related factors have proved analytically problematic. There has been a need for an effective software tool capable of quantifying these tradeoffs and helping make rational choices. This type of tool, developed during this project, facilitates improvements in safety, security, and productivity, and enables measurement of improvements as a function of resources expended. Operational safety, security, and motivation are significantly influenced by "latent effects", which are pre-occurring influences. One example of these is that an atmosphere of excessive fear can suppress open and frank disclosures, which can in turn hide problems, impede correction, and prevent lessons learned. Another is that a cultural mind-set of commitment, self-responsibility, and passion for an activity is a significant contributor to the activity's success. This project pursued an innovative approach for quantitatively analyzing latent effects in order to link the above types of factors, aggregating available information into quantitative metrics that can contribute to strategic management decisions, and measuring the results. The approach also evaluates the inherent uncertainties, and allows for tracking dynamics for early response and assessing developing trends. The model development is based on how factors combine and influence other factors in real time and over extended time periods. Potential strategies for improvement can be simulated and measured. Input information can be determined by quantification of qualitative information in a structured derivation process. This has proved to be a promising new approach for research and development applied to personnel performance and risk management.
Saliency detection in images is an important outstanding problem both in machine vision design and the understanding of human vision mechanisms. Recently, seminal work by Itti and Koch resulted in an effective saliency-detection algorithm. We reproduce the original algorithm in a software application, Vision, and explore its limitations. We propose extensions to the algorithm that promise to improve performance in the case of difficult-to-detect objects.
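For reference, the heart of the Itti-Koch approach is a multiscale center-surround operation over feature channels. The sketch below reproduces only the intensity channel with a crude normalization; it is an illustrative reimplementation under stated simplifications, not the Vision application's code.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def _resize(a, shape):
        """Nearest-neighbor resize to an exact target shape."""
        r = np.arange(shape[0]) * a.shape[0] // shape[0]
        c = np.arange(shape[1]) * a.shape[1] // shape[1]
        return a[np.ix_(r, c)]

    def saliency_map(gray):
        """Itti-Koch-style saliency sketch (intensity channel only; the full
        algorithm adds color-opponency and orientation channels)."""
        pyr = [gray.astype(float)]
        for _ in range(6):                           # Gaussian pyramid
            pyr.append(gaussian_filter(pyr[-1], 1.0)[::2, ::2])
        sal = np.zeros(pyr[0].shape)
        for c in (2, 3):                             # center scales
            for s in (c + 2, c + 3):                 # surround scales
                if s >= len(pyr):
                    continue
                center = pyr[c]
                surround = _resize(pyr[s], center.shape)
                fmap = np.abs(center - surround)     # center-surround difference
                fmap /= fmap.max() + 1e-9            # crude map normalization
                sal += _resize(fmap, sal.shape)      # accumulate at base scale
        return sal / sal.max()

The normalization operator is where the published algorithm is most subtle (it promotes maps with few strong peaks), and it is a natural place to explore the limitations the abstract mentions.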
A previously developed experimental facility has been used to determine gas-surface thermal accommodation coefficients from the pressure dependence of the heat flux between parallel plates of similar material but different surface finish. Heat flux between the plates is inferred from measurements of the temperature drop between the plate surface and an adjacent temperature-controlled water bath. Thermal accommodation coefficients were determined from the pressure dependence of the heat flux for a fixed plate separation. Measurements of argon and nitrogen in contact with standard machined (lathed) or polished 304 stainless steel plates are indistinguishable within experimental uncertainty. Thus, the accommodation coefficient of 304 stainless steel with nitrogen and argon is estimated to be 0.80 ± 0.02 and 0.87 ± 0.02, respectively, independent of the surface roughness within the range likely to be encountered in engineering practice. Measurements of the accommodation of helium showed a slight variation with 304 stainless steel surface roughness: 0.36 ± 0.02 for a standard machine finish and 0.40 ± 0.02 for a polished finish. Planned tests with carbon-nanotube-coated plates will be performed when 304 stainless-steel blanks have been successfully coated.
Two different Sandia MEMS devices have been tested in a high-g environment to determine their performance and survivability. The first test was performed using a drop-table to produce a peak acceleration load of 1792 g's over a period of 1.5 ms. For the second test the MEMS devices were assembled in a gun-fired penetrator and shot into a cement target at the Army Waterways Experiment Station in Vicksburg, Mississippi. This test resulted in a peak acceleration of 7191 g's for a duration of 5.5 ms. The MEMS devices were instrumented using the MEMS Diagnostic Extraction System (MDES), which is capable of driving the devices and recording the device output data during the high-g event, providing in-flight data to assess the device performance. A total of six devices were monitored during the experiments, four mechanical non-volatile memory devices (MNVM) and two Silicon Reentry Switches (SiRES). All six devices functioned properly before, during, and after each high-g test without a single failure. This is the first known test under flight conditions of an active, powered MEMS device at Sandia.
Optoelectronic microsystems are increasingly prevalent as researchers seek to increase transmission bandwidths, implement electrical isolation, enhance security, or take advantage of sensitive optical sensing methods. Board-level photonic integration techniques continue to improve, but photonic microsystems and fiber interfaces remain problematic, especially upon size reduction. Optical fiber is unmatched as a transmission medium for distances ranging from tens of centimeters to kilometers. The difficulty with using optical fiber is the small size of the core (approximately 9 µm for the core of single-mode telecommunications fiber) and the tight requirement on spot size and input numerical aperture (NA). Coupling to devices such as vertical-cavity surface-emitting lasers (VCSELs) and photodetectors presents further difficulties since these elements work in a plane orthogonal to the electronics board and typically require additional optics. This leads to the need for a packaging solution that can incorporate dissimilar materials while maintaining the tight alignment tolerances required by the optics. Over the course of this LDRD project, we have examined the capabilities of components such as VCSELs and photodetectors for high-speed operation and investigated the alignment tolerances required by the optical system. A solder reflow process has been developed to help fulfill these packaging requirements, and the results of that work are presented here.
This report examines a number of hardware circuit design issues associated with implementing certain functions in FPGA and ASIC technologies. Here we show circuit designs for AES and SHA-1 that have an extremely small hardware footprint, yet show reasonably good performance characteristics as compared to the state-of-the-art designs found in the literature. Our AES performance numbers are fueled by an optimized composite field S-box design for the Stratix chipset. Our SHA-1 designs use register packing and feedback functionalities of the Stratix LE, which reduce the logic element usage by as much as 72% as compared to other SHA-1 designs.
Proposed for publication in IEEE Transactions on Antennas and Propagation.
Abstract not provided.
While isentropic compression experiment (ICE) techniques have proved useful in deducing the high-pressure compressibility of a wide range of materials, they have encountered difficulties where large-volume phase transitions exist. The present study sought to apply graded-density impactor methods, which produce isentropic loading in planar impact experiments, to selected problems of this kind. Cerium was chosen due to its 20% compression between 0.7 and 1.0 GPa. A model was constructed based on limited earlier dynamic data and applied to the design of a suite of experiments. A capability for handling this material was installed. Two experiments were executed using shock/reload techniques with available samples, loading initially to near the gamma-alpha transition, then reloading. In addition, two graded-density impactor experiments were conducted with alumina. A method for interpreting ICE data was developed and validated; this uses a wavelet construction for the ramp wave and includes corrections for the "diffraction" of wavelets by releases or reloads reflected from the sample/window interface. Alternate methods for constructing graded-density impactors are discussed.
Water is the critical natural resource of the new century. Significant improvements in traditional water treatment processes require novel approaches based on a fundamental understanding of nanoscale and atomic interactions at interfaces between aqueous solution and materials. To better understand these critical issues and to promote an open dialog among leading international experts in water-related specialties, Sandia National Laboratories sponsored a workshop on April 24-26, 2005 in Santa Fe, New Mexico. The "Frontiers of Interfacial Water Research Workshop" provided attendees with a critical review of water technologies and emphasized the new advances in surface and interfacial microscopy, spectroscopy, diffraction, and computer simulation needed for the development of new materials for water treatment.
Recent interest in reprocessing nuclear fuel in the U.S. has led to advanced separations processes that employ continuous processing and multiple extraction steps. These advanced plants will need to be designed with state-of-the-art instrumentation for materials accountancy and control. This research examines current and upcoming instrumentation for nuclear materials accountancy to identify the instruments most suited to the reprocessing environment. Though this topic has received attention time and again in the past, new technologies and changing world conditions require a renewed look at this subject. The needs for the advanced UREX+ separations concept are first identified, and then a literature review of current and upcoming measuring techniques is presented. The report concludes with a preliminary list of recommended instruments and measurement locations.
This report contains the summary of LDRD project 91312, titled "Binary Electrokinetic Separation of Target DNA from Background DNA Primers". This work is the first product of a collaboration with Columbia University and the Northeast BioDefense Center of Excellence. In conjunction with Ian Lipkin's lab, we are developing a technique to reduce false positive events, due to the detection of unhybridized reporter molecules, in a sensitive and multiplexed detection scheme for nucleic acids developed by the Lipkin lab. This is the most significant problem in the operation of their capability. As they are developing the tools for rapidly detecting the entire panel of hemorrhagic fevers, this technology will immediately serve an important national need. The goal of this work was to attempt to separate nucleic acid from a preprocessed sample. We demonstrated the preconcentration of kilobase-pair-length double-stranded DNA targets, and observed little preconcentration of 60-base-pair-length single-stranded DNA probes. These objectives were accomplished in microdevice formats that are compatible with larger detection systems for sample pre-processing. Combined with Columbia's expertise, this technology would enable a unique, fast, and potentially compact method for detecting/identifying genetically modified organisms and multiplexed rapid nucleic acid identification. A competing approach, the DARPA-funded IRIS Pharmaceutical TIGER platform, requires many hours for operation and an $800k piece of equipment that fills a room. The Columbia/SNL system could provide a result in 30 minutes, at the cost of a few thousand dollars for the platform, and would be the size of a shoebox or smaller.
Abstract not provided.
This report documents the investigation regarding the failure of CPVC piping that was used to connect a solar hot water system to standard plumbing in a home. Details of the failure are described along with numerous pictures and diagrams. A potential failure mechanism is described and recommendations are outlined to prevent such a failure.
Political borders are controversial and contested spaces. In an attempt to better understand movement along and through political borders, this project applied the metaphor of a membrane to look at how people, ideas, and things "move" through a border. More specifically, the research team employed this metaphor in a system dynamics framework to construct a computer model to assess legal and illegal migration on the US-Mexico border. Employing a metaphor can be helpful, as it was in this project, to gain different perspectives on a complex system. In addition to the metaphor, the multidisciplinary team utilized an array of methods to gather data, including traditional literature searches, an experts workshop, a focus group, interviews, and culling expertise from the individuals on the research team. Results from the qualitative efforts revealed strong social as well as economic drivers that motivate individuals to cross the border legally. Based on the information gathered, the team concluded that legal migration dynamics were of a scope we did not want to consider; hence, available demographic models sufficiently capture migration at the local level. Results from both the quantitative and qualitative data searches were used to modify a 1977 border model to demonstrate the dynamic nature of illegal migration. Model runs reveal that current U.S. policies based on neoclassical economic theory have proven ineffective in curbing illegal migration, and that proposed enforcement policies are also likely to be ineffective. We suggest, based on model results, that improvement in economic conditions within Mexico may have the biggest impact on illegal migration to the U.S. The modeling also supports the views expressed in the current literature suggesting that demographic and economic changes within Mexico are likely to slow illegal migration by 2060 with no special interventions made by either government.
Current Joint Test Assembly (JTA) neutron monitors rely on knock-on proton type detectors that are susceptible to X-rays and low-energy gamma rays. We investigated two novel plastic scintillating fiber directional neutron detector prototypes. One prototype used a fiber selected such that the fiber width was less than 2.1 mm, the range of a proton in plastic. The difference in the distribution of recoil proton energy deposited in the fiber was used to determine the incident neutron direction. The second prototype measured both the recoil proton energy and direction. The neutron direction was determined from the kinematics of single neutron-proton scatters. This report describes the development and performance of these detectors.
A turbulence model for buoyant flows has been developed in the context of a k-ε turbulence modeling approach. A production term is added to the turbulent kinetic energy equation based on dimensional reasoning using an appropriate time scale for buoyancy-induced turbulence taken from the vorticity conservation equation. The resulting turbulence model is calibrated against far-field helium-air spread rate data, and validated with near-source, strongly buoyant helium plume data sets. This model is more numerically stable and gives better predictions over a much broader range of mesh densities than the standard k-ε model for these strongly buoyant flows.
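The abstract does not give the exact form of the added production term; a plausible reading, shown here for context only, augments the standard k equation with a buoyancy production scaled by a baroclinic time scale:

    \frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\,k)
      = \nabla\cdot\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\nabla k\right]
        + P_k + P_b - \rho\,\varepsilon,
    \qquad
    P_b = C_b\,\frac{\rho\,k}{\tau_b}, \quad
    \tau_b = \sqrt{\frac{\rho}{g\,\lvert\nabla\rho\rvert}}

Here τ_b follows by dimensional reasoning from the baroclinic (vorticity-generation) term, and C_b is a model constant of the kind calibrated against the far-field helium-air spread-rate data; the precise closure used in the report may differ.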
Because of the inevitable depletion of fossil fuels and the corresponding release of carbon to the environment, the global energy future is complex. Some of the consequences may be politically and economically disruptive, and expensive to remedy. For the next several centuries, fuel requirements will increase with population, land use, and ecosystem degradation. Current or projected levels of aggregated energy resource use will not sustain civilization as we know it beyond a few more generations. At the same time, issues of energy security, reliability, sustainability, recoverability, and safety need attention. We supply a top-down, qualitative model--the surety model--to balance expenditures of limited resources to assure success while at the same time avoiding catastrophic failure. Looking at U.S. energy challenges from a surety perspective offers new insights on possible strategies for developing solutions to challenges. The energy surety model with its focus on the attributes of security and sustainability could be extrapolated into a global energy system using a more comprehensive energy surety model than that used here. In fact, the success of the energy surety strategy ultimately requires a more global perspective. We use a 200 year time frame for sustainability because extending farther into the future would almost certainly miss the advent and perfection of new technologies or changing needs of society.
Abstract not provided.
Abstract not provided.
Abstract not provided.
Proposed for publication in Groundwater.
Abstract not provided.
Uniprocessor Performance Analysis of a Representative Workload of Sandia National Laboratories' Scientific Applications (Master of Science in Electrical Engineering, New Mexico State University, Las Cruces, New Mexico, 2005; Dr. Jeanine Cook, Chair). Throughout the last decade, computer performance analysis has become essential to maximizing the performance of many workloads. Sandia National Laboratories (SNL), located in Albuquerque, New Mexico, is no different: achieving maximum performance of large, parallel scientific workloads requires performance analysis at the uniprocessor level. A representative workload has been chosen as the basis of a computer performance study to determine optimal processor characteristics in order to better specify the next generation of supercomputers. Cube3, a finite element test problem developed at SNL, is representative of SNL's scientific workloads. This workload has been studied at the uniprocessor level to understand characteristics of the microarchitecture that will lead to overall performance improvement at the multiprocessor level. The goal of studying this workload at the uniprocessor level is to build a performance prediction model that will be integrated into a multiprocessor performance model currently being developed at SNL. Through the use of performance counters on the Itanium 2 microarchitecture, performance statistics are studied to determine bottlenecks in the microarchitecture and/or changes in the application code that will maximize performance. Source code analysis identified a performance-degrading loop kernel, and through the use of compiler optimizations a performance gain of around 20% was achieved.
Proposed for publication in Nuclear Instruments and Methods in Physics Research.
Abstract not provided.
We present a new ab initio method for electronic structure calculations of materials at finite temperature (FT) based on the all-electron quasiparticle self-consistent GW (QPscGW) approximation and Keldysh time-loop Green's function approach. We apply the method to Si, Ge, GaAs, InSb, and diamond and show that the band gaps of these materials universally decrease with temperature, in contrast with the local density approximation (LDA) of density functional theory (DFT) where the band gaps universally increase. At temperatures of a few eV the difference between quasiparticle energies obtained in the FT-QPscGW and FT-LDA approaches is significantly reduced. This result suggests that existing simulations of very-high-temperature materials based on the FT-LDA are more justified than might appear from the well-known LDA band gap errors at zero temperature.
The Use of Ion Mobility Spectrometry (IMS) in the Detection of Contraband. Sandia researchers use ion mobility spectrometers for trace chemical detection and analysis in a variety of projects and applications. Products developed in recent years based on IMS technology include explosives detection personnel portals, the Material Area Access (MAA) checkpoint of the future, an explosives detection vehicle portal, hand-held detection systems such as the Hound and Hound II (all 6400), micro-IMS sensors (1700), ordnance detection (2500), and Fourier Transform IMS technology (8700). The emphasis to date has been on explosives detection, but the detection of chemical agents has also been pursued (8100 and 6400).
Abstract not provided.
Abstract not provided.
Abstract not provided.
This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.
Abstract not provided.
Abstract not provided.
A Markov process model has been used for the DART systems analysis study. The basic design through analysis process is not immediately describable as a Markov process, but we show how a true Markov process can be derived and analyzed. We also show how sensitivities of the model with respect to the input values can be computed efficiently. This is useful in understanding how the results of this model can be used to determine strategies for investment that will improve the design through analysis process.
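As an illustration of the kind of computation involved (not the DART model itself; the three-state rework chain below is hypothetical), expected iteration counts and their sensitivities follow from the fundamental matrix of an absorbing chain:

    import numpy as np

    def expected_steps(Q):
        """Expected number of visits to each transient state of an absorbing
        Markov chain, starting from state 0. Q: transient-to-transient
        transition submatrix (rows sum to less than 1)."""
        N = np.linalg.inv(np.eye(Q.shape[0]) - Q)   # fundamental matrix
        return N[0]                                  # row 0: start in state 0

    def sensitivity(Q, i, j):
        """d(expected total steps from state 0)/dQ[i,j], computed analytically
        from dN = N e_i e_j^T N for N = (I - Q)^{-1}."""
        N = np.linalg.inv(np.eye(Q.shape[0]) - Q)
        dN = np.outer(N[:, i], N[j, :])
        return dN[0].sum()

    # Hypothetical design -> mesh -> analyze loop with rework feedback;
    # the unlisted probability mass in each row flows to the "done" state.
    Q = np.array([[0.0, 0.8, 0.0],
                  [0.1, 0.0, 0.7],
                  [0.2, 0.1, 0.0]])
    print(expected_steps(Q).sum())   # mean iterations before completion
    print(sensitivity(Q, 2, 0))      # payoff of reducing analyze->design rework

The analytic derivative avoids rerunning the model once per parameter, which is what makes efficient sensitivity computation possible.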
An experimental technique was developed to perform isentropic compression of heated liquid tin samples at the Z Accelerator, and multiple such experiments were performed to investigate solidification under rapid compression. Preliminary analyses, using two different methods, of data from experiments with high uncertainty in sample thickness suggest that solidification can begin to occur during isentropic compression on time scales of less than 100 ns. Repeatability of this result has not been confirmed due to technical issues on the subsequent experiments performed. First-principles molecular-dynamics calculations based on density-functional theory showed good agreement with experimentally determined structure factors for liquid tin, and were used to investigate the equation of state and develop a novel interatomic pseudo-potential for liquid tin and its high-pressure solid phase. Empirical-potential molecular-dynamics calculations, using the new potential, gave results for the solid-liquid interface velocity, which was found to vary linearly with the difference in free energy between the solid and liquid phases, as well as the liquidus, the maximum over-pressurization, and the solid-liquid interfacial energy. These data will prove useful in future modeling of solidification kinetics for liquid tin.
Abstract not provided.
Damping vibrations is important in the design of some types of inertial sensing devices. One method for adding damping to a device is to use magnetic forces generated by a static magnetic field interacting with eddy currents. In this report, we develop a 2-dimensional finite element model for the analysis of quasistatic eddy currents in a thin sheet of conducting material. The model was used for design and sensitivity analyses of a novel mechanical oscillator that consists of a shuttle mass (thin sheet of conducting material) and a set of folded spring elements. The oscillator is damped through the interaction of a static magnetic field and eddy currents in the shuttle mass. Using a prototype device and Laser Doppler Velocimetry (LDV), measurements were compared to the model in a validation study using simulation-based uncertainty analyses. Measurements were found to follow the trends predicted by the model.
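As a frame for the model, the shuttle behaves as a lumped damped oscillator; a back-of-envelope eddy-current damping estimate (a textbook scaling, not the report's finite element result; the symbols are assumptions introduced here) is

    m\,\ddot{x} + c\,\dot{x} + k\,x = 0, \qquad
    c \approx \sigma\, t_s\, B^2\, A_{\mathrm{eff}}

where σ is the sheet conductivity, t_s its thickness, B the static field, and A_eff an effective overlap area. The finite element model replaces this scaling by solving for the actual quasistatic eddy-current distribution in the sheet, which is what enables the design and sensitivity analyses described above.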
The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). Toward this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g. cockpit simulations), and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.
The overall objective of the Nonproliferation and Assessments Scenario Development project is to create and analyze potential and plausible scenarios that would lead to an adversary's ability to acquire and use a biological weapon. The initial three months of funding was intended to be used to develop a scenario to demonstrate the efficacy of this analysis methodology; however, it was determined that a substantial amount of preliminary data collection would be needed before a proof of concept scenario could be developed. We have dedicated substantial effort to determine the acquisition pathways for Foot and Mouth Disease Virus, and similar processes will be applied to all pathogens of interest. We have developed a biosecurity assessments database to capture information on adversary skill locales, available skill sets in specific regions, pathogen sources and regulations involved in pathogen acquisition from legitimate facilities. FY06 funding, once released, will be dedicated to data collection on acquisition, production and dissemination requirements on a pathogen basis. Once pathogen data has been collected, scenarios will be developed and scored.
Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations points to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
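A minimal sketch of the POD/Galerkin recipe described above (method of snapshots via SVD; the function names and energy criterion are illustrative choices, not the report's code):

    import numpy as np

    def pod_basis(snapshots, energy=0.999):
        """POD basis from a snapshot matrix (each column = one flow state).
        Retains enough modes to capture the requested fraction of energy."""
        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
        frac = np.cumsum(s**2) / np.sum(s**2)        # cumulative mode energy
        r = int(np.searchsorted(frac, energy)) + 1   # number of retained modes
        return mean, U[:, :r]

    def project_rhs(full_rhs, mean, Phi):
        """Galerkin projection: reduced dynamics da/dt = Phi^T f(mean + Phi a)."""
        def reduced_rhs(a, t=0.0):
            return Phi.T @ full_rhs(mean[:, 0] + Phi @ a, t)
        return reduced_rhs

Extending this to ALE-generated snapshots is the open question the report raises, since the snapshots then live on a moving mesh rather than a fixed spatial grid.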
Abstract not provided.
Abstract not provided.
The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.
Abstract not provided.
Abstract not provided.
Abstract not provided.
This Pollution Prevention Opportunity Assessment (PPOA) was conducted for Sandia National Laboratories/California Electronics Prototype Laboratory (EPL) in May 2005. The primary purpose of this PPOA is to provide recommendations to assist Electronics Prototype Laboratory personnel in reducing the generation of waste and improving the efficiency of their processes. This report contains a summary of the information collected, analyses performed and recommended options for implementation. The Sandia National Laboratories Pollution Prevention staff will continue to work with the EPL to implement the recommendations.
Explosive initiation and energy release have been studied in two sample geometries designed to minimize stochastic behavior in shock-loading experiments. These sample concepts include a design with explosive material occupying the hole locations of a close-packed bed of inert spheres and a design that utilizes infiltration of a liquid explosive into a well-defined inert matrix. Wave profiles transmitted by these samples in gas-gun impact experiments have been characterized by both velocity interferometry diagnostics and three-dimensional numerical simulations. Highly organized wave structures associated with the characteristic length scales of the deterministic samples have been observed. Initiation and reaction growth in an inert matrix filled with sensitized nitromethane (a homogeneous explosive material) result in wave profiles similar to those observed with heterogeneous explosives. Comparison of experimental and numerical results indicates that energetic material studies in deterministic sample geometries can provide an important new tool for validation of models of energy release in numerical simulations of explosive initiation and performance.
Threats to water distribution systems include release of contaminants and Denial of Service (DoS) attacks. A better understanding, and validated computational models, of the flow in water distribution systems would enable determination of sensor placement in real water distribution networks, allow source identification, and guide mitigation/minimization efforts. Validation data are needed to evaluate numerical models of network operations. Some data can be acquired in real-world tests, but these are limited by 1) unknown demand, 2) lack of repeatability, 3) too many sources of uncertainty (demand, friction factors, etc.), and 4) expense. In addition, real-world tests have limited numbers of network access points. A scale-model water distribution system was fabricated, and validation data were acquired over a range of flow (demand) conditions. Standard operating variables included system layout, demand at various nodes in the system, and pressure drop across various pipe sections. In addition, the location of contaminant (salt or dye) introduction was varied. Measurements of pressure, flowrate, and concentration at a large number of points, and overall visualization of dye transport through the flow network, were completed. Scale-up issues that were incorporated in the experiment design include Reynolds number, pressure drop across nodes, and pipe friction and roughness. The scale was chosen to be 20:1, so the 10 inch main was modeled with a 0.5 inch pipe in the physical model. Controlled validation tracer tests were run to validate flow and transport models, especially the degree of mixing at pipe junctions. Results of the pipe mixing experiments showed large deviations from predicted behavior, and these have a large impact on standard network operations models.
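For context, the prediction the junction experiments were compared against is the complete-mixing model used by standard network water-quality solvers; a sketch (Python, illustrative names, not the solver code):

    import numpy as np

    def junction_outlet_concentration(q_in, c_in):
        """Complete-mixing assumption common in network water-quality models:
        every outlet leaving a junction carries the flow-weighted average of
        all inflow concentrations.
        q_in: inflow rates; c_in: matching inflow concentrations."""
        q = np.asarray(q_in, float)
        c = np.asarray(c_in, float)
        return np.sum(q * c) / np.sum(q)

    # Example cross junction: inflows of 1.0 and 0.5 L/s, tracer only in the
    # first. Complete mixing predicts every outlet at the same concentration
    # (66.7 here); the scale-model data showed large deviations from this.
    print(junction_outlet_concentration([1.0, 0.5], [100.0, 0.0]))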