Wireless computer networks are proliferating rapidly around the world. They are being implemented in both the unlicensed radio frequency (RF) spectrum (IEEE 802.11a/b/g) and the licensed spectrum (e.g., Firetide [1] and Motorola Canopy [2]). Wireless networks operating in the unlicensed spectrum are by far the most popular wireless computer networks in existence. The open (i.e., nonproprietary) nature of the IEEE 802.11 protocols and the availability of ''free'' RF spectrum have encouraged many producers of enterprise and common off-the-shelf (COTS) computer networking equipment to jump into the wireless arena. Competition between these companies has driven down the price of 802.11 wireless networking equipment and has improved user experiences with such equipment. The end result has been increased adoption of the equipment by businesses and consumers, the establishment of the Wi-Fi Alliance [3], and widespread use of the Alliance's ''Wi-Fi'' moniker to describe these networks. Consumers use 802.11 equipment at home to reduce the burden of running wires in existing construction, facilitate the sharing of broadband Internet services with roommates or neighbors, and increase their range of ''connectedness''. Private businesses and government entities (at all levels) are deploying wireless networks to reduce wiring costs, increase employee mobility, enable non-employees to access the Internet, and add a revenue stream to their existing business models (coffee houses, airports, hotels, etc.). Municipalities (Philadelphia; San Francisco; Grand Haven, MI) are deploying wireless networks to bring broadband Internet access to places lacking such access; offer limited-speed broadband access to impoverished communities; offer broadband in places, such as marinas and state parks, that are passed over by traditional broadband providers; and provide themselves with higher-quality, more complete network coverage for use by emergency responders and other municipal agencies. In short, these Wi-Fi networks are being deployed everywhere. Much thought has been, and continues to be, devoted to cost-benefit analyses of wired versus wireless networks and to issues such as how to effectively cover an office building or municipality, how to efficiently manage a large network of wireless access points (APs), and how to save money by replacing an Internet service provider (ISP) with 802.11 technology. In comparison, very little thought and money are being focused on wireless security and on monitoring for security purposes.
Traditional polar format image formation for Synthetic Aperture Radar (SAR) requires a large amount of processing power and memory to accomplish in real time. These requirements can preclude the use of interpreted-language environments such as MATLAB. However, with trapezoidal-aperture phase history collection and changes to the traditional polar format algorithm, certain optimizations make MATLAB a viable tool for image formation. This document's purpose is therefore twofold. First, it outlines a change to the existing polar format MATLAB implementation, utilizing the Chirp Z-Transform, that improves performance and memory usage and achieves near-real-time results for smaller apertures. Second, it adds two new image formation options that perform a more traditional interpolation-style image formation. These options allow continued exploration of possible interpolation methods for image formation, and some preliminary results comparing image quality are given.
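For readers unfamiliar with the Chirp Z-Transform (CZT) mentioned above, the sketch below is a minimal NumPy implementation via Bluestein's algorithm; it is an illustrative stand-in written for this summary, not the report's MATLAB code, and the function name and parameters (czt, m, w, a) are chosen here purely for exposition. The appeal of the CZT in polar-format processing is that it evaluates a DFT-like sum on a scaled or offset frequency grid at FFT-class cost, which is what allows some of the explicit resampling in the traditional algorithm to be avoided.

```python
import numpy as np

def czt(x, m, w, a):
    """Chirp Z-Transform via Bluestein's algorithm.

    Evaluates X[k] = sum_n x[n] * (a * w**(-k))**(-n) for k = 0..m-1,
    i.e. the z-transform of x on the contour z_k = a * w**(-k),
    using three FFTs of length >= len(x) + m - 1.
    """
    n = len(x)
    chirp = w ** (np.arange(max(m, n)) ** 2 / 2.0)       # w**(k^2/2)
    nfft = int(2 ** np.ceil(np.log2(n + m - 1)))

    # Pre-multiply the input by a**(-n) * w**(n^2/2)
    y = x * a ** (-np.arange(n)) * chirp[:n]

    # Convolution kernel w**(-j^2/2) for j = -(n-1)..(m-1), laid out circularly
    v = np.zeros(nfft, dtype=complex)
    v[:m] = 1.0 / chirp[:m]
    v[nfft - n + 1:] = 1.0 / chirp[n - 1:0:-1]

    g = np.fft.ifft(np.fft.fft(y, nfft) * np.fft.fft(v))[:m]
    return g * chirp[:m]

# Sanity check: with a = 1 and w = exp(-2j*pi/N), the CZT reduces to the DFT.
x = np.random.randn(64) + 1j * np.random.randn(64)
assert np.allclose(czt(x, 64, np.exp(-2j * np.pi / 64), 1.0), np.fft.fft(x))
```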
This SAND report provides the technical progress through April 2005 of the Sandia-led project, ''Carbon Sequestration in Synechococcus Sp.: From Molecular Machines to Hierarchical Modeling'', funded by the DOE Office of Science Genomics:GTL Program. Understanding, predicting, and perhaps manipulating carbon fixation in the oceans has long been a major focus of biological oceanography and has more recently been of interest to a broader audience of scientists and policy makers. It is clear that the oceanic sinks and sources of CO{sub 2} are important terms in the global environmental response to anthropogenic atmospheric inputs of CO{sub 2} and that oceanic microorganisms play a key role in this response. However, the relationship between this global phenomenon and the biochemical mechanisms of carbon fixation in these microorganisms is poorly understood. In this project, we will investigate the carbon sequestration behavior of Synechococcus Sp., an abundant marine cyanobacterium known to be important to environmental responses to carbon dioxide levels, through experimental and computational methods. This project is a combined experimental and computational effort with emphasis on developing and applying new computational tools and methods. Our experimental effort will provide the biology and data to drive the computational efforts and includes significant investment in developing new experimental methods for uncovering protein partners, characterizing protein complexes, and identifying new binding domains. We will also develop and apply new data measurement and statistical methods for analyzing microarray experiments. Computational tools will be essential to our efforts to discover and characterize the function of the molecular machines of Synechococcus. To this end, molecular simulation methods will be coupled with knowledge discovery from diverse biological data sets for high-throughput discovery and characterization of protein-protein complexes. In addition, we will develop a set of novel capabilities for inference of regulatory pathways in microbial genomes across multiple sources of information through the integration of computational and experimental technologies. These capabilities will be applied to Synechococcus regulatory pathways to characterize their interaction map and identify component proteins in these pathways. We will also investigate methods for combining experimental and computational results with visualization and natural language tools to accelerate discovery of regulatory pathways. The ultimate goal of this effort is to develop and apply the new experimental and computational methods needed to generate a new level of understanding of how the Synechococcus genome affects carbon fixation at the global scale. Anticipated experimental and computational methods will provide ever-increasing insight about the individual elements and steps in the carbon fixation process; however, relating an organism's genome to its cellular response in the presence of varying environments will require systems biology approaches. Thus a primary goal for this effort is to integrate the genomic data generated from experiments and lower-level simulations with data from the existing body of literature into a whole-cell model. We plan to accomplish this by developing and applying a set of tools for capturing the complex carbon fixation behavior of Synechococcus at different levels of resolution.
Finally, the explosion of data being produced by high-throughput experiments requires data analysis and models that are more computationally complex, more heterogeneous, and coupled to ever-increasing amounts of experimentally obtained data in varying formats. These challenges are unprecedented in high performance scientific computing and necessitate the development of a companion computational infrastructure to support this effort.
We report on successful attempts to trigger high-voltage pressurized gas switches by utilizing laser beam transport through 1 M{Omega}-cm deionized water. We have investigated Nd:YAG laser triggering of a 6 MV, SF{sub 6}-insulated gas switch for a range of laser and switch parameters. A laser wavelength of 532 nm with nominal pulse lengths of 10 ns full width at half maximum (FWHM) was used to trigger the switch. The laser beam was transported through a 67-cm-long cell of 1 M{Omega}-cm deionized water constructed with anti-reflection UV-grade fused silica windows. The laser beam was then focused to form a breakdown arc in the gas between the switch electrodes. Less than 10 ns jitter in the operation of the switch was obtained for laser pulse energies between 80 and 110 mJ. Breakdown arcs more than 35 mm long were produced using a 70 cm focusing optic.
This report summarizes research performed at Sandia National Laboratories (SNL) in collaboration with the Environmental Protection Agency (EPA) to assess microarray quality on arrays from two platforms of interest to the EPA. Custom microarrays from two novel, commercially produced array platforms were imaged with SNL's unique hyperspectral imaging technology, and multivariate data analysis was performed to investigate sources of emission on the arrays. No extraneous sources of emission were evident in any of the array areas scanned. This led to the conclusion that either of these array platforms could produce high-quality, reliable microarray data for the EPA toxicology programs. Hyperspectral imaging results are presented and recommendations for microarray analyses using these platforms are detailed within the report.
Raman spectroscopic imaging is a powerful technique for visualizing chemical differences within a variety of samples based on the interaction of a substance's molecular vibrations with laser light. While Raman imaging can provide a unique view of sample characteristics such as residual stress within silicon devices, chemical degradation, material aging, and sample heterogeneity, the Raman scattering process is often weak and thus requires very sensitive collection optics and detectors. Many commercial instruments (including ones owned by Sandia National Laboratories) generate Raman images by raster scanning a point-focused laser beam across a sample--a process that can expose a sample to extreme levels of laser light and requires lengthy acquisition times. Our previous research efforts have led to the development of a state-of-the-art two-dimensional hyperspectral imager for fluorescence imaging applications such as microarray scanning. This report details the design, integration, and characterization of a line-scan Raman imaging module added to this efficient hyperspectral fluorescence microscope. The original hyperspectral fluorescence instrument serves as the framework for excitation and sample manipulation for the Raman imaging system, while a more appropriate axial transmissive Raman imaging spectrometer and detector are utilized for collection of the Raman scatter. The result is a unique and flexible dual-modality fluorescence and Raman imaging system capable of high-speed imaging at high spatial and spectral resolutions. Care was taken throughout the design and integration process not to hinder any of the fluorescence imaging capabilities. For example, an operator can switch between the fluorescence and Raman modalities without the need for extensive optical realignment. The instrument performance has been characterized and sample data are presented.
Physical mechanisms responsible for single-event effects are reviewed, concentrating on silicon MOS devices and digital integrated circuits. A brief historical overview of single-event effects in space and terrestrial systems is given. Single-event upset mechanisms in SRAMs are briefly described, as is the initiation of single-event latchup in CMOS structures. Techniques for mitigating single-event effects are described, including the impact of technology trends on mitigation efficacy. Future challenges are briefly explored.
Sandia National Laboratories designs and builds Synthetic Aperture Radar (SAR) systems capable of forming high-quality, exceptionally fine-resolution images. During the spring of 2004, a series of test flights was completed with a Ka-band testbed SAR on Sandia's DeHavilland DHC-6 Twin Otter aircraft. A large data set was collected, including real-time fine-resolution images of a variety of target scenes. This paper offers a sampling of high-quality images representative of the output of Sandia's Ka-band testbed radar, with resolutions as fine as 4 inches. Images are annotated with descriptions of collection geometries and other relevant image parameters.
Airborne synthetic aperture radar (SAR) imaging systems have reached a degree of accuracy and sophistication that requires the validity of the free-space approximation for radio-wave propagation to be questioned. Based on the thin-lens approximation, a closed-form model for the focal length of a gravity wave-modulated refractive-index interface in the lower troposphere is developed. The model corroborates the suggestion that mesoscale, quasi-deterministic variations of the clear-air radio refractive-index field can cause diffraction patterns on the ground that are consistent with reflectivity artifacts occasionally seen in SAR images, particularly in those collected at long ranges, short wavelengths, and small grazing angles.
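As background for the thin-lens analogy invoked above, the relation below is the standard thin phase-screen result (a generic sketch, not the report's closed-form model for the gravity-wave-modulated interface). A refractive layer that adds an excess optical path \delta(x) behaves, near an extremum of \delta, like a lens, because an ideal thin lens of focal length f adds the quadratic path -x^2/(2f); matching second derivatives gives the local focal length:

```latex
\delta(x) \;\approx\; \delta(x_0) + \tfrac{1}{2}\,\delta''(x_0)\,(x - x_0)^2
\qquad\Longrightarrow\qquad
\frac{1}{f_{\mathrm{local}}} \;\approx\; -\,\delta''(x_0).
```

The report's closed-form focal length presumably follows from an analogous curvature evaluation for a tilted, sinusoidally modulated refractive-index interface viewed at a small grazing angle.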
An unattended ground sensor (UGS) that attempts to perform target identification without providing some corresponding estimate of confidence level is of limited utility. In this context, a confidence level is a measure of the probability that the detected vehicle is of a particular target class. Many identification methods attempt to match features of a detected vehicle to each of a set of target templates. Each template is formed empirically from features collected from vehicles known to be members of the particular target class. The nontarget class is inherent in this formulation and must be addressed in providing a confidence level. Often, it is difficult to adequately characterize the nontarget class empirically by feature collection, so assumptions must be made about the nontarget class. An analyst tasked with deciding how to use the confidence level of the classifier decision should have an accurate understanding of the meaning of the confidence level given. This paper compares several definitions of confidence level by considering the assumptions that are made in each and how these assumptions affect the meaning, and by giving examples of implementing them in a practical acoustic UGS.
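One way to make the role of the nontarget class explicit, offered here only as one illustrative (Bayesian) definition rather than the paper's preferred one, is to report the confidence as the posterior probability that a detection with feature vector x belongs to target class T_i; the nontarget class N then appears explicitly in the denominator, so the reported number inherits whatever assumptions are made about p(x|N) and P(N):

```latex
P(T_i \mid x) \;=\;
\frac{p(x \mid T_i)\,P(T_i)}
     {\sum_{j} p(x \mid T_j)\,P(T_j) \;+\; p(x \mid N)\,P(N)}.
```

If p(x|N) cannot be characterized empirically, it must be assumed (for example, broad or uniform over the feature space), and the meaning of the reported confidence changes accordingly.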
The shape control of thin, flexible structures has been studied primarily for edge-supported thin plates. For applications involving reconfigurable apertures such as membrane optics and active RF surfaces, corner-supported configurations may prove more applicable. Corner-supported adaptive structures allow for parabolic geometries, greater flexibility, and larger achievable deflections when compared to edge-supported geometries under similar actuation conditions. Preliminary models have been developed for corner-supported thin plates actuated by isotropic piezoelectric actuators. However, typical piezoelectric materials are known to be orthotropic. This paper extends a previously-developed isotropic model for a corner-supported, thin, rectangular bimorph to a more general orthotropic model for a bimorph actuated by a two-dimensional array of segmented PVDF laminates. First, a model determining the deflected shape of an orthotropic laminate for a given distribution of voltages over the actuator array is derived. Second, symmetric actuation of a bimorph consisting of orthotropic material is simulated using orthogonally-oriented laminae. Finally, the results of the model are shown to agree well with layered-shell finite element simulations for simple and complex voltage distributions.
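As a point of reference for the orthotropic formulation described above, the classical-lamination-theory constitutive relation below is the generic starting point for such models (a sketch only, not the paper's corner-supported solution); the applied voltages enter as equivalent piezoelectric force and moment resultants, analogous to thermal loads:

```latex
\begin{bmatrix} \mathbf{N} \\ \mathbf{M} \end{bmatrix}
=
\begin{bmatrix} \mathbf{A} & \mathbf{B} \\ \mathbf{B} & \mathbf{D} \end{bmatrix}
\begin{bmatrix} \boldsymbol{\varepsilon}^{0} \\ \boldsymbol{\kappa} \end{bmatrix}
-
\begin{bmatrix} \mathbf{N}^{p}(V) \\ \mathbf{M}^{p}(V) \end{bmatrix}.
```

For an orthotropic actuator such as PVDF, the piezoelectric strain coefficients d_{31} and d_{32} differ, so N^p and M^p are direction dependent; an isotropic model effectively assumes d_{31} = d_{32}.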
This report describes the test and evaluation (T&E) methods by which the Teraflops Operating System, or TOS, that resides on Sandia's massively parallel computer Janus is verified for production release. Also discussed are methods used to build TOS before testing and evaluation, miscellaneous utility scripts, a sample test plan, and a proposed post-test method for quickly examining the large number of test results. The purpose of the report is threefold: (1) to provide a guide to T&E procedures, (2) to aid and guide others who will run T&E procedures on the new ASCI Red Storm machine, and (3) to document some of the history of evaluating and testing TOS. This report is not intended to serve as an exhaustive manual for testers conducting T&E procedures.
Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
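To make the second-order treatment concrete, the sketch below shows nested (double-loop) sampling, a generic way of keeping uncertainty and variability separate rather than the Method of Belief Scales itself; the model f and the distributions are hypothetical placeholders standing in for the simple algebraic problem described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(a, b):
    """Placeholder algebraic model: response as a function of an epistemically
    uncertain parameter a and an aleatory (variable) input b."""
    return a * b + b ** 2

n_outer, n_inner = 200, 2000
exceed_prob = np.empty(n_outer)

for i in range(n_outer):
    # Outer loop: one realization of the epistemically uncertain parameter
    # (uniform on [1, 3] here purely for illustration).
    a = rng.uniform(1.0, 3.0)
    # Inner loop: aleatory variability, e.g. unit-to-unit scatter.
    b = rng.normal(loc=10.0, scale=1.5, size=n_inner)
    exceed_prob[i] = np.mean(f(a, b) > 140.0)   # frequency of exceeding a limit

# Rather than a single blended probability, report the range of exceedance
# frequency that is attributable to lack of knowledge about a.
print("exceedance probability spans", exceed_prob.min(), "to", exceed_prob.max())
```

The output is a family of frequency results indexed by the uncertain parameter, which is exactly the separation that purely probabilistic propagation of subjective distributions collapses into a single number.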
Tensile and compressive stress-strain experiments on metals at strain rates in the range of 1-1000 s{sup -1} are relevant to many applications, such as gravity-dropped munitions and airplane accidents. While conventional test methods cover strain rates up to {approx}10 s{sup -1} and split-Hopkinson and other techniques cover strain rates in excess of {approx}1000 s{sup -1}, there are no well-defined techniques for the intermediate or ''Sub-Hopkinson'' strain-rate regime. The current work outlines many of the challenges of testing in the Sub-Hopkinson regime and establishes methods for addressing these challenges. The resulting technique for obtaining intermediate-rate stress-strain data is demonstrated in tension on a high-strength, high-toughness steel alloy (Hytuf) that could be a candidate alloy for earth-penetrating munitions, and in compression on a Au-Cu braze alloy.
We conducted broadband absorption measurements of atmospheric water vapor in the ground state, X {sup 1}A{sub 1} (000), from 0.4 to 2.7 THz with a pressure-broadening-limited resolution of 6.2 GHz using pulsed terahertz time-domain spectroscopy (THz-TDS). We measured a total of seventy-two absorption lines, forty-nine of which were identified as H{sub 2}{sup 16}O resonances. All the H{sub 2}{sup 16}O lines identified were confirmed by comparing their center frequencies to experimental values available in the literature.
Friction and wear are major concerns in the performance and reliability of micromechanical (MEMS) devices. While a variety of lubricant and wear-resistant coatings are known that we might consider for application to MEMS devices, the severe geometric constraints of many micromechanical systems (high aspect ratios, shadowed surfaces) make most deposition methods for friction- and wear-resistant coatings impossible. In this program we have produced and evaluated highly conformal tribological coatings, deposited by atomic layer deposition (ALD), for use on surface-micromachined (SMM) and LIGA structures. ALD is a chemical vapor deposition process that uses sequential exposure of reagents and self-limiting surface chemistry, saturating at a maximum of one monolayer per exposure cycle. The self-limiting chemistry results in conformal coating of high-aspect-ratio structures with monolayer precision. ALD of a wide variety of materials is possible, but there have been no studies of the structural, mechanical, and tribological properties of these films. We have developed processes for depositing thin (<100 nm) conformal coatings of selected hard and lubricious films (Al{sub 2}O{sub 3}, ZnO, WS{sub 2}, W, and W/Al{sub 2}O{sub 3} nanolaminates) and measured their chemical, physical, mechanical, and tribological properties. A significant challenge in this program was to develop instrumentation and quantitative test procedures, which did not exist, for friction, wear, film/substrate adhesion, elastic properties, stress, etc., of extremely thin films and nanolaminates. New scanning probe and nanoindentation techniques have been employed along with detailed mechanics-based models to evaluate these properties at the small loads characteristic of microsystem operation. We emphasize deposition processes and fundamental properties of ALD materials; however, we have also evaluated applications and film performance for model SMM and LIGA devices.
This multinational test program is quantifying the aerosol particulates produced when a high energy density device (HEDD) impacts surrogate material and actual spent fuel test rodlets. The experimental work, performed in four consecutive test phases, has been in progress for several years. The overall program provides needed data that are relevant to certain sabotage scenarios involving spent fuel transport and storage casks and to the associated risk assessments. This program also provides significant political benefits in international cooperation for nuclear-security-related evaluations. The spent fuel sabotage--aerosol test program is coordinated with the international Working Group for Sabotage Concerns of Transport and Storage Casks (WGSTSC) and supported by both the U.S. Department of Energy and the Nuclear Regulatory Commission. This report summarizes the preliminary, Phase 1 work performed in 2001 and 2002 at Sandia National Laboratories and the Fraunhofer Institute, Germany, and documents the experimental results obtained, observations, and preliminary interpretations. Phase 1 testing included: performance quantification of the HEDDs; characterization of the HEDD or conical shaped charge (CSC) jet properties with multiple tests; refinement of the aerosol particle collection apparatus being used; and CSC jet-aerosol tests using leaded glass plates and glass pellets serving as representative brittle materials. Phase 1 testing strongly informed the design and performance of the subsequent Phase 2 test program and test apparatus.
Due to the nature of many infectious agents, such as anthrax, symptoms may either take several days to manifest or resemble those of less serious illnesses, leading to misdiagnosis. Thus, bioterrorism attacks that include the release of such agents are particularly dangerous and potentially deadly. For this reason, a system is needed for the quick and correct identification of disease outbreaks. The Real-time Outbreak Disease Surveillance System (RODS), initially developed by Carnegie Mellon University and the University of Pittsburgh, was created to meet this need. The RODS software implements different classifiers for pertinent health surveillance data in order to determine whether or not an outbreak has occurred. In an effort to improve the capability of RODS to detect outbreaks, we incorporate a data fusion method. Data fusion is used to improve on the results of a single classification by combining the output of multiple classifiers. This paper documents the first stages of the development of a data fusion system that can combine the output of the classifiers included in RODS.
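As an illustration of the kind of combination rule involved (a generic sketch under assumed interfaces, not the classifiers or fusion method actually used in RODS), the snippet below fuses per-classifier outbreak probabilities by averaging their log-odds and compares the fused score with an alert threshold:

```python
import math

def fuse_log_odds(probs, weights=None, eps=1e-6):
    """Combine per-classifier outbreak probabilities by (weighted) averaging
    of their log-odds; returns a single fused probability."""
    if weights is None:
        weights = [1.0] * len(probs)
    logit = sum(w * math.log((p + eps) / (1.0 - p + eps))
                for p, w in zip(probs, weights)) / sum(weights)
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical daily outputs from three classifiers watching the same data stream;
# the 0.60 alert threshold is likewise arbitrary.
daily_probs = [0.62, 0.55, 0.71]
fused = fuse_log_odds(daily_probs)
print(f"fused outbreak probability: {fused:.2f}, alert: {fused > 0.60}")
```

Weighting each classifier by its historical accuracy, or learning the combination from labeled outbreak data, are natural refinements of this simple rule.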
Large complex teams (e.g., DOE labs) must achieve sustained productivity in critical operations (e.g., weapons and reactor development) while maintaining safety for involved personnel, the public, and physical assets, as well as security for property and information. This requires informed management decisions that depend on tradeoffs of factors such as the mode and extent of personnel protection, potential accident consequences, the extent of information and physical asset protection, and communication with and motivation of involved personnel. All of these interact (and potentially interfere) with each other and must be weighed against financial resources and implementation time. Existing risk analysis tools can successfully treat physical response, component failure, and routine human actions. However, many ''soft'' factors involving human motivation and interaction among weakly related factors have proved analytically problematic. There has been a need for an effective software tool capable of quantifying these tradeoffs and helping make rational choices. This type of tool, developed during this project, facilitates improvements in safety, security, and productivity, and enables measurement of improvements as a function of resources expended. Operational safety, security, and motivation are significantly influenced by ''latent effects'', which are pre-occurring influences. One example of these is that an atmosphere of excessive fear can suppress open and frank disclosures, which can in turn hide problems, impede correction, and prevent lessons learned. Another is that a cultural mind-set of commitment, self-responsibility, and passion for an activity is a significant contributor to the activity's success. This project pursued an innovative approach for quantitatively analyzing latent effects in order to link the above types of factors, aggregating available information into quantitative metrics that can contribute to strategic management decisions, and measuring the results. The approach also evaluates the inherent uncertainties, and allows for tracking dynamics for early response and assessing developing trends. The model development is based on how factors combine and influence other factors in real time and over extended time periods. Potential strategies for improvement can be simulated and measured. Input information can be determined by quantification of qualitative information in a structured derivation process. This has proved to be a promising new approach for research and development applied to personnel performance and risk management.
Saliency detection in images is an important outstanding problem both in machine vision design and in the understanding of human vision mechanisms. Recently, seminal work by Itti and Koch resulted in an effective saliency-detection algorithm. We reproduce the original algorithm in a software application, Vision, and explore its limitations. We propose extensions to the algorithm that promise to improve performance in the case of difficult-to-detect objects.
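For orientation, a heavily simplified, single-channel version of the center-surround operation at the core of the Itti-Koch approach is sketched below (differences of Gaussians on an intensity image). The published algorithm additionally uses color and orientation channels, dyadic image pyramids, and an iterative normalization operator, none of which are reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def intensity_saliency(img, center_sigmas=(2, 4), surround_ratio=4):
    """Crude center-surround saliency for one intensity channel: accumulate
    normalized difference-of-Gaussian feature maps over a few scales."""
    img = img.astype(float)
    sal = np.zeros_like(img)
    for sigma in center_sigmas:
        center = gaussian_filter(img, sigma)
        surround = gaussian_filter(img, sigma * surround_ratio)
        fmap = np.abs(center - surround)
        rng = fmap.max() - fmap.min()
        if rng > 0:                        # stretch each feature map to [0, 1]
            fmap = (fmap - fmap.min()) / rng
        sal += fmap
    return sal / len(center_sigmas)
```

Applied to a grayscale image, the function returns a map whose peaks mark regions that differ strongly from their surroundings, which is the behavior the proposed extensions aim to improve for difficult-to-detect objects.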
A previously developed experimental facility has been used to determine gas-surface thermal accommodation coefficients from the pressure dependence of the heat flux between parallel plates of similar material but different surface finish. Heat flux between the plates is inferred from measurements of the temperature drop between each plate surface and an adjacent temperature-controlled water bath. Thermal accommodation coefficients were determined from the pressure dependence of the heat flux at a fixed plate separation. Measurements for argon and nitrogen in contact with standard machined (lathed) or polished 304 stainless steel plates are indistinguishable within experimental uncertainty. Thus, the accommodation coefficient of 304 stainless steel with nitrogen and argon is estimated to be 0.80 {+-} 0.02 and 0.87 {+-} 0.02, respectively, independent of surface roughness within the range likely to be encountered in engineering practice. Measurements of the accommodation of helium showed a slight variation with 304 stainless steel surface roughness: 0.36 {+-} 0.02 for a standard machine finish and 0.40 {+-} 0.02 for a polished finish. Planned tests with carbon-nanotube-coated plates will be performed when 304 stainless steel blanks have been successfully coated.
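For context, the thermal accommodation coefficient being reported is the standard energy-exchange ratio below; the reason it can be extracted from the pressure dependence of the heat flux is that, for small temperature differences, the free-molecular (low-pressure) conductance scales with both the accommodation coefficient and the pressure, while the continuum conductance is essentially pressure independent (a textbook statement, not the report's specific data-reduction formula):

```latex
\alpha \;=\; \frac{E_i - E_r}{E_i - E_w},
\qquad
q_{\mathrm{free\ molecular}} \;\propto\; \alpha\, p\,(T_h - T_c),
```

where E_i and E_r are the incident and re-emitted molecular energy fluxes and E_w is the flux that would be re-emitted if the gas left the surface in full equilibrium with the wall temperature.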
Two different Sandia MEMS devices have been tested in a high-g environment to determine their performance and survivability. The first test was performed using a drop table to produce a peak acceleration load of 1792 g's over a period of 1.5 ms. For the second test, the MEMS devices were assembled in a gun-fired penetrator and shot into a cement target at the Army Waterways Experiment Station in Vicksburg, Mississippi. This test resulted in a peak acceleration of 7191 g's for a duration of 5.5 ms. The MEMS devices were instrumented using the MEMS Diagnostic Extraction System (MDES), which is capable of driving the devices and recording the device output data during the high-g event, providing in-flight data to assess device performance. A total of six devices were monitored during the experiments: four mechanical non-volatile memory devices (MNVM) and two Silicon Reentry Switches (SiRES). All six devices functioned properly before, during, and after each high-g test without a single failure. This is the first known test under flight conditions of an active, powered MEMS device at Sandia.
Optoelectronic microsystems are increasingly prevalent as researchers seek to increase transmission bandwidths, implement electrical isolation, enhance security, or take advantage of sensitive optical sensing methods. Board-level photonic integration techniques continue to improve, but photonic microsystems and fiber interfaces remain problematic, especially upon size reduction. Optical fiber is unmatched as a transmission medium for distances ranging from tens of centimeters to kilometers. The difficulty with using optical fiber is the small size of the core (approximately 9 {micro}m for the core of single-mode telecommunications fiber) and the tight requirements on spot size and input numerical aperture (NA). Coupling to devices such as vertical cavity surface-emitting lasers (VCSELs) and photodetectors presents further difficulties, since these elements work in a plane orthogonal to the electronics board and typically require additional optics. This leads to the need for a packaging solution that can incorporate dissimilar materials while maintaining the tight alignment tolerances required by the optics. Over the course of this LDRD project, we have examined the capabilities of components such as VCSELs and photodetectors for high-speed operation and investigated the alignment tolerances required by the optical system. A solder reflow process has been developed to help fulfill these packaging requirements, and the results of that work are presented here.
This report examines a number of hardware circuit design issues associated with implementing certain functions in FPGA and ASIC technologies. Here we show circuit designs for AES and SHA-1 that have an extremely small hardware footprint, yet show reasonably good performance characteristics compared to the state-of-the-art designs found in the literature. Our AES performance numbers are fueled by an optimized composite-field S-box design for the Stratix chipset. Our SHA-1 designs use the register packing and feedback functionalities of the Stratix logic element (LE), which reduce logic element usage by as much as 72% compared to other SHA-1 designs.
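For readers who want to see the function that the composite-field optimization targets, the sketch below builds the AES S-box in the straightforward way, as inversion in GF(2^8) followed by the fixed affine map; this is the standard definition from the AES specification, not the compact composite-field GF((2^4)^2) decomposition used for the Stratix design described above.

```python
def gf_mul(a, b):
    """Multiply in GF(2^8) with the AES reduction polynomial x^8+x^4+x^3+x+1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11B
        b >>= 1
    return r

def gf_inv(a):
    """Multiplicative inverse in GF(2^8); AES maps 0 to 0 (a^-1 = a^254)."""
    if a == 0:
        return 0
    result = 1
    for _ in range(254):        # naive exponentiation; fine for a 256-entry table
        result = gf_mul(result, a)
    return result

def sbox(x):
    """AES S-box: GF(2^8) inversion followed by the affine transformation."""
    y = gf_inv(x)
    out = 0
    for i in range(8):
        bit = ((y >> i) ^ (y >> ((i + 4) % 8)) ^ (y >> ((i + 5) % 8)) ^
               (y >> ((i + 6) % 8)) ^ (y >> ((i + 7) % 8)) ^ (0x63 >> i)) & 1
        out |= bit << i
    return out

# Spot checks against the published S-box table.
assert sbox(0x00) == 0x63 and sbox(0x01) == 0x7C and sbox(0x53) == 0xED
```

Composite-field implementations map GF(2^8) onto GF((2^4)^2) so that the inversion reduces to a handful of 4-bit operations, which is what makes a very small hardware S-box footprint possible.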
While isentropic compression experiment (ICE) techniques have proved useful in deducing the high-pressure compressibility of a wide range of materials, they have encountered difficulties where large-volume phase transitions exist. The present study sought to apply graded-density impactor methods, which produce isentropic loading in planar impact experiments, to selected problems of this kind. Cerium was chosen due to its 20% compression between 0.7 and 1.0 GPa. A model was constructed based on limited earlier dynamic data and applied to the design of a suite of experiments. A capability for handling this material was installed. Two experiments were executed using shock/reload techniques with available samples, loading initially to near the gamma-alpha transition, then reloading. As well, two graded-density impactor experiments were conducted with alumina. A method for interpreting ICE data was developed and validated; this method uses a wavelet construction for the ramp wave and includes corrections for the ''diffraction'' of wavelets by releases or reloads reflected from the sample/window interface. Alternate methods for constructing graded-density impactors are discussed.
Water is the critical natural resource of the new century. Significant improvements in traditional water treatment processes require novel approaches based on a fundamental understanding of nanoscale and atomic interactions at interfaces between aqueous solution and materials. To better understand these critical issues and to promote an open dialog among leading international experts in water-related specialties, Sandia National Laboratories sponsored a workshop on April 24-26, 2005 in Santa Fe, New Mexico. The ''Frontiers of Interfacial Water Research Workshop'' provided attendees with a critical review of water technologies and emphasized the new advances in surface and interfacial microscopy, spectroscopy, diffraction, and computer simulation needed for the development of new materials for water treatment.
Recent interest in reprocessing nuclear fuel in the U.S. has led to advanced separations processes that employ continuous processing and multiple extraction steps. These advanced plants will need to be designed with state-of-the-art instrumentation for materials accountancy and control. This research examines current and upcoming instrumentation for nuclear materials accountancy to identify those instruments most suited to the reprocessing environment. Though this topic has received attention time and again in the past, new technologies and changing world conditions require a renewed look at this subject. The needs of the advanced UREX+ separations concept are first identified, and then a literature review of current and upcoming measurement techniques is presented. The report concludes with a preliminary list of recommended instruments and measurement locations.
This report contains the summary of LDRD project 91312, titled ''Binary Electrokinetic Separation of Target DNA from Background DNA Primers''. This work is the first product of a collaboration with Columbia University and the Northeast BioDefense Center of Excellence. In conjunction with Ian Lipkin's lab, we are developing a technique to reduce false positive events, due to the detection of unhybridized reporter molecules, in a sensitive and multiplexed detection scheme for nucleic acids developed by the Lipkin lab. Such false positives are the most significant problem in the operation of their capability. As the lab is developing tools for rapidly detecting the entire panel of hemorrhagic fevers, this technology will immediately serve an important national need. The goal of this work was to separate target nucleic acid from a preprocessed sample. We demonstrated the preconcentration of kilobase-pair-length double-stranded DNA targets and observed little preconcentration of 60-base-pair-length single-stranded DNA probes. These objectives were accomplished in microdevice formats that are compatible with larger detection systems for sample pre-processing. Combined with Columbia's expertise, this technology would enable a unique, fast, and potentially compact method for detecting and identifying genetically modified organisms and for multiplexed rapid nucleic acid identification. A competing approach, the DARPA-funded IRIS Pharmaceutical TIGER platform, requires many hours of operation and an $800k piece of equipment that fills a room. The Columbia/SNL system could provide a result in 30 minutes, at a platform cost of a few thousand dollars, and would be the size of a shoebox or smaller.
This report documents the investigation regarding the failure of CPVC piping that was used to connect a solar hot water system to standard plumbing in a home. Details of the failure are described along with numerous pictures and diagrams. A potential failure mechanism is described and recommendations are outlined to prevent such a failure.
Political borders are controversial and contested spaces. In an attempt to better understand movement along and through political borders, this project applied the metaphor of a membrane to look at how people, ideas, and things ''move'' through a border. More specifically, the research team employed this metaphor in a system dynamics framework to construct a computer model to assess legal and illegal migration on the US-Mexico border. Employing a metaphor can be helpful, as it was in this project, for gaining different perspectives on a complex system. In addition to the metaphor, the multidisciplinary team utilized an array of methods to gather data, including traditional literature searches, an experts workshop, a focus group, interviews, and the expertise of the individuals on the research team. Results from the qualitative efforts revealed strong social as well as economic drivers that motivate individuals to cross the border legally. Based on the information gathered, the team concluded that legal migration dynamics were outside the scope we wished to consider, since available demographic models sufficiently capture migration at the local level. Results from both the quantitative and qualitative data searches were used to modify a 1977 border model to demonstrate the dynamic nature of illegal migration. Model runs reveal that current U.S. policies based on neoclassical economic theory have proven ineffective in curbing illegal migration, and that proposed enforcement policies are also likely to be ineffective. We suggest, based on model results, that improvement in economic conditions within Mexico may have the biggest impact on illegal migration to the U.S. The modeling also supports views expressed in the current literature suggesting that demographic and economic changes within Mexico are likely to slow illegal migration by 2060 with no special interventions by either government.
Current Joint Test Assembly (JTA) neutron monitors rely on knock-on proton-type detectors that are susceptible to X-rays and low-energy gamma rays. We investigated two novel plastic scintillating fiber directional neutron detector prototypes. One prototype used a fiber selected such that the fiber width was less than 2.1 mm, which is the range of a proton in plastic. The difference in the distribution of recoil proton energy deposited in the fiber was used to determine the incident neutron direction. The second prototype measured both the recoil proton energy and direction. The neutron direction was determined from the kinematics of single neutron-proton scatters. This report describes the development and performance of these detectors.
A turbulence model for buoyant flows has been developed in the context of a k-{var_epsilon} turbulence modeling approach. A production term is added to the turbulent kinetic energy equation based on dimensional reasoning, using an appropriate time scale for buoyancy-induced turbulence taken from the vorticity conservation equation. The resulting turbulence model is calibrated against far-field helium-air spread rate data and validated with near-source, strongly buoyant helium plume data sets. This model is more numerically stable and gives better predictions over a much broader range of mesh densities than the standard k-{var_epsilon} model for these strongly buoyant flows.
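For reference, the generic form of such a modification is an additional production term in the turbulent-kinetic-energy transport equation of the standard k-{var_epsilon} model; the specific buoyancy source built on a vorticity-based time scale is given in the report, so the commonly used gradient-diffusion form of G_b is shown here only as a placeholder:

```latex
\frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho\,\mathbf{u}\,k)
= \nabla\cdot\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\nabla k\right]
+ P_k + G_b - \rho\,\varepsilon,
\qquad
G_b^{\mathrm{std}} = -\frac{\mu_t}{\rho\,\sigma_t}\,\mathbf{g}\cdot\nabla\rho,
```

where P_k is shear production; replacing the gradient-based G_b with a term built from a buoyancy time scale taken from the vorticity equation is the essence of the modification described above.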
Because of the inevitable depletion of fossil fuels and the corresponding release of carbon to the environment, the global energy future is complex. Some of the consequences may be politically and economically disruptive, and expensive to remedy. For the next several centuries, fuel requirements will increase with population, land use, and ecosystem degradation. Current or projected levels of aggregated energy resource use will not sustain civilization as we know it beyond a few more generations. At the same time, issues of energy security, reliability, sustainability, recoverability, and safety need attention. We supply a top-down, qualitative model--the surety model--to balance expenditures of limited resources to assure success while at the same time avoiding catastrophic failure. Looking at U.S. energy challenges from a surety perspective offers new insights on possible strategies for developing solutions to challenges. The energy surety model with its focus on the attributes of security and sustainability could be extrapolated into a global energy system using a more comprehensive energy surety model than that used here. In fact, the success of the energy surety strategy ultimately requires a more global perspective. We use a 200 year time frame for sustainability because extending farther into the future would almost certainly miss the advent and perfection of new technologies or changing needs of society.
This abstract summarizes the Master of Science thesis ''Uniprocessor Performance Analysis of a Representative Workload of Sandia National Laboratories' Scientific Applications'' (Electrical Engineering, New Mexico State University, Las Cruces, New Mexico, 2005; Dr. Jeanine Cook, chair). Throughout the last decade, computer performance analysis has become essential to maximizing the performance of some workloads. Sandia National Laboratories (SNL), located in Albuquerque, New Mexico, is no different in that, to achieve maximum performance of its large, parallel scientific workloads, performance analysis is needed at the uniprocessor level. A representative workload has been chosen as the basis of a computer performance study to determine optimal processor characteristics in order to better specify the next generation of supercomputers. Cube3, a finite element test problem developed at SNL, is representative of these scientific workloads. This workload has been studied at the uniprocessor level to understand characteristics of the microarchitecture that will lead to overall performance improvement at the multiprocessor level. The goal of studying this workload at the uniprocessor level is to build a performance prediction model that will be integrated into a multiprocessor performance model currently being developed at SNL. Through the use of performance counters on the Itanium 2 microarchitecture, performance statistics are studied to determine bottlenecks in the microarchitecture and/or changes in the application code that will maximize performance. From source code analysis, a performance-degrading loop kernel was identified, and through the use of compiler optimizations a performance gain of around 20% was achieved.
We present a new ab initio method for electronic structure calculations of materials at finite temperature (FT) based on the all-electron quasiparticle self-consistent GW (QPscGW) approximation and the Keldysh time-loop Green's function approach. We apply the method to Si, Ge, GaAs, InSb, and diamond and show that the band gaps of these materials universally decrease with temperature, in contrast with the local density approximation (LDA) of density functional theory (DFT), where the band gaps universally increase. At temperatures of a few eV, the difference between quasiparticle energies obtained in the FT-QPscGW and FT-LDA approaches is significantly reduced. This result suggests that existing simulations of very-high-temperature materials based on the FT-LDA are more justified than might appear from the well-known LDA band gap errors at zero temperature.
The Use of Ion Mobility Spectrometry (IMS) in the Detection of Contraband. Sandia researchers use ion mobility spectrometers for trace chemical detection and analysis in a variety of projects and applications. Products developed in recent years based on IMS technology include explosives detection personnel portals, the Material Area Access (MAA) checkpoint of the future, an explosives detection vehicle portal, hand-held detection systems such as the Hound and Hound II (all 6400), micro-IMS sensors (1700), ordnance detection (2500), and Fourier transform IMS technology (8700). The emphasis to date has been on explosives detection, but the detection of chemical agents has also been pursued (8100 and 6400).
This report grew out of a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these papers are included here. These efforts have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an intercomparison of many of the commonly available electric power sources in present use and of the efforts needed to realize progress toward those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that allows comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.