It was discovered that MgO, which hydrates to Mg(OH){sub 2} in water, is a very strong sorbent for arsenic. Distribution constants, or K{sub d} values, are as high as 1 x 10{sup 6} L/mole. In this work, Mg(OH){sub 2} and other compounds have been investigated as sorbents for arsenic and other contaminants. This work has resulted in several major accomplishments, including: (1) design, construction, and testing of a pressure sand filter to remove Mg(OH){sub 2} after it has sorbed arsenic from water; (2) stabilization of Mg(OH){sub 2} as a Sorel cement against reaction with carbonate, which forms MgCO{sub 3} and decreases the efficiency of Mg(OH){sub 2} as an arsenic sorbent; and (3) development of a new, very promising sorbent for arsenic based on zirconium. Zirconium is an environmentally benign material found in many common products, such as toothpaste. It is currently used in water treatment and is very inexpensive. In this work, zirconium has been bonded to activated carbon, zeolites, sand, and montmorillonite. Because of its high charge in ionic form (+4), zirconium is a strong sorbent for many anions, including arsenic. In equilibrium experiments, arsenic concentrations in water were reduced from 200 ppb to less than 1 ppb in less than 1 minute of contact time. Additionally, analytical methods for detecting arsenic in water have been investigated. Various analytical techniques, including HPLC, AA, and ICP-MS, are used for quantification of arsenic. Because of large matrix interferences, the HPLC and AA techniques are not very selective, and both are time consuming. ICP-MS is highly efficient, requires a low sample volume, and has a high tolerance for interferences. All of these techniques are costly and require trained staff, and with the exception of ICP-MS, they cannot be used at low-ppb arsenic concentrations without a pre-concentration step.
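A distribution constant of this magnitude translates directly into removal efficiency in a batch equilibrium. The sketch below is illustrative only, with a hypothetical sorbent dose and K{sub d} treated generically as a liters-per-kilogram partition coefficient:

```python
# A batch-equilibrium sketch of what a large distribution constant
# implies (hypothetical sorbent dose; K_d treated generically as a
# liters-per-kilogram partition coefficient).
def fraction_removed(kd, dose):
    """Equilibrium fraction removed for distribution constant kd (L/kg)
    and sorbent dose (kg/L): C/C0 = 1 / (1 + kd * dose)."""
    r = kd * dose
    return r / (1.0 + r)

c0 = 200.0                             # ppb arsenic, as in the text
f = fraction_removed(1.0e6, 1.0e-3)    # assumed 1 g/L sorbent dose
print(c0 * (1.0 - f))                  # residual well below 1 ppb
```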
An alternative to these traditional techniques is to use a colorimetric method based on leucocrystal violet dye interaction with iodine. This method has been adapted in our facility for quantifying arsenic concentrations down to 14 ppb.
Aspen is an economic modeling tool that uses agent-based modeling and genetic algorithms to simulate the economy. In it, individuals are hired by firms to produce a good that households then purchase. The firms decide what price to charge for this good, and based on that price, the households determine which firm to purchase from. We will attempt to discover the Nash equilibrium price found in this model under two different methods of determining how many orders each firm receives. For simplicity, we will assume there are only two firms in our model and that these firms compete for the sale of one identical good.
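The two-firm setup can be illustrated with a minimal best-response iteration. This is an independent sketch, not the Aspen code or either of the order-allocation methods studied here: two firms post prices in integer cents for an identical good, buyers all go to the cheaper firm (splitting a tie), and each firm repeatedly best-responds to the other's price. The marginal cost of 100 cents is an assumption for illustration.

```python
# Minimal Bertrand-style best-response sketch (not the Aspen model).
COST = 100                    # assumed marginal cost, in cents
PRICES = range(COST, 301)     # feasible price grid, in cents

def profit(p_own, p_other, demand=100):
    """Per-firm profit: the cheaper firm captures all demand."""
    if p_own < p_other:
        share = 1.0
    elif p_own == p_other:
        share = 0.5
    else:
        share = 0.0
    return (p_own - COST) * demand * share

def best_response(p_other):
    # max() keeps the first (lowest-price) maximizer on ties
    return max(PRICES, key=lambda p: profit(p, p_other))

p1, p2 = 300, 300
for _ in range(500):
    new_p1 = best_response(p2)
    new_p2 = best_response(new_p1)
    if (new_p1, new_p2) == (p1, p2):
        break
    p1, p2 = new_p1, new_p2

print(p1, p2)  # -> 101 101: undercutting drives price to one cent above cost
```

With identical goods and price competition, repeated undercutting pushes both firms to (essentially) marginal cost, the classic Bertrand outcome; how orders are allocated between the firms is exactly what can shift this equilibrium.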
This report outlines our work on the integration of high-efficiency photonic lattice structures with MEMS (MicroElectroMechanical Systems). The simplest of these structures were based on 1-D mirror structures. These were integrated into a variety of devices: movable mirrors, switchable cavities, and finally Bragg fiber structures, which enable the control of light in at least two dimensions. Of these devices, the most complex were the Bragg fibers. Bragg fibers consist of hollow tubes in which light is guided in a low-index medium (air) and confined by surrounding Bragg mirror stacks. In this work, structures with internal diameters from 5 to 30 microns have been fabricated, and much larger structures should also be possible. We have demonstrated the fabrication of these structures with short-wavelength band edges ranging from 400 to 1600 nm. There may be potential applications for such structures in the fields of integrated optics and BioMEMS. We have also looked at the possibility of waveguiding in three dimensions by integrating defects into 3-dimensional photonic lattice structures. Eventually it may be possible to tune such structures by mechanically modulating the defects.
The purpose of this review was to provide insights and information to Sandia National Laboratories' (SNL) Education Council on the state of technical education and training at SNL, in order to address the concern that a change in philosophy surrounding education had occurred. To accomplish this, the status of current and past technical training and education programs was compared, and significant changes at SNL were assessed for their impact on education and training. The major changes in education and training are in the advertisement of course offerings, the course delivery methods, and the funding mechanisms for student and instructor time as well as course costs. The significant changes at SNL that influenced technical training and education are the considerable increase in mandatory or compliance training, a fundamental shift in SNL's management structure from an institutional structure to a more business-like, project-budgeted structure, and the change in SNL's mission at the end of the Cold War. These changes contributed to less time for technical training, a reduction of training funds, the elimination of some training, and a Service Center approach to paying for training. Most importantly, the overall combined effect has been a shift from a strategic to a tactical training approach. The Corporate Training Department (CTD) has maneuvered to accommodate these changes and keep abreast of constantly changing needs.
Sandia National Laboratories has been investigating the use of remotely operated weapon platforms in Department of Energy (DOE) facilities. These platforms offer significant force multiplication and enhancement by enabling near-instantaneous response to attackers, increasing targeting accuracy, removing personnel from direct weapon fire, providing immunity to suppressive fire, and reducing the security force size needed to respond effectively. Test results of the Telepresent Rapid Aiming Platform (TRAP) from Precision Remotes, Inc. have been exceptional, and the response from DOE sites and the U.S. Air Force has been enthusiastic. Although this platform performs comparably to a trained marksman, its target acquisition times are up to three times longer. TRAP is currently slaved to a remote operator's joystick. Tracking moving targets with a joystick is difficult; success depends upon target range, movement patterns, and operator skill. Even well-trained operators encounter difficulty tracking moving targets. Adding intelligent targeting capabilities to a weapon platform such as TRAP would significantly improve security force response in terms of effectiveness and numbers of responders. The initial goal of this project was to integrate intelligent targeting with TRAP. However, the unavailability of a TRAP unit for laboratory purposes drove the development of a new platform that simulates TRAP but has a greater operating range and is significantly faster to reposition.
The Global Energy Futures Model (GEFM) is a demand-based, gross domestic product (GDP)-driven, dynamic simulation tool that provides an integrated framework to model key aspects of energy, nuclear-materials storage and disposition, environmental effluents from fossil and non-fossil energy, and global nuclear-materials management. Based entirely on public source data, it links oil, natural gas, coal, nuclear and renewable energy dynamically to greenhouse-gas emissions and 12 other measures of environmental impact. It includes historical data from 1990 to 2000, is benchmarked to the DOE/EIA/IEO 2001 [5] Reference Case for 2000 to 2020, and extrapolates energy demand through the year 2050. The GEFM is globally integrated, and breaks out five regions of the world: the United States of America (USA), the People's Republic of China (China), the former Soviet Union (FSU), the Organization for Economic Cooperation and Development (OECD) nations excluding the USA (other industrialized countries), and the rest of the world (ROW) (essentially the developing world). The GEFM allows the user to examine a very wide range of ''what if'' scenarios through 2050 and to view the potential effects across widely dispersed, but interrelated, areas. The authors believe that this high-level learning tool will help to stimulate public policy debate on energy, environmental, economic, and national security issues.
Recent terrorist attacks in the United States have increased concerns about potential national security consequences from energy supply disruptions. The purpose of this Laboratory Directed Research & Development (LDRD) project is to develop a high-level dynamic simulation model that would allow policy makers to explore the national security consequences of major U.S. energy supply disruptions, and to do so in a way that integrates energy, economic, and environmental components. The model allows exploration of potential combinations of demand-driven energy supplies that meet chosen policy objectives, including: mitigating economic losses, measured in national economic output and employment levels, due to terrorist activity or forced outages of the type seen in California; controlling greenhouse gas levels and growth rates; and moderating U.S. energy import requirements. This work has built upon the Sandia U.S. Energy and Greenhouse Gas Model (USEGM) by integrating a macroeconomic input-output framework into the model, adding the capability to assess the potential economic impact of energy supply disruptions and the associated national security issues. The economic impacts of disruptions are measured in terms of lost U.S. output (e.g., GDP, sectoral output) and lost employment, and are assessed either at a broad sectoral level (3 sectors) or at a disaggregated level (52 sectors). In this version of the model, physical energy disruptions result in quantitative energy shortfalls, and energy prices are not permitted to rise to clear the markets.
This LDRD project has involved the development and application of Sandia's massively parallel materials modeling software to several significant biophysical systems. The project team has been successful in applying the molecular dynamics code LAMMPS to modeling DNA, unstructured proteins, and lipid membranes. They have developed and applied a coupled transport-molecular theory code (Tramonto) to study ion channel proteins, with gramicidin A as a prototype. They have used the Towhee configurational bias Monte Carlo code to perform rigorous tests of biological force fields. They have also applied the MP-Sala reacting-diffusion code to model cellular systems. Electroporation of cell membranes has also been studied, and detailed quantum mechanical studies of ion solvation have been performed. In addition, new molecular theory algorithms have been developed (in FasTram) that may ultimately make protein solvation calculations feasible on workstations. Finally, they have begun implementation of a combined molecular theory and configurational bias Monte Carlo code. They note that this LDRD has provided a basis for several new internal proposals (e.g., several new LDRD projects) and external proposals (e.g., four NIH proposals and a DOE Genomes to Life proposal).
As demonstrated by the anthrax attack through the United States mail, people infected by the biological agent itself may give the first indication of a bioterror attack. Thus, a distributed information system that can rapidly and efficiently gather and analyze public health data would aid epidemiologists in detecting and characterizing emerging diseases, including bioterror attacks. We propose using clusters of adverse health events in space and time to detect possible bioterror attacks. Space-time clusters can indicate exposure to infectious diseases or localized exposure to toxins. Most space-time clustering approaches require individual patient data. To protect patient privacy, we have extended these approaches to aggregated data and have embedded this extension in a sequential probability ratio test (SPRT) framework. The real-time and sequential nature of health data makes the SPRT an ideal candidate. The result of space-time clustering gives the statistical significance of a cluster at every location in the surveillance area and can be thought of as a ''health index'' of the people living in this area. As a surrogate for bioterrorism data, we have experimented with two flu data sets. For both data sets, we show that space-time clustering can detect a flu epidemic 21 to 28 days earlier than a conventional periodic regression technique. We have also tested using simulated anthrax attack data on top of a respiratory illness diagnostic category. Results show that the method detects such an attack as early as the second or third day after infected people begin to show severe symptoms.
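The mechanics of a sequential probability ratio test can be sketched in a few lines. This is an illustrative Wald SPRT on a single daily count stream, not the authors' aggregated space-time implementation; the baseline and outbreak rates are assumed numbers:

```python
import math

# A minimal Wald SPRT sketch (illustrative only): each day's syndromic
# count is assumed Poisson, and the test compares a baseline rate lam0
# against an elevated outbreak rate lam1.
def sprt(counts, lam0=10.0, lam1=15.0, alpha=0.01, beta=0.1):
    upper = math.log((1 - beta) / alpha)  # cross -> declare a signal
    lower = math.log(beta / (1 - alpha))  # cross -> declare baseline
    llr = 0.0
    for day, n in enumerate(counts, start=1):
        # Poisson log-likelihood ratio contribution for one day's count
        llr += n * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return ("signal", day)
        if llr <= lower:
            return ("baseline", day)
    return ("undecided", len(counts))

# Three near-baseline days followed by a sustained rise in counts
print(sprt([12, 11, 10, 18, 22, 25]))  # -> ('signal', 5)
```

The sequential structure is what makes the test attractive for surveillance: it accumulates evidence day by day and stops as soon as either boundary is crossed, rather than waiting for a fixed observation window.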
Thermoluminescent dosimeters (TLDs), particularly CaF{sub 2}:Mn, are often used as photon dosimeters in mixed (n/{gamma}) field environments. In these mixed field environments, it is desirable to separate the photon response of a dosimeter from the neutron response. For passive dosimeters that measure an integral response, such as TLDs, the separation of the two components must be performed by post-experiment analysis because the TLD reading system cannot distinguish between photon- and neutron-produced responses. Using a model of an aluminum-equilibrated TLD-400 chip, a systematic effort has been made to analytically determine the various components that contribute to the neutron response of a TLD reading. The calculations were performed for five measured reactor neutron spectra and one theoretical thermal neutron spectrum. The five measured reactor spectra all have dosimetry-quality experimental values for aluminum-equilibrated TLD-400 chips. Calculations were used to determine the percentage of the total TLD response produced by neutron interactions in the TLD and aluminum equilibrator. These calculations will aid the Sandia National Laboratories Radiation Metrology Laboratory (SNL-RML) in the interpretation of the uncertainty for TLD dosimetry measurements in the mixed field environments produced by SNL reactor facilities.
Buried landmines are often detected by mine detection dogs through the chemical signature in the air above the soil surface. Environmental processes play a significant role in the chemical signature available for detection. Because of the shallow burial depth of landmines, the weather influences the release of chemicals from the landmine, their transport through the soil to the surface, and degradation processes in the soil. The effect of weather on the chemical signature from a PMN landmine was evaluated with the T2TNT code for Kabul, Afghanistan. Results for TNT and DNT gas-phase and soil solid-phase concentrations are presented as a function of time of day and time of year.
This report describes research and development of methods to couple vastly different subsystems and physical models and to encapsulate these methods in a Java{trademark}-based framework. The work described here focused on developing a capability to enable design engineers and safety analysts to perform multifidelity, multiphysics analyses more simply. In particular this report describes a multifidelity algorithm for thermal radiative heat transfer and illustrates its performance. Additionally, it describes a module-based computer software architecture that facilitates multifidelity, multiphysics simulations. The architecture is currently being used to develop an environment for modeling the effects of radiation on electronic circuits in support of the FY 2003 Hostile Environments Milestone for the Accelerated Strategic Computing Initiative.
The quantitative analysis of microstructure and sequence distribution in polysiloxane copolymers using high-resolution solution {sup 29}Si NMR is reported. Copolymers containing dimethylsiloxane (DMS) and diphenylsiloxane (DPS) monomer units prepared with either high vinyl content (HVM) or low vinyl content (LVM) were analyzed. The average run length (R{sub exp}) and the number average sequence lengths (l{sub A}, l{sub B}), along with the various linkage probabilities (p{sub AA}, p{sub AB}, p{sub BA}, and p{sub BB}), were determined for different production lots of the LVM97 and HVM97 samples to address the lot variability of microstructure in these materials.
In an effort to recruit and retain skilled workers in the Manufacturing Science and Technology Center (14000), an innovative and highly diverse team at Sandia National Laboratories and the U.S. Department of Energy joined with concerned community constituents, such as Albuquerque Technical Vocational Institute and the Albuquerque Public Schools, to offer mentoring and on-the-job training to qualified students in high schools and community colleges. Now, within several years of its inception, the educational program, called the Advanced Manufacturing Trades Training Program, is a model in the community and the nation, while enabling Sandia to have valuable trained and skilled employees to meet its national mission and workforce demands.
This manual describes the use of the Xyce Parallel Electronic Simulator code for simulating electrical circuits at a variety of abstraction levels. The Xyce Parallel Electronic Simulator has been written to support, in a rigorous manner, the simulation needs of the Sandia National Laboratories electrical designers. As such, the development has focused on improving the capability over the current state-of-the-art in the following areas: (1) Capability to solve extremely large circuit problems by supporting large-scale parallel computing platforms (up to thousands of processors). Note that this includes support for most popular parallel and serial computers. (2) Improved performance for all numerical kernels (e.g., time integrator, nonlinear and linear solvers) through state-of-the-art algorithms and novel techniques. (3) A client-server or multi-tiered operating model wherein the numerical kernel can operate independently of the graphical user interface (GUI). (4) Object-oriented code design and implementation using modern coding practices that ensure that the Xyce Parallel Electronic Simulator will be maintainable and extensible far into the future. The code is a parallel code in the most general sense of the phrase--a message passing parallel implementation--which allows it to run efficiently on the widest possible number of computing platforms. These include serial, shared-memory and distributed-memory parallel as well as heterogeneous platforms. Furthermore, careful attention has been paid to the specific nature of circuit-simulation problems to ensure that optimal parallel efficiency is achieved even as the number of processors grows. Another feature required by designers is the ability to add device models, many specific to the needs of Sandia, to the code. To this end, the device package in the Xyce Parallel Electronic Simulator is designed to support a variety of device model inputs.
These input formats include standard analytical models, behavioral models, and look-up tables. Combined with this flexible interface is an architectural design that greatly simplifies the addition of circuit models. One of the most important contributions Xyce makes to the designers at Sandia National Laboratories is in providing a platform for computational research and development aimed specifically at the needs of the Laboratory. With Xyce, Sandia now has an ''in-house'' capability with which both new electrical (e.g., device model development) and algorithmic (e.g., faster time-integration methods) research and development can be performed. Furthermore, these capabilities will then be migrated to the end users.
Fixtures are tools used to hold parts in specific positions and orientations so that certain manufacturing steps can be carried out within required accuracies. Despite the importance of fixtures in the production of expensive devices at Sandia National Laboratories, there is little in-house expertise in mathematical design issues associated with fixtures. As a result, fixtures typically do not work as intended when they are first manufactured. Thus, an inefficient and expensive trial-and-error approach must be utilized. This design methodology adversely impacts important mission duties of Sandia National Laboratories, such as the production of neutron generators. The work performed under the support of this LDRD project took steps toward providing mechanical designers with software tools based on rigorous analytical techniques for dealing with fixture stability and tolerance stack-up.
This report presents a detailed multi-methods comparison of the spatial errors associated with finite difference, finite element, and finite volume semi-discretizations of the scalar advection-diffusion equation. The errors are reported in terms of non-dimensional phase and group speeds, discrete diffusivity, artificial diffusivity, and grid-induced anisotropy. It is demonstrated that Fourier analysis (also known as von Neumann analysis) provides an automatic process for separating the spectral behavior of the discrete advective operator into its symmetric dissipative and skew-symmetric advective components. Further, it is demonstrated that streamline upwind Petrov-Galerkin and its control-volume finite element analogue, streamline upwind control-volume, produce both an artificial diffusivity and an artificial phase speed in addition to the usual semi-discrete artifacts observed in the discrete phase speed, group speed, and diffusivity. For each of the numerical methods considered, asymptotic truncation error and resolution estimates are presented for the limiting cases of pure advection and pure diffusion. The Galerkin finite element method and its streamline upwind derivatives are shown to exhibit super-convergent behavior in terms of phase and group speed when a consistent mass matrix is used in the formulation. In contrast, the CVFEM and its streamline upwind derivatives yield strictly second-order behavior. While this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common mathematical framework.
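The symmetric/skew-symmetric splitting via Fourier analysis can be sketched for a simple scheme. The example below uses first-order upwind advection, chosen here for illustration and not one of the report's methods:

```python
import numpy as np

# von Neumann (Fourier) analysis sketch for first-order upwind
# advection (an assumed illustrative scheme).  For the mode
# exp(i*k*x) on a grid of spacing h, the semi-discrete operator
# -a*(u_j - u_{j-1})/h has the symbol -(a/h)*(1 - exp(-i*theta)),
# theta = k*h.  The skew-symmetric (imaginary) part gives the
# discrete phase speed; the symmetric (real) part gives the
# artificial diffusivity.
a, h = 1.0, 1.0
theta = np.linspace(1e-6, np.pi, 200)            # resolved wavenumbers k*h
symbol = -(a / h) * (1.0 - np.exp(-1j * theta))

phase_speed = -symbol.imag / (theta / h)          # c* = -Im(symbol)/k
diffusivity = -symbol.real / (theta / h) ** 2     # nu* = -Re(symbol)/k^2

# Well-resolved limits: c* -> a, and nu* -> a*h/2, the classic
# first-order upwind artificial diffusivity.
print(phase_speed[0], diffusivity[0])
```

Plotting `phase_speed` and `diffusivity` against `theta` reproduces, for this toy scheme, exactly the kind of non-dimensional phase-speed and artificial-diffusivity curves the report uses to compare discretizations.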
BURNCAL is a Fortran computer code designed to aid in analysis, prediction, and optimization of fuel burnup performance in a nuclear reactor. The code uses output parameters generated by the Monte Carlo neutronics code MCNP to determine the isotopic inventory as a function of time and power density. The code allows for multiple fueled regions to be analyzed. The companion code, RELOAD, can be used to shuffle fueled regions or reload regions with fresh fuel. BURNCAL can be used to study the reactivity effects and isotopic inventory as a function of time for a nuclear reactor system. Neutron transmutation, fission, and radioactive decay are included in the modeling of the production and removal terms for each isotope of interest. For a fueled region, neutron transmutation, fuel depletion, fission-product poisoning, actinide generation, and burnable poison loading and depletion effects are included in the calculation. Fueled and un-fueled regions, such as cladding and moderator, can be analyzed simultaneously. The nuclides analyzed are limited only by the neutron cross section availability in the MCNP cross-section library. BURNCAL is unique in comparison to other burnup codes in that it does not use the calculated neutron flux as input to other computer codes to generate the nuclide mixture for the next time step. Instead, BURNCAL directly uses the neutron absorption tally/reaction information generated by MCNP for each nuclide of interest to determine the nuclide inventory for that region. This allows for the full capabilities of MCNP to be incorporated into the calculation and a more accurate and robust analysis to be performed.
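The production/removal balance at the heart of a depletion calculation can be sketched for a single nuclide. The flux, cross section, and density below are illustrative one-group numbers, not MCNP tallies:

```python
import math

# A one-nuclide depletion sketch in the spirit of BURNCAL's
# production/removal balance (illustrative one-group numbers):
#   dN/dt = production - (lambda + sigma_a * phi) * N
# For a stable nuclide with no production term, the solution is a
# simple exponential burnout under constant flux.
PHI = 3.0e13          # assumed one-group flux, n/(cm^2 s)
SIGMA_A = 600.0e-24   # assumed absorption cross section, cm^2
N0 = 1.0e21           # initial atom density, atoms/cm^3

def depleted(n0, days):
    """N(t) = N0 * exp(-sigma_a * phi * t)."""
    t = days * 86400.0
    return n0 * math.exp(-SIGMA_A * PHI * t)

frac = depleted(N0, 365.0) / N0
print(frac)  # roughly 0.57 of the nuclide survives one year at this flux
```

A real burnup step couples many such equations through production terms (transmutation chains, fission yields, decay), which is why BURNCAL draws the per-nuclide reaction rates directly from MCNP tallies rather than from a collapsed flux.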
An analysis was conducted of the potential for unmanned and unattended robotic technologies to provide forward-based, immediate response capabilities that enable access and controlled task performance. The authors analyzed high-impact response scenarios in conjunction with homeland security organizations, such as the NNSA Office of Emergency Response, the FBI, the National Guard, and the Army Technical Escort Unit, to cover a range of radiological, chemical, and biological threats. They conducted an analysis of the potential of forward-based, unmanned and unattended robotic technologies to accelerate and enhance emergency and crisis response by Homeland Defense organizations. Response system concepts were developed utilizing new technologies supported by existing base technologies to meet the defined response scenarios. These systems would pre-position robotic and remote sensing capabilities close to multiple sites for immediate action. Analysis of the assembled systems included experimental activities to determine potential efficacy in the response scenarios, as well as iteration on system concepts and on remote sensing and robotic technologies, creating new immediate response capabilities for Homeland Defense.
The SNL/NM FY2001 SWEIS Annual Review discusses changes in facilities and facility operations that have occurred in selected and notable facilities since source data were collected for the SNL/NM SWEIS (DOE/EIS-0281). The following information is presented: (1) An updated overview of SNL/NM selected and notable facilities and infrastructure capabilities. (2) An overview of SNL/NM environment, safety, and health programs, including summaries of the purpose, operations, activities, hazards, and hazard controls at relevant facilities and risk management methods for SNL/NM. (3) Updated base year activities data, projections of FY2003 and FY2008 activities, together with related inventories, material consumption, emissions, waste, and resource consumption. (4) Appendices summarizing activities and related hazards at SNL/NM individual special, general, and highbay laboratories, and chemical purchases.
Despite many decades of jet-in-crossflow experimentation, a distinct lack of data remains for a supersonic jet exhausting into a subsonic compressible crossflow. The present investigation seeks to address this deficiency by examining the flowfield structure of a Mach 3.73 jet injected transversely from a flat plate into a subsonic compressible freestream. The experimental results described herein include the mean surface pressure field as mapped using static pressure taps on the flat plate and an identification of flow features by employing an oil-based surface flow tracer. The possibility of flow separation within the nozzle itself also is addressed using pressure taps along the nozzle interior wall, as is the asymmetry of the separation line due to the variation of the local backpressure around the perimeter of the nozzle orifice resulting from the jet-in-crossflow interaction. Pressure data both on the flat plate and within the nozzle are presented at numerous angles with respect to the crossflow freestream direction to provide a breadth of measurements throughout the interaction region. Since the data are intended for use in validating computational models, attention is paid to providing details regarding the experimental geometry, boundary conditions, flowfield nonuniformities, and uncertainty analyses. Eight different sets of data are provided, covering a range of values of the jet-to-freestream dynamic pressure ratio from 2.8 to 16.9 and a freestream Mach number range of 0.5 to 0.8.
ALEGRA is an arbitrary Lagrangian-Eulerian finite element code that emphasizes large distortion and shock propagation. This document describes the user input language for the code.
The most widely used algorithm for estimating seismic event hypocenters and origin times is iterative linear least squares inversion. In this paper we review the mathematical basis of the algorithm and discuss the major assumptions made during its derivation. We go on to explore the utility of using Levenberg-Marquardt damping to improve the performance of the algorithm in cases where some of these assumptions are violated. We also describe how location parameter uncertainties are calculated. A technique to estimate an initial seismic event location is described in an appendix.
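One Levenberg-Marquardt-damped update of the iterative inversion can be sketched directly. The Jacobian and residuals below are made-up numbers for illustration, not a real travel-time dataset:

```python
import numpy as np

# One damped least-squares (Levenberg-Marquardt) update for event
# location, sketched with made-up numbers: G holds travel-time
# partial derivatives with respect to (x, y, z, t0), and r holds the
# observed-minus-predicted arrival times.
def lm_step(G, r, damping):
    """Solve (G^T G + damping*I) dm = G^T r for the model update dm."""
    GtG = G.T @ G
    return np.linalg.solve(GtG + damping * np.eye(GtG.shape[0]), G.T @ r)

rng = np.random.default_rng(0)
G = rng.normal(size=(8, 4))   # 8 arrivals, 4 hypocenter/origin-time params
r = rng.normal(size=8)

# damping -> 0 recovers the undamped Gauss-Newton step; large damping
# shrinks the update toward a small steepest-descent-like step.
print(np.linalg.norm(lm_step(G, r, 1e-8)),
      np.linalg.norm(lm_step(G, r, 1e3)))
```

The damping term is what rescues the iteration when the linearization assumptions break down, e.g. for poorly conditioned station geometries where the undamped normal equations would produce wild steps.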
Mild detonating fuse is an extruded aluminum tube that contains explosive material. Fuse prepared by a new supplier (Company B) exhibited a formability problem and was analyzed to determine the source of the problem. The formability problem was associated with cracking of the aluminum tube when it was bent around a small radius. Mild detonating fuse prepared by the existing supplier (Company A) did not exhibit a formability problem. The two fuses were prepared using different aluminum alloys. The microstructure and chemical composition of the two aluminum alloys were compared. It was found that the microstructure of the Company A aluminum exhibited clear signs of dynamic recrystallization while the Company B aluminum did not. Recrystallization removes the dislocations associated with work hardening and dramatically improves formability. Comparison of the chemical composition of the two aluminum alloys revealed that the Company A aluminum contained significantly lower levels of impurity elements (specifically Fe and Si) than the Company B aluminum. It was concluded that the formability problem exhibited by the Company B material can be solved by using an aluminum alloy with low impurity content, such as 1190-H18 or 1199-O.
The Blade Manufacturing Improvement Project explores new, unique, and improved materials integrated with innovative manufacturing techniques that promise substantial economic enhancements for the fabrication of wind turbine blades. The primary objectives are to promote the development of advanced wind turbine blade manufacturing in ways that lower blade costs, cut rotor weight, reduce turbine maintenance costs, improve overall turbine quality, and increase ongoing production reliability. Foam Matrix (FMI) has developed a wind turbine blade with an engineered foam core, incorporating advanced composite materials and using Resin Transfer Molding (RTM) processes to form a monolithic blade structure in a single molding tool. Patented techniques are employed to increase blade load-bearing capability and ensure the uniform quality of the manufactured blade. In production quantities, FMI manufacturing innovations may return a sizable per-blade cost reduction when compared to the cost of producing comparable blades with conventional methods.
A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits from this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings).
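The stochastic workflow, sampling uncertain inputs, running the model per realization, and reading performance off an empirical CDF, can be sketched compactly. Everything below (the surrogate "model" and the input distributions) is hypothetical, not the Monticello calculations:

```python
import numpy as np

# A minimal sketch of the probabilistic performance-assessment
# workflow (hypothetical surrogate model and input distributions):
# sample uncertain inputs, evaluate the model once per realization,
# and summarize the metric as an empirical CDF.
rng = np.random.default_rng(7)
N = 2000

# Illustrative uncertain inputs for a cover-percolation surrogate
k_sat = rng.lognormal(mean=-18.0, sigma=1.0, size=N)   # m/s
infil_frac = rng.uniform(0.01, 0.05, size=N)           # dimensionless

# Stand-in performance model: annual percolation flux (arbitrary units)
percolation = k_sat * 3.15e7 + 0.001 * infil_frac

# Empirical CDF: sort the realizations; percentiles read off directly
flux_sorted = np.sort(percolation)
cdf = np.arange(1, N + 1) / N
p95 = np.quantile(percolation, 0.95)  # value compared to a criterion
print(p95 > np.median(percolation))
```

A regulator-facing assessment would compare a chosen upper percentile of each metric's CDF against its performance objective, which is exactly how the uncertainty quantification feeds the design comparisons described above.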
The objective of this heat transfer and fluid flow study is to assess the ability of a computational fluid dynamics (CFD) code to reproduce the experimental results, numerical simulation results, and heat transfer correlation equations developed in the literature for natural convection heat transfer within the annulus of horizontal concentric cylinders. In the literature, a variety of heat transfer expressions have been developed to compute average equivalent thermal conductivities. However, these expressions have been developed primarily for very small inner and outer cylinder radii and gap widths. In this comparative study, interest is focused primarily on large gap widths (on the order of half a meter or greater) and large radius ratios. From the steady-state CFD analysis, it is found that the concentric cylinder models for the larger geometries compare favorably to the results of the Kuehn and Goldstein correlations in the Rayleigh number range of about 10{sup 5} to 10{sup 8} (a range that encompasses the laminar-to-turbulent transition). For Rayleigh numbers greater than 10{sup 8}, both numerical simulations and experimental data (from the literature) are consistent and result in slightly lower equivalent thermal conductivities than those obtained from the Kuehn and Goldstein correlations.
A study has been completed into the reliability, availability, and serviceability (RAS) features necessary for Massively Parallel Processor (MPP) systems. As part of this research, a use case model was built of how RAS features would be employed in an operational MPP system. Use cases are an effective way to specify requirements so that all involved parties can easily understand them. This technique contrasts with laundry lists of requirements, which are prone to misunderstanding because they lack context. As documented in the use case model, the study included a look at incorporating system software and end-user applications, as well as hardware, into the RAS system.
Three years of large-scale PDE-constrained optimization research and development are summarized in this report. We have developed an optimization framework for 3 levels of SAND optimization and developed a powerful PDE prototyping tool. The optimization algorithms have been interfaced and tested on CVD problems using a chemically reacting fluid flow simulator resulting in an order of magnitude reduction in compute time over a black box method. Sandia's simulation environment is reviewed by characterizing each discipline and identifying a possible target level of optimization. Because SAND algorithms are difficult to test on actual production codes, a symbolic simulator (Sundance) was developed and interfaced with a reduced-space sequential quadratic programming framework (rSQP++) to provide a PDE prototyping environment. The power of Sundance/rSQP++ is demonstrated by applying optimization to a series of different PDE-based problems. In addition, we show the merits of SAND methods by comparing seven levels of optimization for a source-inversion problem using Sundance and rSQP++. Algorithmic results are discussed for hierarchical control methods. The design of an interior point quadratic programming solver is presented.
A distributed reconfigurable micro-robotic system is a collection of an unlimited number of small, homogeneous robots designed to autonomously organize and reorganize in order to achieve mission-specified geometric shapes and functions. This project investigated the design, control, and planning issues for self-configuring and self-organizing robots. In 2D, a system consisting of two robots was prototyped and successfully demonstrated automatic docking/undocking, allowing the robots to operate jointly or independently. Additional modules were constructed to demonstrate the usefulness of a self-configuring system in various situations. In 3D, a self-reconfiguring robot system of four identical modules was built. Each module connects to its neighbors using rotating actuators, and an individual module can move in three dimensions on its neighbors. We have also built a self-reconfiguring system consisting of a 9-module Crystalline Robot, in which each module is actuated by expansion/contraction. The system is fully distributed, has local (neighbor-to-neighbor) communication capabilities, and has global sensing capabilities.
A computational method for the prediction of the bursting frequency associated with the coherent streamwise structures in high-speed compressible turbulent boundary layers is presented. The structures are described as wavelike disturbances of the turbulent mean flow. A direct resonance theory is used to determine the frequency of bursting. The resulting hydrodynamic linear stability equations are discretized by using a Chebyshev collocation method. A global numerical method capable of resolving the entire eigenvalue spectrum is used. Realistic turbulent mean velocity and temperature profiles are applied. For all of the compressible turbulent boundary layers calculated, the results show at least one frequency that satisfies the resonance condition. A second frequency can be identified for cases with high Reynolds numbers. An estimate is also made for the profile distribution of the temperature disturbance.
This report describes the development of an ultra-low power spread spectrum receiver based on a programmable surface acoustic wave (SAW) correlator. This work was funded under LDRD 02-26573, Ultra-Low Power Spread Spectrum Receiver. The approach taken in this project uses direct demodulation of a radio frequency (RF) signal from carrier frequency to data frequency. This approach was taken to reduce power consumption and size. The design is based on the technique of correlating the received RF signal with the preprogrammed spreading code. The system requirements, applications, design methodology, and testing results are all documented in the following pages.
This document is the final report for PSP project No. 14402-10-02, entitled ''Improved Manufacturing of MC4531 Mold Bodies Using High-Speed Machining (HSM)''. The basic physics of high-speed machining is discussed in detail, including multiple vibrational mode machining systems (milling and turning) and the effect of spindle speed regulation on maximizing the depth of cut and metal removal rate of a machining operation. The topics of cutting tests and tap tests are also discussed, as well as the use of the HSM assistance software ''Harmonizer''. Results of the application of HSM to the machining of encapsulation molds are explained in detail, including cutting test results, new tool speeds and feeds, dimensional and surface finish measurements, and a comparison to the original machining operations and cycle times. A 38% improvement in cycle time is demonstrated while achieving a 50% better surface finish than required.
Wireless communication plays an increasing role in military, industrial, public safety, and academic computer networks. Although in general, radio transmitters are not currently permitted in secured areas at Sandia, wireless communications would open new opportunities, allowing mobile and pervasive user access. Without wireless communications, we must live in a ''non-mainstream'' world of fixed, wired networks, where it becomes ever more difficult to attract and retain the best professionals. This report provides a review of the current state of wireless communications, which direction wireless technology is heading, and where wireless technology could be employed at Sandia. A list of recommendations on harnessing the power of wireless communications is provided to aid in building a state-of-the-art communication environment for the 21st century at Sandia.
In 1993, the Government Performance and Results Act (GPRA, PL 103-62) was enacted. GPRA, which applies to all federal programs, has three components: strategic plans, annual performance plans, and metrics to show how well annual plans are being followed. As part of meeting the GPRA requirement in FY2002, a 15-member external review committee chaired by Dr. Alvin Trivelpiece (the Trivelpiece Committee) was convened by Sandia National Laboratories (SNL) on May 7-9, 2002 to review Sandia National Laboratories' Pulsed Power Programs as a component of the Performance Appraisal Process negotiated with the National Nuclear Security Administration of the Department of Energy (NNSA/DOE). The scope of the review included activities in high energy density physics (HEDP), inertial confinement fusion (ICF), radiation/weapon physics, the petawatt laser initiative (PW) and fast ignition, equation-of-state studies, radiation effects science and lethality, x-ray radiography, ZR development, basic research and pulsed power technology research and development, as well as electromagnetics and work for others. In his charge to the Committee, Dr. Jeffrey P. Quintenz, Director of Pulsed Power Sciences (Org. 1600), asked that the evaluation and feedback be based on three criteria: (1) quality of technical activities in science, technology, and engineering; (2) programmatic performance, management, and planning; and (3) relevance to national needs and agency missions. In addition, the director posed specific programmatic questions. The accompanying report, produced as a SAND document, presents the Committee's findings.
In high-consequence systems, all layers of the protocol stack need security features. If network and data-link layer control messages are not secured, a network may be open to adversarial manipulation. The open nature of the wireless channel makes wireless mobile ad hoc networks (MANETs) especially vulnerable to control-plane manipulation. The objective of this research is to investigate MANET performance issues when cryptographic processing delays are applied at the data-link layer. The results of analysis are combined with modeling and simulation experiments to show that network performance in MANETs is highly sensitive to the cryptographic overhead.
The Green Zia Environmental Excellence Program is a voluntary program designed to support and assist New Mexico businesses to achieve environmental excellence through the development of an environmental management system (EMS). Since 2000, organizations within Sandia National Laboratories (SNL) have participated in the program. SNL's Pollution Prevention (P2) program supports and assists SNL organizations by utilizing Green Zia tools to aid in the implementation of each organization's EMS. This report is based on a feedback session held in September 2002 with past SNL Green Zia Program participants. The goal of the feedback session and of this report is to enhance the services that the P2 Program provides to SNL organizations. This report summarizes the feedback received.
A novel electrical-impedance tomography (EIT) diagnostic system, including hardware and software, has been developed and used to quantitatively measure material distributions in multiphase flows within electrically-conducting (i.e., industrially relevant or metal) vessels. The EIT system consists of energizing and measuring electronics and seven ring electrodes, which are equally spaced on a thin nonconducting rod that is inserted into the vessel. The vessel wall is grounded and serves as the ground electrode. Voltage-distribution measurements are used to numerically reconstruct the time-averaged impedance distribution within the vessel, from which the material distributions are inferred. Initial proof-of-concept and calibration was completed using a stationary solid-liquid mixture in a steel bench-top standpipe. The EIT system was then deployed in Sandia's pilot-scale slurry bubble-column reactor (SBCR) to measure material distributions of gas-liquid two-phase flows over a range of column pressures and superficial gas flow rates. These two-phase quantitative measurements were validated against an established gamma-densitometry tomography (GDT) diagnostic system, demonstrating agreement to within 0.05 volume fraction for most cases, with a maximum difference of 0.15 volume fraction. Next, the EIT system was combined with the GDT system to measure material distributions of gas-liquid-solid three-phase flows in Sandia's SBCR for two different solids loadings. Accuracy for the three-phase flow measurements is estimated to be within 0.15 volume fraction. The stability of the energizing electronics, the effect of the rod on the surrounding flow field, and the unsteadiness of the liquid temperature all degrade measurement accuracy and need to be explored further. This work demonstrates that EIT may be used to perform quantitative measurements of material distributions in multiphase flows in metal vessels.
In this report we describe the performance of the ALEGRA shock wave physics code on a set of gas dynamic shock reflection problems that have associated experimental pressure data. These reflections cover three distinct regimes of oblique shock reflection in gas dynamics--regular, Mach, and double Mach reflection. For the selected data, the use of an ideal gas equation of state is appropriate, thus simplifying to a considerable degree the task of validating the shock wave computational capability of ALEGRA in the application regime of the experiments. We find good agreement of ALEGRA with reported experimental data for sufficient grid resolution. We discuss the experimental data, the nature and results of the corresponding ALEGRA calculations, and the implications of the presented experiment--calculation comparisons.
Sandia National Laboratories, New Mexico (SNL/NM) is a government-owned, contractor-operated facility overseen by the U.S. Department of Energy (DOE), National Nuclear Security Administration (NNSA) through the Albuquerque Operations Office (AL), Office of Kirtland Site Operations (OKSO). Sandia Corporation, a wholly-owned subsidiary of Lockheed Martin Corporation, operates SNL/NM. Work performed at SNL/NM is in support of the DOE and Sandia Corporation's mission to provide weapon component technology and hardware for the needs of the nation's security. Sandia Corporation also conducts fundamental research and development (R&D) to advance technology in energy research, computer science, waste management, microelectronics, materials science, and transportation safety for hazardous and nuclear components. In support of Sandia Corporation's mission, the Integrated Safety and Security (ISS) Center and the Environmental Restoration (ER) Project at SNL/NM have established extensive environmental programs to assist Sandia Corporation's line organizations in meeting all applicable local, state, and federal environmental regulations and DOE requirements. This annual report summarizes data and the compliance status of Sandia Corporation's environmental protection and monitoring programs through December 31, 2001. Major environmental programs include air quality, water quality, groundwater protection, terrestrial surveillance, waste management, pollution prevention (P2), environmental remediation, oil and chemical spill prevention, and the National Environmental Policy Act (NEPA). Environmental monitoring and surveillance programs are required by DOE Order 5400.1, General Environmental Protection Program (DOE 1990) and DOE Order 231.1, Environment, Safety, and Health Reporting (DOE 1996).
We present a set of novel design principles to aid in the development of complex collective behaviors in fleets of mobile robots. The key elements are: the use of a graph algorithm we have created, with certain proven properties, that guarantees scalable local communications for fleets of arbitrary size; the use of artificial forces to simplify the design of motion control; and the use of certain proximity values in the graph algorithm to simplify the sharing of robust navigation and sensor information among the robots. We describe these design elements and present a computer simulation that illustrates the behaviors readily achievable with these design tools.
Sandia Corporation (a subsidiary of Lockheed Martin Corporation), through its contract with the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA), operates the Tonopah Test Range (TTR) in Nevada. Westinghouse Government Service, TTR's operations and maintenance contractor, performs nearly all environmental program functions. This Annual Site Environmental Report (ASER), which is published to inform the public about environmental conditions at TTR, describes environmental protection programs and summarizes the compliance status with major environmental laws and regulations during Calendar Year (CY) 2001.
The National Nuclear Security Administration is creating a ''Knowledge Base'' to store technical information to support the United States nuclear explosion monitoring mission. This guide is intended to be used by researchers who wish to contribute their work to the ''Knowledge Base''. It provides definitions of the kinds of data sets or research products in the ''Knowledge Base'', acceptable data formats, and templates to complete to facilitate the documentation necessary for the ''Knowledge Base''.
The process of developing the National Nuclear Security Administration (NNSA) Knowledge Base (KB) must result in high-quality Information Products in order to support activities for monitoring nuclear explosions consistent with United States treaty and testing moratoria monitoring missions. The validation, verification, and management of the Information Products is critical to successful scientific integration, and hence, will enable high-quality deliveries to be made to the United States National Data Center (USNDC) at the Air Force Technical Applications Center (AFTAC). As an Information Product passes through the steps necessary to become part of a delivery to AFTAC, domain experts (including technical KB Working Groups that comprise NNSA and DOE laboratory staff and the customer) will provide coordination and validation, where validation is the determination of relevance and scientific quality. Verification is the check for completeness and correctness, and will be performed by both the Knowledge Base Integrator and the Scientific Integrator with support from the Contributor providing two levels of testing to assure content integrity and performance. The Information Products and their contained data sets will be systematically tracked through the integration portion of their life cycle. The integration process, based on lessons learned during its initial implementations, is presented in this report.
The Navruz Project is a cooperative, transboundary, river monitoring project involving rivers and institutions in Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan facilitated by Sandia National Laboratories in the U.S. The Navruz Project focuses on waterborne radionuclides and metals because of their importance to public health and nuclear materials proliferation concerns in the region. Data obtained in this project are shared among all participating countries and the public through an internet web site and are available for use in further studies and in regional transboundary water resource management efforts. Overall, the project addresses three main goals: to help increase capabilities in Central Asian nations for sustainable water resources management; to provide a scientific basis for supporting nuclear transparency and non-proliferation in the region; and to help reduce the threat of conflict in Central Asia over water resources, proliferation concerns, or other factors. The Navruz project has a duration of three years. This document contains the reports from each of the participating institutions following the first year of data collection. While a majority of samples from the Navruz project are within normal limits, a preliminary analysis does indicate a high concentration of selenium in the Kazakhstan samples. Uzbekistan samples contain high uranium and thorium concentrations, as well as elevated levels of chromium, antimony and cesium. Additionally, elevated concentrations of radioactive isotopes have been detected at one Tajikistan sampling location. Further analysis will be published in a subsequent report.
The Entero Software Project emphasizes flexibility, integration, and scalability in modeling complex engineering systems. The GUIGenerator project supports the Entero environment by providing a user-friendly graphical representation of systems, mutable at runtime. The first phase requires a formal language specification describing the syntax and semantics of the Extensible Markup Language (XML) elements to be utilized, depicted through an XML schema. Given a system, front-end user interaction with stored system data occurs through Java Graphical User Interfaces (GUIs), where often only subsets of system data require user input. The second phase demands interpreting well-formed XML documents into predefined graphical components, including the addition of fixed components not represented in systems, such as buttons. The conversion process utilizes the critical features of JDOM, a Java-based XML parser, and Core Java Reflection, an advanced Java feature that generates objects at runtime using XML input data. Finally, a searching mechanism provides the capability of referencing specific system components through a combination of established search engine techniques and regular expressions, useful for altering visual properties of output. The GUIGenerator will be used to create user interfaces for the Entero environment's code coupling in support of the ASCI Hostile Environments Level 2 milestones in 2003.
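The second-phase conversion (parse the XML, then reflectively instantiate the graphical component named by each element) can be illustrated with a Python analogue of the JDOM-plus-reflection approach. The Panel, TextField, and Button classes here are hypothetical stand-ins for the generated GUI components, not the Entero classes.

```python
import xml.etree.ElementTree as ET

# Hypothetical widget classes standing in for predefined GUI components.
class Panel:
    def __init__(self, **attrs): self.attrs, self.children = attrs, []
class TextField:
    def __init__(self, **attrs): self.attrs, self.children = attrs, []
class Button:
    def __init__(self, **attrs): self.attrs, self.children = attrs, []

def build(element):
    # Reflection step: look up the class named by the element's tag and
    # instantiate it at runtime with the element's attributes as input.
    cls = globals()[element.tag]
    widget = cls(**element.attrib)
    for child in element:
        widget.children.append(build(child))
    return widget

doc = """<Panel title="System Data">
           <TextField name="pressure"/>
           <Button label="Apply"/>
         </Panel>"""
ui = build(ET.fromstring(doc))
```

In the Java version described above, the same lookup would go through Core Java Reflection (`Class.forName` and constructor invocation) on a JDOM document tree rather than through a module-level name table.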
Imaging systems such as Synthetic Aperture Radar collect band-limited data from which an image of a target scene is rendered. The band-limited nature of the data generates sidelobes, or ''spilled energy'' most evident in the neighborhood of bright point-like objects. It is generally considered desirable to minimize these sidelobes, even at the expense of some generally small increase in system bandwidth. This is accomplished by shaping the spectrum with window functions prior to inversion or transformation into an image. A window function that minimizes sidelobe energy can be constructed based on prolate spheroidal wave functions. A parametric design procedure allows doing so even with constraints on allowable increases in system bandwidth. This approach is extended to accommodate spectral notches or holes, although the guaranteed minimum sidelobe energy can be quite high in this case. Interestingly, for a fixed bandwidth, the minimum-mean-squared-error image rendering of a target scene is achieved with no windowing at all (rectangular or boxcar window).
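The sidelobe trade described above can be seen in a toy discrete example. The Hann taper used here is a common textbook window standing in for the prolate-spheroidal designs of the report; the window length and padding are arbitrary.

```python
import math, cmath

def dft_mag(w, pad=512):
    # Zero-padded DFT magnitude, with enough resolution to see sidelobes.
    return [abs(sum(w[k] * cmath.exp(-2j * math.pi * j * k / pad)
                    for k in range(len(w))))
            for j in range(pad)]

N = 16
rect = [1.0] * N                                           # no windowing
hann = [0.5 - 0.5 * math.cos(2 * math.pi * k / (N - 1))    # a simple taper
        for k in range(N)]

def peak_sidelobe_db(w):
    mag = dft_mag(w)
    # Walk down the mainlobe to its first null, then take the tallest
    # remaining lobe (up to the Nyquist index) relative to the peak.
    j = 1
    while j + 1 < len(mag) and mag[j + 1] < mag[j]:
        j += 1
    side = max(mag[j:len(mag) // 2])
    return 20 * math.log10(side / mag[0])

rect_psl = peak_sidelobe_db(rect)   # near -13 dB
hann_psl = peak_sidelobe_db(hann)   # substantially lower
```

The tapered window buys its lower sidelobes with a wider mainlobe, which is exactly the bandwidth cost the parametric design procedure in the report is meant to control.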
The Zone 4 Stage Right Shielded Lift Trucks (SLTs) will likely need refurbishment or replacement within the next two to five years due to wear. This document discusses the options to provide a long-term and reliable means of satisfying Zone 4 material movement and inventory requirements.
The National Nuclear Security Administration is creating a Knowledge Base to store technical information to support the United States nuclear explosion monitoring mission. This document defines the core database tables that are used in the Knowledge Base. The purpose of this document is to present the ORACLE database tables in the NNSA Knowledge Base that are based on modifications to the CSS3.0 Database Schema developed in 1990 (Anderson et al., 1990). These modifications include additional columns in the affiliation table, an increase in the internal ORACLE format from 8 integers to 9 integers for thirteen IDs, and new primary and unique key definitions for six tables. The document is intended to be used as a reference by researchers inside and outside of NNSA/DOE as they compile information to submit to the NNSA Knowledge Base. These ''core'' tables are separated into two groups. The Primary tables are dynamic and consist of information that can be used in automatic and interactive processing (e.g., arrivals, locations). The Lookup tables change infrequently and hold auxiliary information used by the processing. In general, the information stored in the core tables consists of: arrivals; events, origins, and associations of arrivals; magnitude information; station information (networks, site descriptions, instrument responses); pointers to waveform data; and comments pertaining to the information. This document is divided into four sections, the first being this introduction. Section two defines the sixteen tables that make up the core tables of the NNSA Knowledge Base database. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. In addition, the primary, unique, and foreign keys are defined. Section three of the document shows the relationships between the different tables using entity-relationship diagrams. The last section defines the columns or attributes of the various tables; for each attribute, the information included is the Not Applicable (NA) value, the format of the data, and the applicable range.
The authors present a VHDL design that incorporates optimizations intended to provide digital signature generation with as little power, space, and time as possible. These three primary objectives of power, size, and speed must be balanced along with other important goals, including flexibility of the hardware and ease of use. The highest-level function offered by their hardware design is Elliptic Curve Optimal El Gamal digital signature generation. The parameters are defined over the finite field GF(2{sup 178}), which gives security that is roughly equivalent to that provided by 1500-bit RSA signatures. The optimizations include using the point-halving algorithm for elliptic curves, field towers to speed up the finite field arithmetic in general, and further enhancements of basic finite field arithmetic operations. The result is a synthesized VHDL digital signature design (using a CMOS 0.5 {micro}m, 5 V, 25 C library) of 191,000 gates that generates a signature in 4.4 ms at 20 MHz.
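The shift-and-reduce finite-field arithmetic underlying such a design can be sketched in software. For brevity, this sketch uses the 8-bit field GF(2{sup 8}) with the well-known AES reduction polynomial rather than GF(2{sup 178}) or the field-tower construction of the report.

```python
def gf_mul(a, b, poly=0x11B, m=8):
    # Multiply two field elements: shift-and-add (XOR) over GF(2),
    # reducing by the irreducible polynomial whenever bit m is set.
    r = 0
    for _ in range(m):
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
        if a & (1 << m):
            a ^= poly
    return r

def gf_inv(a, poly=0x11B, m=8):
    # Inverse via Fermat's little theorem: a^(2^m - 2), square-and-multiply.
    r, e = 1, (1 << m) - 2
    while e:
        if e & 1:
            r = gf_mul(r, a, poly, m)
        a = gf_mul(a, a, poly, m)
        e >>= 1
    return r
```

In hardware, the same per-bit XOR-and-reduce structure maps naturally onto shift registers, which is why enhancements to these basic operations pay off directly in gate count and cycle time.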
This paper is the latest in a series of papers that attempt to relate the multiple observed gas breakdown phenomena to useful switch design parameters. This series started when the gas density, not the gas type, was observed to give a power-law relationship between the breakdown time delays of several gases and the applied electric field. This paper will show that this triggering or breakdown-initiating process is similar to, if not the same as, a corona discharge. A hypothesis is made, and a simple voltage breakdown relationship is shown to exist for sharp-electrode-to-plane air gaps between 1 and 1000 cm.
Specially designed synchronous ac generators can provide a high-energy pulsed power source capable of supplying energy to various pulse forming networks. One such generator, which is the subject of this paper, is presently being used as the prime power source for the Repetitive High Energy Pulsed Power Module (RHEPP) at Sandia National Laboratories. The generator has been designed to operate continuously in two distinct modes. In the first mode, the generator can supply 50-kJ, 9.5-kV, 11,000-amp, 1-msec pulses continuously (500 kW average power) at a rep rate from 1 to 10 Hz. In the second mode, 20.8-kJ, 9.5-kV, 1052-amp, 4-msec pulses can be supplied continuously (5000 kW average power) at a rep rate of 240 pulses per second. The latter mode is being used in the RHEPP application at a reduced energy and voltage level. The generator was successfully tested in September 1989 to verify its performance at its maximum rating. Test results are presented along with details of the generator design and its applications.
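The quoted average-power figures follow directly from pulse energy times repetition rate; a quick consistency check:

```python
# Average power = pulse energy x repetition rate, for each quoted mode.
mode1_avg_w = 50e3 * 10       # 50 kJ/pulse at 10 Hz   -> 500 kW
mode2_avg_w = 20.8e3 * 240    # 20.8 kJ/pulse at 240 pps -> ~5000 kW
```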
Microwave two-port S-parameter measurements and modeling were performed on superconducting flux flow transistors. These transistors, based on the magnetic control of flux flow in an array of high-temperature superconducting weak links, can exhibit significant available power gain at microwave frequencies (over 20 dB at 7-10 GHz in some devices). The input impedance is largely inductive, while the output impedance is resistive and inductive. These devices are potentially useful in numerous applications, including matched amplifiers.
Pulse transformer conceptual design and system studies were conducted at the Westinghouse Science and Technology Center for the Sandia National Laboratories' Repetitive High Energy Pulsed Power (RHEPP) System. The RHEPP system relies on magnetic switches to achieve pulse compression from 120 Hz ac to microsecond pulses. A 600 kW, 120 Hz Westinghouse alternator supplies ac prime power at 10 kV (rms). Two magnetic switching stages will compress the pulses to 115 usec prior to the pulse transformer. The transformer steps the voltage up to 254 kV. The pulse transformer has an 18:1 turns ratio and is capable of continuous duty operation. System studies were conducted to minimize transformer loss and leakage inductance within transformer size constraints. The optimized design had a 3-step nickel iron core with 9 primary turns.
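The stated 254 kV output is consistent with applying the 18:1 turns ratio to the peak of the 10 kV (rms) input; a back-of-the-envelope check, which assumes no voltage gain or loss in the intervening compression stages:

```python
import math

# Peak of the 10 kV (rms) alternator output, stepped up by the 18:1 ratio.
v_rms = 10e3
v_peak = v_rms * math.sqrt(2)     # about 14.1 kV peak
v_secondary = 18 * v_peak         # about 254 kV, matching the stated output
```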
Flash x-ray sources at Sandia National Laboratories routinely test the hardness of electronic components to simulated threat spectra. While it is traditional to calculate the x-ray spectra produced in a given exposure from measurements of free-field dose, current, and voltage, these experimental quantities may not be accurately known for some source geometries. It is appropriate, therefore, to include a direct measurement of the x-ray spectrum for such tests. Random error propagation and unfold accuracy have been studied for the spectral unfold method used in the x-ray absorption spectrometer reported by Carlson. This system of 13 measurements and 30 spectral bins (0.01 -- 8 MeV) is underdetermined; a trial spectrum prevents unphysical solutions. Accuracy of the unfold was tested with simulated data from known spectra; the unfolds agreed with the known spectra to better than 10% from 0.05 MeV to near the endpoint. Error propagation was studied by perturbing the input data randomly and unfolding the resulting data sets. In each unfold energy bin, the standard deviation was taken as the propagated error. Above 0.05 MeV, the unfold roughly doubled the input errors. The trial spectrum affects the unfold accuracy more strongly than the propagated errors.
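The error-propagation procedure described (randomly perturb the input data, unfold each perturbed set, and take the per-bin standard deviation) can be sketched generically. The linear "unfold" and the numbers below are illustrative stand-ins, not the Carlson spectrometer response.

```python
import random, statistics

def toy_unfold(meas):
    # Stand-in for the actual spectral unfold: a fixed linear map from
    # three detector channels to two spectral bins (illustrative only).
    a, b, c = meas
    return [0.5 * a + 0.5 * b, b - 0.2 * c]

random.seed(0)
data = [10.0, 8.0, 5.0]          # nominal channel readings (made up)
rel_err = 0.05                   # assumed 5% relative input uncertainty

trials = []
for _ in range(2000):
    perturbed = [x * random.gauss(1.0, rel_err) for x in data]
    trials.append(toy_unfold(perturbed))

# Propagated error: per-bin standard deviation over the perturbed unfolds.
errors = [statistics.stdev(bin_vals) for bin_vals in zip(*trials)]
```

Comparing these per-bin standard deviations to the assumed input errors is what leads to statements like "the unfold roughly doubled the input errors."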
A non-toxic, non-corrosive aqueous foam with enhanced physical stability for the rapid mitigation and decontamination of CBW agents has been developed at Sandia. This technology is attractive for the protection of Nuclear Weapons facilities as well as for civilian and military applications for several reasons: (1) it requires minimal logistics support, (2) a single decon solution can be used for both CW and BW agents, (3) mitigation of agents can be accomplished in bulk, aerosol, and vapor phases, (4) it can be deployed rapidly, (5) it causes minimal health effects and collateral damage, (6) it is relatively inexpensive, and (7) it has minimal run-off of fluids and no lasting environmental impact. The foam can be delivered by a range of methods, including systems that yield properties desirable for fire suppression foams. Although the foam's effectiveness against CBW agents is well established, the additional capability of being used for fire suppression would provide a dual-use capability. If the foam can suppress and control fires, it could significantly enhance the level of protection for critical nuclear weapon facilities, in that existing foam-based fire suppression systems could then provide the additional protection of decontamination and CBW agent removal. Fire suppression properties of the foam were investigated with the assistance of the Southwest Research Institute Department of Fire Technology, in conjunction with EnviroFoam Technologies, Inc., a technology licensee.
This report presents the results of a study of various wind turbine blade design parameters as a function of blade length in the range from 30 meters to 70 meters. The results have been summarized in dimensional and non-dimensional formats to aid in interpretation. The parametric review estimated peak power and annual energy capture for megawatt scale wind turbines with rotors of 62, 83, 104, 125, and 146 meters in diameter. The baseline ''thin'' distribution represents conventional airfoils used in large wind turbine blades. The ''thicker'' and ''thickest'' distributions utilize airfoils that have significantly increased thickness to improve structural performance and reduce weight. An aerodynamic scaling effort was undertaken in parallel with the structural analysis work to evaluate the effect of extreme thickness on aerodynamic characteristics. Increased airfoil section thickness appears to be a key tool in limiting blade weight and cost growth with scale. Thickened and truncated trailing edges in the inboard region provide strong, positive effects on blade structural performance. Larger blades may require higher tip speeds combined with reduced blade solidity to limit growth of design loads. A slender blade can be used to reduce extreme design loads when the rotor is parked, but requires a higher tip speed.
On September 13, 2001, the first day after the attacks of September 11 that Sandia National Laboratories re-opened, Vice President Gerry Yonas entirely redirected the efforts of his organization, the Advanced Concepts Group (ACG), to the problem of terrorism. For the next several weeks, the ACG focused on trying to better characterize the international terrorist threat and the vulnerabilities of the US to further attacks. This work culminated in a presentation by Dr. Yonas to the Fall Leadership Focus meeting at Sandia National Laboratories on October 22. Following that meeting, President and Lab Director Paul Robinson asked Dr. Yonas and the ACG to develop a long-term (3-5 year) technology roadmap showing how Sandia could direct its efforts toward making major contributions to the success of the nation's war on terrorism. The ACG effort would coordinate with other Laboratory activities working on near-term responses to Federal calls for technological support. The ACG study was conducted in two phases. The first, more exploratory, stage divided the terrorism challenge into three broad parts, each examined by a team that included both permanent ACG staff and part-time staff and consultants from other Sandia organizations. The ''Red'' team looked at the problems of finding and stopping terrorists before they strike (or strike again). The ''Yellow'' team studied the problems of protecting people and facilities from terrorist attacks, as well as those of responding to attacks that occur. The ''Green'' team attempted to understand the long-term, ''root'' causes of terrorism, and how technology might help ameliorate the conditions that lead people to support, or even become, terrorists. In addition, a ''Purple'' team worked with the other teams to provide an integrating vision for them all, to help make appropriate connections among them, and to see that they left no important gaps between them.
The findings of these teams were presented to a broad representation of laboratory staff and management on January 3, 2002. From the many ideas explored by the Red, Yellow, and Green teams, and keeping in mind criteria formulated by the Purple team, the ACG assembled a set of five major technology development goals. These goals, if pursued, could lead to major contributions to the war on terrorism. With some rearrangement of team members and coordinators, a new set of teams began fleshing out these five ''Big Hairy Audacious Goals'' for the consideration of Laboratory leadership. Dr. Yonas briefed Sandia upper management on the work of these teams on February 4, 2002. This report presents the essence of that work as applicable to the R&D community of the nation interested in the development of better tools for a long-term ''War on Terrorism.''
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithm-driven, multi-spectral approach to point-to-point navigation characterized by segmented on-board trajectory planning, self-contained operation without human support for the mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor able to distinguish true obstacles that must be avoided as a function of vehicle scale, still requires substantial research to bring to fruition.
Over the past several decades, the development of computer models to predict the atmospheric transport of hazardous material across a local (on the order of 10s of km) to mesoscale (on the order of 100s of km) region has received considerable attention, both for regulatory purposes and to guide emergency response teams. Wind inputs to these models cover a spectrum of sophistication and required resources. At one end is the interpolation/extrapolation of available observations, which can be done rapidly, but at the risk of missing important local phenomena. Such a model can also only describe the wind at the time the observations were made. At the other end are sophisticated numerical solutions based on so-called Primitive Equation models. These prognostic models (so named because in principle they can forecast future conditions) contain the most physics, but can easily consume tens of hours, if not days, of computer time. They may also require orders of magnitude more effort to set up, as both boundary and initial conditions on all the relevant variables must be supplied. The subject of this report is two classes of models intermediate in sophistication between the interpolated and prognostic ends of the spectrum. The first, known as mass-consistent (sometimes referred to as diagnostic) models, attempts to strike a compromise between simple interpolation and the complexity of the Primitive Equation models by satisfying only the conservation of mass (continuity) equation. The second class considered here consists of the so-called linear models, which purport to satisfy both mass and momentum balances. A review of the published literature on these models over the past few decades was performed. Though diagnostic models use a variety of approaches, they tend to fall into a relatively few well-defined categories. Linear models, on the other hand, follow a more uniform methodology, though they differ in detail. 
The discussion considers the theoretical underpinnings of each category of the diagnostic models, and of the linear models, in order to assess the advantages and disadvantages of each. It is concluded that diagnostic models are the better suited of the two for predicting the atmospheric dispersion of hazardous materials in emergency response scenarios, as the linear models can accommodate only gently-sloping terrain and are predicated on several simplifying approximations that can be difficult to justify a priori. Of the various approaches used in diagnostic modeling, that based on the calculus of variations appears to be the most objective, in that it introduces the fewest arbitrary parameters. The strengths and weaknesses of models in this category, as they relate to the activities of Sandia's Nuclear Emergency Support Team (NEST), are further highlighted.
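The calculus-of-variations approach can be sketched in its standard form from the mass-consistent modeling literature (a generic textbook formulation, not necessarily the exact system of any particular model reviewed here). One minimizes the departure of the adjusted wind field $(u,v,w)$ from the observed/interpolated field $(u_0,v_0,w_0)$, subject to the continuity constraint enforced by a Lagrange multiplier $\lambda$:

```latex
J(u,v,w,\lambda) = \int_V \left[ \alpha_1^2\left((u-u_0)^2 + (v-v_0)^2\right)
  + \alpha_2^2 (w-w_0)^2
  + \lambda \left( \frac{\partial u}{\partial x}
                 + \frac{\partial v}{\partial y}
                 + \frac{\partial w}{\partial z} \right) \right] dV
```

The Euler-Lagrange equations give the adjusted components directly in terms of $\lambda$,

```latex
u = u_0 + \frac{1}{2\alpha_1^2}\frac{\partial \lambda}{\partial x}, \qquad
v = v_0 + \frac{1}{2\alpha_1^2}\frac{\partial \lambda}{\partial y}, \qquad
w = w_0 + \frac{1}{2\alpha_2^2}\frac{\partial \lambda}{\partial z},
```

and substituting these into the continuity equation yields a single elliptic (Poisson-type) equation for $\lambda$. The weights $\alpha_1, \alpha_2$ are the only free parameters, which is the sense in which this approach introduces the fewest arbitrary parameters.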
The purpose of this biological assessment is to review the proposed continued operation of Sandia National Laboratories, California (SNL/CA) in sufficient detail to determine to what extent the proposed action may affect the species listed below. This assessment is prepared in accordance with Section 7 of the Endangered Species Act [16 U.S.C. 1536 (c)].
We propose a novel algorithm based on Principal Component Analysis (PCA). First, we present an interesting approximation of PCA using Gram-Schmidt orthonormalization. Next, we combine our approximation with the kernel functions from Support Vector Machines (SVMs) to provide a nonlinear generalization of PCA. After benchmarking our algorithm in the linear case, we explore its use in both the linear and nonlinear cases. We include applications to face data analysis, handwritten digit recognition, and fluid flow.
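The combination of PCA with SVM kernel functions described above is commonly known as kernel PCA. The following is a minimal, self-contained sketch of the standard eigendecomposition form for reference (it is not the authors' Gram-Schmidt approximation, and the RBF kernel and parameter values are assumptions for illustration):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project the rows of X onto the leading kernel principal components."""
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    # Center the kernel matrix in feature space.
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Symmetric eigendecomposition; eigh returns ascending eigenvalues.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors by sqrt(eigenvalue) to get the projections.
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (50, 2)
```

With a linear kernel (K = X X^T) this reduces to ordinary PCA, which is the sense in which kernelization provides a nonlinear generalization.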
The long-range objective of this study was to develop chemically assisted technologies for removing heels from tanks. In FY 01, the first two steps toward this objective were taken: (1) catalogue the occurrence and nature of tank heels and assess which materials are available for study and (2) develop methods for synthesizing non-radioactive surrogate heel materials for use in testing potential removal technologies. The chief finding of Task 1 was that the existence of ''heels'' depends on the definition used. Hard materials that would be almost impossible to remove by sluicing are all but absent from the records of both Savannah River and Hanford. Historical usage suggests that the term ''heel'' may also apply to chunky, granular, or semi-solid pasty accumulations. These materials are documented and may also be difficult to remove by conventional sluicing technologies. Such heels may be comprised of normal sludge components, dominantly iron and aluminum hydroxides, or they may result from added materials that were not part of the normal fuel reprocessing operations: Portland cement, diatomaceous earth, sand and soil, and spent zeolite ion exchange ''resins''. The occurrence and chemistry of the most notable ''heel'', that of the zeolite mass in Tank 19F at Savannah River, is reviewed in some detail. Secondly, no clear correlation was found between high tank temperatures and difficulties encountered in removing materials from a tank at a later date; nor did the sludges from these tanks give any indication of being particularly solid. Experimental studies to develop synthetic heel materials were carried out using a number of different approaches. For normal sludge materials, settling, even when assisted by a centrifuge, proved ineffective. The same result was obtained from drying sludge samples. Even exposing sludges to a molten salt melt at 233 C produced only a fine powder, rather than a resilient ceramic that resisted disaggregation. 
A cohesive material, however, was produced by wicking the pore fluid out of a sludge gel (into packed diatomaceous earth) while simultaneously applying pressure to compact the sludge as it dehydrated. Osmotic gradients could provide the same function as the capillary forces provided by the diatomaceous earth sorbent placed in contact with the sludge. Tests on the anomalous materials added to the tanks all indicated potential problems. Hard granules, and perhaps chunks, may be encountered where Portland cement was added to a tank. Sand, spent zeolite resin, and diatomaceous earth will all react with the tank fluids to produce a sodalite/cancrinite material. The degree of reaction determines whether the grains become cemented together. SRS activities showed that heels formed when spent zeolites were added to tanks can be readily dislodged, and it is expected that heels from sand would possess equal or less cohesion. Diatomaceous earth may form more resilient crusts or masses. To summarize, the existence of ''hard'' heels has yet to be documented. A broader definition suggests inclusion of poorly cohesive cancrinite-cemented masses and dense paste-like accumulations of abnormally compacted ''normal'' sludges. Chemical treatments to remove these materials must focus on agents that are active against aluminosilicates and hydrous oxides of iron and aluminum. Exploiting the high pore-water content of these materials may provide a second avenue for dislodging such accumulations. Techniques were developed to produce synthetic sludges on which various removal technologies could be tried.
As part of the U.S. Department of Energy's Wind Partnerships for Advanced Component Technologies (WindPACT) program, Global Energy Concepts LLC (GEC) is performing a study concerning innovations in materials, processes, and structural configurations for application to wind turbine blades in the multi-megawatt range. The project team for this work includes experts in all areas of wind turbine blade design, analysis, manufacture, and testing. Constraints to cost-effective scaling-up of the current commercial blade designs and manufacturing methods are identified, including self-gravity loads, transportation, and environmental considerations. A trade-off study is performed to evaluate the incremental changes in blade cost, weight, and stiffness for a wide range of composite materials, fabric types, and manufacturing processes. Fiberglass/carbon fiber hybrid blades are identified as having a promising combination of cost, weight, stiffness, and fatigue resistance. Vacuum-assisted resin transfer molding, resin film infusion, and pre-impregnated materials are identified as having benefits in reduced volatile emissions, higher fiber content, and improved laminate quality relative to the baseline wet lay-up process. Alternative structural designs are identified, including jointed configurations to facilitate transportation. Based on the results to date, recommendations are made for further evaluation and testing under this study to verify the predicted material and structural performance.
Curie Point pyrolysis-gas chromatography was investigated for use as a tool for characterization of aged ammonium perchlorate-based composite propellants (1). Successful application of the technique will support the surveillance program for the Explosives Materials and Subsystems Department (1). Propellant samples were prepared by separating the propellant into reacted (oxidized) and unreacted zones. The experimental design included the determination of system reliability, followed by reproducibility, sample preparation, and analysis of pyrolysis products. Polystyrene was used to verify the reliability of the system and showed good reproducibility. Application of the technique to the propellant samples showed high variation in the data. Modifications to sample preparation did not enhance the reproducibility. It was determined that the high concentration of ammonium perchlorate in the propellant matrix was compromising the repeatability of the analysis.
The WIPP Case Study describes the compliance monitoring program, record keeping requirements, and passive institutional controls that are used to help ensure the Waste Isolation Pilot Plant (WIPP) will safely contain radioactive waste and indicate the dangers and location of the wastes. The radioactive components in the waste are regulated by the U.S. Environmental Protection Agency (EPA), while the hazardous components in the waste are regulated by the New Mexico Environment Department (NMED). This paper addresses monitoring relating to radionuclide containment performance, passive institutional controls, and record keeping over a 10,000-year time frame. Monitoring relating to the hazardous components and the associated regulator is not addressed in this paper. The WIPP containment performance is mandated by release limits set by regulation. Regulations also require the radioactive waste containment performance of the WIPP to be predicted by a ''Performance Assessment.'' The EPA did not base its acceptance of the WIPP solely on predicted containment but included additional assurance measures. One such assurance measure is monitoring, which may be defined as the ongoing measurement of conditions in and around the repository. This case study describes the evolution of the WIPP monitoring program as the WIPP project progressed through the planning, site characterization, regulatory promulgation, and eventual operational stages that spanned a period of over 25 years. Included are discussions of the regulatory requirements for monitoring, selection of monitoring parameters, trigger values used to identify unexpected conditions, assessment of monitoring data against the trigger values, and plans for post-closure monitoring. The EPA established the requirements for Passive Institutional Controls (PICs) for disposal sites. 
The requirements state that a disposal site must be designated by the most permanent markers, records, and other passive institutional controls practicable to indicate the dangers of the wastes and their location. The PIC Task Force assessed the effectiveness of PICs in deterring inadvertent human intrusion and developed a conceptual design for permanently marking the Waste Isolation Pilot Plant (WIPP), establishing records, and identifying other practicable controls to indicate the dangers of the wastes and their location. The marking system should provide information regarding the location, design, contents, and hazards associated with the WIPP. This paper discusses these controls, including markers, records, archives, and government ownership and land-use restrictions.
This document contains an updated list of common acronyms, initialisms, and abbreviations used at Sandia. It will be published in an electronic format only. It can be retrieved from HTTPS://wfsprod01.sandia.gov/groups/srn-uscitizens/documents/document/wfs048643.pdf.
This report summarizes an investigation of the use of high-gain Photo-Conductive Semiconductor Switch (PCSS) technology for a deployable impulse source. This includes a discussion of viability, packaging, and antennas. High gain GaAs PCSS-based designs offer potential advantages in terms of compactness, repetition rate, and cost.
The mathematical description of acoustic wave propagation within a time- and space-varying, moving, linear viscous fluid is formulated as a system of coupled linear equations. This system is rigorously developed from fundamental principles of continuum mechanics (conservation of mass, balance of linear and angular momentum, balance of entropy) and various constitutive relations (for stress, entropy production, and entropy conduction) by linearizing all expressions with respect to the small-amplitude acoustic wavefield variables. A significant simplification arises if the fluid medium is neither viscous nor heat conducting (i.e., an ideal fluid). In this case the mathematical system can be reduced to a set of five coupled, first-order partial differential equations. Coefficients in these systems depend on various mechanical and thermodynamic properties of the ambient medium that supports acoustic wave propagation. These material properties cannot all be arbitrarily specified, but must satisfy another system of nonlinear expressions characterizing the dynamic behavior of the background medium. Dramatic simplifications in both systems occur if the ambient medium is simultaneously adiabatic and stationary.
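For the ideal-fluid case, a five-equation system of this kind takes a form similar to the standard linearized Euler equations sketched below (a textbook form with primed perturbation quantities about an ambient state $\rho_0$, $\mathbf{v}_0$, $p_0$; the report's exact system and notation may differ):

```latex
% Linearized mass conservation (one equation):
\frac{\partial \rho'}{\partial t}
  + \nabla \cdot \left( \rho_0 \mathbf{v}' + \rho' \mathbf{v}_0 \right) = 0
% Linearized momentum balance (three component equations):
\rho_0 \left( \frac{\partial \mathbf{v}'}{\partial t}
  + (\mathbf{v}_0 \cdot \nabla)\mathbf{v}'
  + (\mathbf{v}' \cdot \nabla)\mathbf{v}_0 \right)
  + \rho' (\mathbf{v}_0 \cdot \nabla)\mathbf{v}_0
  = -\nabla p'
% Adiabatic pressure-density relation closing the system (one equation):
\frac{\partial p'}{\partial t} + \mathbf{v}_0 \cdot \nabla p'
  = c^2 \left( \frac{\partial \rho'}{\partial t}
             + \mathbf{v}_0 \cdot \nabla \rho' \right)
```

When the ambient medium is stationary ($\mathbf{v}_0 = 0$) and adiabatic, these collapse to $\partial p'/\partial t + \rho_0 c^2 \nabla \cdot \mathbf{v}' = 0$ and $\rho_0\, \partial \mathbf{v}'/\partial t = -\nabla p'$, the classical acoustic wave system, illustrating the dramatic simplification noted above.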
Borehole radar systems can provide essential subsurface structural information for environmental evaluation, geotechnical analysis, or energy exploration. Sandia developed a prototype continuous-wave Borehole Radar (BHR) in 1996, and development of a practical tool has been continuing at a Russian institute under a Sandia contract. The BHR field experiments, which were planned for the summer of 2001 in Russia, provided a unique opportunity to evaluate the latest Sandia algorithms with actual field data. A new three-dimensional code was developed to enable the analysis of BHR data on modest-sized desktop workstations. The code is based on the staggered grid, finite difference technique, and eliminates 55% of the massive storage associated with solving the system of finite-difference linear equations. The code was used to forward-model the Russian site geometry and placement of artificial targets to anticipate any problems that might arise when the data was received. Technical software and equipment problems in the Russian field tests, conducted in August 2001, invalidated all but one of the data sets. However, more field tests with improved equipment and software are planned for 2002, and analysis of that data will be presented in a future report.
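The staggered-grid finite-difference technique underlying such codes can be illustrated with a one-dimensional Yee-style leapfrog update, in which the two field components live on grids offset by half a cell and half a time step. This is a toy sketch with normalized, assumed parameters, not the Sandia 3D code:

```python
import numpy as np

# 1D staggered-grid (Yee) leapfrog update: ez lives on integer grid
# points, hy on half-integer points. Parameters are normalized and
# chosen for illustration only.
nz, nt = 200, 300
c = 1.0                 # normalized wave speed
dz = 1.0                # cell size
dt = 0.5 * dz / c       # Courant-stable time step (Courant number 0.5)
ez = np.zeros(nz)
hy = np.zeros(nz - 1)

for n in range(nt):
    # Update hy from the spatial difference of ez (half step offset).
    hy += (dt / dz) * (ez[1:] - ez[:-1])
    # Update interior ez from the spatial difference of hy.
    ez[1:-1] += (dt / dz) * (hy[1:] - hy[:-1])
    # Soft Gaussian source injected at the center of the grid.
    ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)

print(float(np.max(np.abs(ez))) > 0.0)  # True: the pulse has propagated
```

The staggering is what keeps the scheme second-order accurate without storing both fields at every grid point, which is related to the kind of storage savings cited for the 3D code.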
The Gulf of Mexico (GoM) is the most active deepwater region in the world and provides some of the greatest challenges in scope and opportunity for the oil and gas industry. The complex geologic settings and significant water and reservoir depths necessitate high development costs, in addition to requiring innovative technology. The investment costs are substantial: because of the extreme water depths (up to 8000 feet) and considerable reservoir depths (to 30,000 feet below mudline), the cost of drilling a single well can be upwards of 50 to 100 million dollars. Central, therefore, to successful economic exploitation are developments with a minimum number of wells combined with a well service lifetime of twenty to thirty years. Many of the wells that are planned for the most significant developments will penetrate thick salt formations, and the combined drilling costs for these fields are estimated in the tens of billions of dollars. In May 2001, Sandia National Laboratories initiated a Joint Industry Project focused on the identification, quantification, and mitigation of potential well integrity issues associated with sub-salt and near-salt deepwater GoM reservoirs. The project is jointly funded by the DOE (Natural Gas and Oil Technology Partnership) and nine oil companies (BHP Billiton Petroleum, BP, ChevronTexaco, Conoco, ExxonMobil, Halliburton, Kerr-McGee, Phillips Petroleum, and Shell). This report provides an assessment of the state of the art of salt mechanics and identifies potential well integrity issues relevant to deepwater GoM field developments. Salt deformation is discussed, and a deformation mechanism map is provided for salt. A bounding steady-state strain rate contour map is constructed for deepwater GoM field developments, and the critical issue of constraint in the subsurface, with its resultant necessity for numerical analyses, is discussed.
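Steady-state salt creep of the kind mapped in such studies is commonly described in the rock-mechanics literature by a thermally activated power law of the form below (a generic textbook form; the report's constitutive model and constants may differ):

```latex
\dot{\varepsilon}_{ss}
  = A \exp\!\left( -\frac{Q}{RT} \right)
    \left( \frac{\sigma}{\mu} \right)^{n}
```

Here $\dot{\varepsilon}_{ss}$ is the steady-state strain rate, $A$ a material constant, $Q$ the activation energy, $R$ the gas constant, $T$ the absolute temperature, $\sigma$ the deviatoric stress, $\mu$ the shear modulus, and $n$ the stress exponent. The strong stress and temperature dependence of this relation is what makes contour maps of bounding strain rate, as a function of depth and thermal gradient, a useful screening tool for well integrity.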
This document describes the 2002 SNL Accelerated Strategic Computing Initiative (ASCI) Applications Software Quality Engineering (SQE) Assessment and the assessment results. The primary purpose of the assessment was to establish the current state of software engineering practices within the SNL ASCI Applications Program.
The Materials Chemistry Department 1846 has developed a lab-scale chem-prep process for the synthesis of PNZT 95/5, a ferroelectric material that is used in neutron generator power supplies. This process (Sandia Process, or SP) has been successfully transferred to and scaled up by Department 14192 (Ceramics and Glass Department) as the Transferred Sandia Process (TSP) to meet Sandia's future supply needs for its neutron generator production responsibilities. In going from the development-size SP batch (1.6 kg/batch) to the production-scale TSP powder batch size (10 kg/batch), it was important to determine whether the scaling process caused any ''performance-critical'' changes in the PNZT 95/5 being produced. One area where a difference was found was in the particle size distributions of the calcined PNZT powders. Documented in this SAND report are the results of an experimental study to determine the origin of the differences in the particle size distributions of the SP and TSP powders.