A novel electrical-impedance tomography (EIT) diagnostic system, including hardware and software, has been developed and used to quantitatively measure material distributions in multiphase flows within electrically conducting (i.e., industrially relevant or metal) vessels. The EIT system consists of energizing and measuring electronics and seven ring electrodes, which are equally spaced on a thin nonconducting rod that is inserted into the vessel. The vessel wall is grounded and serves as the ground electrode. Voltage-distribution measurements are used to numerically reconstruct the time-averaged impedance distribution within the vessel, from which the material distributions are inferred. Initial proof-of-concept and calibration were completed using a stationary solid-liquid mixture in a steel bench-top standpipe. The EIT system was then deployed in Sandia's pilot-scale slurry bubble-column reactor (SBCR) to measure material distributions of gas-liquid two-phase flows over a range of column pressures and superficial gas flow rates. These two-phase quantitative measurements were validated against an established gamma-densitometry tomography (GDT) diagnostic system, demonstrating agreement to within 0.05 volume fraction for most cases, with a maximum difference of 0.15 volume fraction. Next, the EIT system was combined with the GDT system to measure material distributions of gas-liquid-solid three-phase flows in Sandia's SBCR for two different solids loadings. Accuracy for the three-phase flow measurements is estimated to be within 0.15 volume fraction. The stability of the energizing electronics, the effect of the rod on the surrounding flow field, and the unsteadiness of the liquid temperature all degrade measurement accuracy and need to be explored further. This work demonstrates that EIT may be used to perform quantitative measurements of material distributions in multiphase flows in metal vessels.
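For context, a common way in the EIT literature to infer a gas volume fraction from a reconstructed conductivity is Maxwell's mixture relation for a non-conducting dispersed phase; the abstract does not state which relation was used in this work, so the following Python sketch is only illustrative.

```python
import numpy as np

def gas_holdup_maxwell(sigma_mix, sigma_liquid):
    """Gas volume fraction from mixture and liquid conductivities using
    Maxwell's relation for a non-conducting dispersed phase (illustrative)."""
    sigma_mix = np.asarray(sigma_mix, dtype=float)
    return 2.0 * (sigma_liquid - sigma_mix) / (2.0 * sigma_liquid + sigma_mix)

# Example: a 20% drop in local conductivity corresponds to roughly 0.14 gas holdup.
print(gas_holdup_maxwell(0.8, 1.0))
```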
In this report we describe the performance of the ALEGRA shock wave physics code on a set of gas dynamic shock reflection problems that have associated experimental pressure data. These reflections cover three distinct regimes of oblique shock reflection in gas dynamics: regular, Mach, and double Mach reflection. For the selected data, the use of an ideal gas equation of state is appropriate, thus simplifying to a considerable degree the task of validating the shock wave computational capability of ALEGRA in the application regime of the experiments. We find good agreement between ALEGRA and the reported experimental data at sufficient grid resolution. We discuss the experimental data, the nature and results of the corresponding ALEGRA calculations, and the implications of the presented experiment-calculation comparisons.
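The reflection regime (regular versus Mach versus double Mach) is governed by the ideal-gas oblique-shock relations. As a point of reference only, and not as part of ALEGRA itself, the standard theta-beta-Mach relation can be evaluated with a few lines of Python:

```python
import numpy as np

def deflection_angle(beta, mach, gamma=1.4):
    """Ideal-gas theta-beta-Mach relation: flow deflection angle (rad)
    produced by an oblique shock with wave angle beta (rad)."""
    m2 = mach * mach
    num = m2 * np.sin(beta) ** 2 - 1.0
    den = m2 * (gamma + np.cos(2.0 * beta)) + 2.0
    return np.arctan(2.0 / np.tan(beta) * num / den)

# Example: a Mach 2.0 flow with a 40-degree wave angle deflects by about 10.6 degrees.
print(np.degrees(deflection_angle(np.radians(40.0), 2.0)))
```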
Sandia National Laboratories, New Mexico (SNL/NM) is a government-owned, contractor-operated facility overseen by the U.S. Department of Energy (DOE), National Nuclear Security Administration (NNSA) through the Albuquerque Operations Office (AL), Office of Kirtland Site Operations (OKSO). Sandia Corporation, a wholly-owned subsidiary of Lockheed Martin Corporation, operates SNL/NM. Work performed at SNL/NM is in support of the DOE and Sandia Corporation's mission to provide weapon component technology and hardware for the needs of the nation's security. Sandia Corporation also conducts fundamental research and development (R&D) to advance technology in energy research, computer science, waste management, microelectronics, materials science, and transportation safety for hazardous and nuclear components. In support of Sandia Corporation's mission, the Integrated Safety and Security (ISS) Center and the Environmental Restoration (ER) Project at SNL/NM have established extensive environmental programs to assist Sandia Corporation's line organizations in meeting all applicable local, state, and federal environmental regulations and DOE requirements. This annual report summarizes data and the compliance status of Sandia Corporation's environmental protection and monitoring programs through December 31, 2001. Major environmental programs include air quality, water quality, groundwater protection, terrestrial surveillance, waste management, pollution prevention (P2), environmental remediation, oil and chemical spill prevention, and the National Environmental Policy Act (NEPA). Environmental monitoring and surveillance programs are required by DOE Order 5400.1, General Environmental Protection Program (DOE 1990) and DOE Order 231.1, Environment, Safety, and Health Reporting (DOE 1996).
We present a set of novel design principles to aid in the development of complex collective behaviors in fleets of mobile robots. The key elements are: the use of a graph algorithm we have created, with certain proven properties, that guarantees scalable local communications for fleets of arbitrary size; the use of artificial forces to simplify the design of motion control; and the use of certain proximity values in the graph algorithm to simplify the sharing of robust navigation and sensor information among the robots. We describe these design elements and present a computer simulation that illustrates the behaviors readily achievable with these design tools.
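The artificial-force idea can be illustrated with a minimal sketch: an attractive force toward a goal combined with a short-range repulsive force from neighboring robots. The force laws and gains below are illustrative assumptions, not the ones developed in this work.

```python
import numpy as np

def artificial_force(pos, goal, neighbors, k_goal=1.0, k_rep=2.0, r_safe=3.0):
    """Sum an attractive force toward the goal with short-range repulsion
    from neighboring robots (illustrative force laws, not the report's)."""
    force = k_goal * (goal - pos)                     # attraction toward the goal
    for q in neighbors:
        d = pos - q
        dist = np.linalg.norm(d)
        if 1e-9 < dist < r_safe:                      # repel only inside the safety radius
            force += k_rep * (1.0 / dist - 1.0 / r_safe) * d / dist**2
    return force

# One robot at the origin, goal at (10, 0), and a nearby neighbor at (1, 0.5):
print(artificial_force(np.zeros(2), np.array([10.0, 0.0]), [np.array([1.0, 0.5])]))
```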
Sandia Corporation (a subsidiary of Lockheed Martin Corporation), through its contract with the U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA), operates the Tonopah Test Range (TTR) in Nevada. Westinghouse Government Service, TTR's operations and maintenance contractor, performs most environmental program functions. This Annual Site Environmental Report (ASER), which is published to inform the public about environmental conditions at TTR, describes environmental protection programs and summarizes the compliance status with major environmental laws and regulations during Calendar Year (CY) 2001.
The National Nuclear Security Administration is creating a ''Knowledge Base'' to store technical information to support the United States nuclear explosion monitoring mission. This guide is intended to be used by researchers who wish to contribute their work to the ''Knowledge Base''. It provides definitions of the kinds of data sets or research products in the ''Knowledge Base'', acceptable data formats, and templates to complete to facilitate the documentation necessary for the ''Knowledge Base''.
The process of developing the National Nuclear Security Administration (NNSA) Knowledge Base (KB) must result in high-quality Information Products in order to support activities for monitoring nuclear explosions consistent with United States treaty and testing moratoria monitoring missions. The validation, verification, and management of the Information Products is critical to successful scientific integration, and hence, will enable high-quality deliveries to be made to the United States National Data Center (USNDC) at the Air Force Technical Applications Center (AFTAC). As an Information Product passes through the steps necessary to become part of a delivery to AFTAC, domain experts (including technical KB Working Groups that comprise NNSA and DOE laboratory staff and the customer) will provide coordination and validation, where validation is the determination of relevance and scientific quality. Verification is the check for completeness and correctness, and will be performed by both the Knowledge Base Integrator and the Scientific Integrator with support from the Contributor providing two levels of testing to assure content integrity and performance. The Information Products and their contained data sets will be systematically tracked through the integration portion of their life cycle. The integration process, based on lessons learned during its initial implementations, is presented in this report.
The Navruz Project is a cooperative, transboundary, river monitoring project involving rivers and institutions in Kazakhstan, Kyrgyzstan, Tajikistan, and Uzbekistan facilitated by Sandia National Laboratories in the U.S. The Navruz Project focuses on waterborne radionuclides and metals because of their importance to public health and nuclear materials proliferation concerns in the region. Data obtained in this project are shared among all participating countries and the public through an internet web site and are available for use in further studies and in regional transboundary water resource management efforts. Overall, the project addresses three main goals: to help increase capabilities in Central Asian nations for sustainable water resources management; to provide a scientific basis for supporting nuclear transparency and non-proliferation in the region; and to help reduce the threat of conflict in Central Asia over water resources, proliferation concerns, or other factors. The Navruz project has a duration of three years. This document contains the reports from each of the participating institutions following the first year of data collection. While a majority of samples from the Navruz project are within normal limits, a preliminary analysis does indicate a high concentration of selenium in the Kazakhstan samples. Uzbekistan samples contain high uranium and thorium concentrations, as well as elevated levels of chromium, antimony and cesium. Additionally, elevated concentrations of radioactive isotopes have been detected at one Tajikistan sampling location. Further analysis will be published in a subsequent report.
The Entero Software Project emphasizes flexibility, integration and scalability in modeling complex engineering systems. The GUIGenerator project supports the Entero environment by providing a user-friendly graphical representation of systems, mutable at runtime. The first phase requires formal language specification describing the syntax and semantics of the Extensible Markup Language (XML) elements to be utilized, depicted through an XML schema. Given a system, front-end user interaction with stored system data occurs through Java Graphical User Interfaces (GUIs), where often only subsets of system data require user input. The second phase demands interpreting well-formed XML documents into predefined graphical components, including the addition of fixed components, such as buttons, that are not represented in systems. The conversion process utilizes the critical features of JDOM, a Java-based XML parser, and Core Java Reflection, an advanced Java feature that generates objects at runtime using XML input data. Finally, a searching mechanism provides the capability of referencing specific system components through a combination of established search engine techniques and regular expressions, useful for altering visual properties of output. The GUIGenerator will be used to create user interfaces for the Entero environment's code coupling in support of the ASCI Hostile Environments Level 2 milestones in 2003.
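The actual GUIGenerator is written in Java and relies on JDOM and Core Java Reflection; the Python analogue below (with hypothetical component names) is only meant to illustrate the underlying idea of reflecting on XML tag names to instantiate predefined components at runtime.

```python
import sys
import xml.etree.ElementTree as ET
from dataclasses import dataclass

# Hypothetical components; the real project maps XML onto Java GUI widgets.
@dataclass
class TextField:
    name: str
    value: str = ""

@dataclass
class Button:
    name: str
    label: str = "OK"

def build_components(xml_text):
    """Instantiate a component for each XML element by reflecting on its tag name."""
    module = sys.modules[__name__]
    components = []
    for elem in ET.fromstring(xml_text):
        cls = getattr(module, elem.tag)          # reflection: tag name -> class
        components.append(cls(**elem.attrib))    # XML attributes -> constructor arguments
    return components

print(build_components('<gui><TextField name="mass" value="1.0"/><Button name="run"/></gui>'))
```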
Imaging systems such as Synthetic Aperture Radar collect band-limited data from which an image of a target scene is rendered. The band-limited nature of the data generates sidelobes, or ''spilled energy'' most evident in the neighborhood of bright point-like objects. It is generally considered desirable to minimize these sidelobes, even at the expense of some generally small increase in system bandwidth. This is accomplished by shaping the spectrum with window functions prior to inversion or transformation into an image. A window function that minimizes sidelobe energy can be constructed based on prolate spheroidal wave functions. A parametric design procedure allows doing so even with constraints on allowable increases in system bandwidth. This approach is extended to accommodate spectral notches or holes, although the guaranteed minimum sidelobe energy can be quite high in this case. Interestingly, for a fixed bandwidth, the minimum-mean-squared-error image rendering of a target scene is achieved with no windowing at all (rectangular or boxcar window).
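The discrete counterpart of the prolate spheroidal approach is the family of discrete prolate spheroidal (Slepian, or DPSS) windows, which SciPy provides. The sketch below simply compares the peak sidelobe of a DPSS window to that of a rectangular window; it is not the parametric design procedure described in the report.

```python
import numpy as np
from scipy.signal.windows import dpss

def peak_sidelobe_db(window, nfft=1 << 14):
    """Peak sidelobe level (dB relative to the mainlobe) of a window's spectrum."""
    spec = np.abs(np.fft.rfft(window, nfft))
    spec /= spec.max()
    first_null = np.argmax(np.diff(spec) > 0)   # first point where the response turns back up
    return 20.0 * np.log10(spec[first_null:].max())

n = 128
print(peak_sidelobe_db(np.ones(n)))        # rectangular window: roughly -13 dB
print(peak_sidelobe_db(dpss(n, NW=3)))     # DPSS (Slepian) window: far lower sidelobes
```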
The Zone 4 Stage Right Shielded Lift Trucks (SLTs) will likely need refurbishment or replacement within the next two to five years due to wear. This document discusses the options for providing a long-term and reliable means of satisfying Zone 4 material movement and inventory requirements.
The National Nuclear Security Administration is creating a Knowledge Base to store technical information to support the United States nuclear explosion monitoring mission. This document defines the core database tables that are used in the Knowledge Base. The purpose of this document is to present the ORACLE database tables in the NNSA Knowledge Base that are based on modifications to the CSS3.0 Database Schema developed in 1990 (Anderson et al., 1990). These modifications include additional columns to the affiliation table, an increase in the internal ORACLE format from 8 integers to 9 integers for thirteen IDs, and new primary and unique key definitions for six tables. It is intended to be used as a reference by researchers inside and outside of NNSA/DOE as they compile information to submit to the NNSA Knowledge Base. These ''core'' tables are separated into two groups. The Primary tables are dynamic and consist of information that can be used in automatic and interactive processing (e.g., arrivals, locations). The Lookup tables change infrequently and contain auxiliary information used by the processing. In general, the information stored in the core tables consists of: arrivals; events, origins, and associations of arrivals; magnitude information; station information (networks, site descriptions, instrument responses); pointers to waveform data; and comments pertaining to the information. This document is divided into four sections, the first being this introduction. Section two defines the sixteen tables that make up the core tables of the NNSA Knowledge Base database. Both internal (ORACLE) and external formats for the attributes are defined, along with a short description of each attribute. In addition, the primary, unique, and foreign keys are defined. Section three of the document shows the relationships between the different tables by using entity-relationship diagrams. The last section defines the columns or attributes of the various tables. The information included is the Not Applicable (NA) value, the format of the data, and the applicable range for the attribute.
The authors present a VHDL design that incorporates optimizations intended to provide digital signature generation with as little power, space, and time as possible. These three primary objectives of power, size, and speed must be balanced along with other important goals, including flexibility of the hardware and ease of use. The highest-level function offered by their hardware design is Elliptic Curve Optimal El Gamal digital signature generation. The parameters are defined over the finite field GF(2^178), which gives security that is roughly equivalent to that provided by 1500-bit RSA signatures. The optimizations include using the point-halving algorithm for elliptic curves, field towers to speed up the finite field arithmetic in general, and further enhancements of basic finite field arithmetic operations. The result is a synthesized VHDL digital signature design (using a CMOS 0.5 µm, 5 V, 25 °C library) of 191,000 gates that generates a signature in 4.4 ms at 20 MHz.
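Although the report's arithmetic is implemented in VHDL over GF(2^178) using field towers and point halving, the underlying polynomial-basis multiplication in a binary field can be sketched in a few lines of Python. The GF(2^8) parameters below are illustrative defaults, not the field or the optimized arithmetic used in the design.

```python
def gf2m_mul(a, b, m=8, poly=0x11B):
    """Polynomial-basis multiplication in GF(2^m).
    Defaults illustrate GF(2^8) with the AES polynomial x^8 + x^4 + x^3 + x + 1."""
    # carry-less (XOR) multiplication of the two field elements as polynomials
    r = 0
    while b:
        if b & 1:
            r ^= a
        b >>= 1
        a <<= 1
    # reduce modulo the degree-m irreducible polynomial
    for i in range(r.bit_length() - 1, m - 1, -1):
        if (r >> i) & 1:
            r ^= poly << (i - m)
    return r

print(hex(gf2m_mul(0x57, 0x83)))   # 0xc1, the standard GF(2^8) textbook example
```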
This paper is the latest in a series of papers that attempt to relate the multiple observed gas breakdown phenomena to useful switch design parameters. This series started when the gas density, not the gas type, was observed to give a power-law relationship between the breakdown time delays of several gases and the applied electric field. This paper will show that this triggering or breakdown initiating process is similar to, if not the same as, a corona discharge. A hypothesis is made and a simple voltage breakdown relationship is shown to exist for air gaps between 1 and 1000 cm from sharp electrodes to a plane.
Specially designed synchronous ac generators can provide a high energy pulse power source capable of supplying energy to various pulse forming networks. One such generator, which is the subject of this paper, is presently being used as the prime power source for the Repetitive High Energy Pulsed Power Module (RHEPP) at Sandia National Laboratories. The generator has been designed to operate continuously in two distinct modes. In the first mode the generator can supply 50-kJ, 9.5-kV, 11,000-amp, 1-msec pulses continuously (500 kW average power) with a rep rate from 1 to 10 Hz. In the second mode, 20.8-kJ, 9.5-kV, 1052-amp, 4-msec pulses can be supplied continuously (5000 kW average power) at a rep rate of 240 pulses per second. The latter mode is being used in the RHEPP application at a reduced energy and voltage level. The generator was successfully tested in September 1989 to verify the performance at its maximum rating. Test results are presented along with details of the generator design and its applications.
Microwave two-port S-parameter measurements and modeling were performed on superconducting flux flow transistors. These transistors, based on the magnetic control of flux flow in an array of high-temperature superconducting weak links, can exhibit significant available power gain at microwave frequencies (over 20 dB at 7-10 GHz in some devices). The input impedance is largely inductive, while the output impedance is resistive and inductive. These devices are potentially useful in numerous applications, including matched amplifiers.
Pulse transformer conceptual design and system studies were conducted at the Westinghouse Science and Technology Center for the Sandia National Laboratories' Repetitive High Energy Pulsed Power (RHEPP) System. The RHEPP system relies on magnetic switches to achieve pulse compression from 120 Hz ac to microsecond pulses. A 600 kW, 120 Hz Westinghouse alternator supplies ac prime power at 10 kV (rms). Two magnetic switching stages will compress the pulses to 115 usec prior to the pulse transformer. The transformer steps the voltage up to 254 kV. The pulse transformer has an 18:1 turns ratio and is capable of continuous duty operation. System studies were conducted to minimize transformer loss and leakage inductance within transformer size constraints. The optimized design had a 3-step nickel iron core with 9 primary turns.
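The quoted output voltage is consistent with the 18:1 turns ratio if one assumes the magnetic compression stages deliver roughly the peak of the 10 kV (rms) alternator waveform to the transformer primary; this is a back-of-the-envelope check on my part, not a statement from the report:

\[ V_{\text{secondary}} \approx \sqrt{2} \times 10\ \text{kV} \times 18 \approx 254\ \text{kV}. \]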
Flash x-ray sources at Sandia National Laboratories routinely test the hardness of electronic components to simulated threat spectra. While it is traditional to calculate the x-ray spectra produced in a given exposure from measurements of free-field dose, current, and voltage, these experimental quantities may not be accurately known for some source geometries. It is appropriate, therefore, to include a direct measurement of the x-ray spectrum for such tests. Random error propagation and unfold accuracy have been studied for the spectral unfold method used in the x-ray absorption spectrometer reported by Carlson. This system of 13 measurements and 30 spectral bins (0.01-8 MeV) is underdetermined; a trial spectrum prevents unphysical solutions. Accuracy of the unfold was tested with simulated data from known spectra; the unfolds agreed with the known spectra to better than 10% from 0.05 MeV to near the endpoints. Error propagation was studied by perturbing the input data randomly and unfolding the resulting data sets. In each unfold energy bin the standard deviation was taken as the propagated error. Above 0.05 MeV the unfold roughly doubled the input errors. The trial spectrum affects the unfold accuracy more strongly than the propagated errors.
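The perturb-and-unfold error propagation described here is straightforward to outline. In the sketch below the unfold itself is stood in for by a generic non-negative least-squares solve; the actual spectrometer unfold relies on a trial spectrum and is not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

def propagate_unfold_errors(response, data, sigma, n_trials=500, seed=0):
    """Monte Carlo error propagation: perturb the measurements, re-unfold,
    and take the per-bin standard deviation as the propagated error.
    response: (n_meas, n_bins) matrix; data, sigma: measured values and 1-sigma errors."""
    rng = np.random.default_rng(seed)
    unfolds = []
    for _ in range(n_trials):
        perturbed = data + rng.normal(0.0, sigma)
        spectrum, _ = nnls(response, perturbed)   # stand-in for the trial-spectrum unfold
        unfolds.append(spectrum)
    unfolds = np.asarray(unfolds)
    return unfolds.mean(axis=0), unfolds.std(axis=0)
```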
A non-toxic, non-corrosive aqueous foam with enhanced physical stability for the rapid mitigation and decontamination of CBW agents has been developed at Sandia. This technology is attractive for the protection of Nuclear Weapons facilities as well as for civilian and military applications for several reasons, including (1) it requires minimal logistics support, (2) a single decon solution can be used for both CW and BW agents, (3) mitigation of agents can be accomplished in bulk, aerosol, and vapor phases, (4) it can be deployed rapidly, (5) it causes minimal health and collateral damage, (6) it is relatively inexpensive, and (7) it has minimal run-off of fluids and no lasting environmental impact. The foam can be delivered by a range of methods, including systems that yield properties desirable for fire suppression foams. Although the foam's effectiveness against CBW agents is well established, the additional capability of being used for fire suppression would provide a dual-use capability. If the foam can suppress and control fires, it could lead to a significant enhancement of the level of protection for critical nuclear weapon facilities, in that existing foam-based fire suppression systems could then provide the additional protection of decontamination and CBW agent removal. Fire suppression properties of the foam were investigated with the assistance of the Southwest Research Institute Department of Fire Technology in conjunction with EnviroFoam Technologies, Inc., a technology licensee.
This report presents the results of a study of various wind turbine blade design parameters as a function of blade length in the range from 30 meters to 70 meters. The results have been summarized in dimensional and non-dimensional formats to aid in interpretation. The parametric review estimated peak power and annual energy capture for megawatt scale wind turbines with rotors of 62, 83, 104, 125, and 146 meters in diameter. The baseline ''thin'' distribution represents conventional airfoils used in large wind turbine blades. The ''thicker'' and ''thickest'' distributions utilize airfoils that have significantly increased thickness to improve structural performance and reduce weight. An aerodynamic scaling effort was undertaken in parallel with the structural analysis work to evaluate the effect of extreme thickness on aerodynamic characteristics. Increased airfoil section thickness appears to be a key tool in limiting blade weight and cost growth with scale. Thickened and truncated trailing edges in the inboard region provide strong, positive effects on blade structural performance. Larger blades may require higher tip speeds combined with reduced blade solidity to limit growth of design loads. A slender blade can be used to reduce extreme design loads when the rotor is parked, but requires a higher tip speed.
On September 13, 2001, the first day after the attacks of September 11 that Sandia National Laboratories re-opened, Vice President Gerry Yonas entirely redirected the efforts of his organization, the Advanced Concepts Group (ACG), to the problem of terrorism. For the next several weeks, the ACG focused on trying to better characterize the international terrorist threat and the vulnerabilities of the US to further attacks. This work culminated in a presentation by Dr. Yonas to the Fall Leadership Focus meeting at Sandia National Laboratories on October 22. Following that meeting, President and Lab Director Paul Robinson asked Dr. Yonas and the ACG to develop a long-term (3-5 year) technology roadmap showing how Sandia could direct efforts toward making major contributions to the success of the nation's war on terrorism. The ACG effort would communicate with other Laboratory activities working on near-term responses to Federal calls for technological support. The ACG study was conducted in two phases. The first, more exploratory, stage divided the terrorism challenge into three broad parts, each examined by a team that included both permanent ACG staff and part-time staff and consultants from other Sandia organizations. The ''Red'' team looked at the problems of finding and stopping terrorists before they strike (or strike again). The ''Yellow'' team studied the problems of protecting people and facilities from terrorist attacks, as well as those of responding to attacks that occur. The ''Green'' team attempted to understand the long-term, ''root'' causes of terrorism, and how technology might help ameliorate the conditions that lead people to support, or even become, terrorists. In addition, a ''Purple'' team worked with the other teams to provide an integrating vision for them all, to help make appropriate connections among them, and to see that they left no important gaps between them. The findings of these teams were presented to a broad representation of laboratory staff and management on January 3, 2002. From the many ideas explored by the Red, Green, and Yellow teams, and keeping in mind criteria formulated by the Purple team, the ACG assembled a set of five major technology development goals. These goals, if pursued, could lead to major contributions to the war on terrorism. With some rearrangement of team members and coordinators, a new set of teams began fleshing out these five ''Big Hairy Audacious Goals'' for the consideration of Laboratory leadership. Dr. Yonas briefed Sandia upper management on the work of these teams on February 4, 2002. This report presents the essence of that work as applicable to the R&D community of the nation interested in the development of better tools for a long-term ''War on Terrorism.''
This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Robust Planning for Autonomous Navigation of Mobile Robots In Unstructured, Dynamic Environments (AutoNav)''. The project goal was to develop an algorithm-driven, multi-spectral approach to point-to-point navigation characterized by: segmented on-board trajectory planning, self-contained operation without human support for the mission duration, and the development of appropriate sensors and algorithms to navigate unattended. The project was partially successful in achieving gains in sensing, path planning, navigation, and guidance. One of three experimental platforms, the Minimalist Autonomous Testbed, used a repetitive sense-and-re-plan combination to demonstrate the majority of elements necessary for autonomous navigation. However, a critical goal for overall success in arbitrary terrain, that of developing a sensor that is able to distinguish true obstacles that need to be avoided as a function of vehicle scale, still needs substantial research to bring to fruition.
Over the past several decades, the development of computer models to predict the atmospheric transport of hazardous material across a local (on the order of 10s of km) to mesoscale (on the order of 100s of km) region has received considerable attention, for both regulatory purposes, and to guide emergency response teams. Wind inputs to these models cover a spectrum of sophistication and required resources. At one end is the interpolation/extrapolation of available observations, which can be done rapidly, but at the risk of missing important local phenomena. Such a model can also only describe the wind at the time the observations were made. At the other end are sophisticated numerical solutions based on so-called Primitive Equation models. These prognostic models, so-called because in principle they can forecast future conditions, contain the most physics, but can easily consume tens of hours, if not days, of computer time. They may also require orders of magnitude more effort to set up, as both boundary and initial conditions on all the relevant variables must be supplied. The subject of this report is two classes of models intermediate in sophistication between the interpolated and prognostic ends of the spectrum. The first, known as mass-consistent (sometimes referred to as diagnostic) models, attempt to strike a compromise between simple interpolation and the complexity of the Primitive Equation models by satisfying only the conservation of mass (continuity) equation. The second class considered here consists of the so-called linear models, which purport to satisfy both mass and momentum balances. A review of the published literature on these models over the past few decades was performed. Though diagnostic models use a variety of approaches, they tend to fall into a relatively few well-defined categories. Linear models, on the other hand, follow a more uniform methodology, though they differ in detail. The discussion considers the theoretical underpinnings of each category of the diagnostic models, and the linear models, in order to assess the advantages and disadvantages of each. It is concluded that diagnostic models are the better suited of the two for predicting the atmospheric dispersion of hazardous materials in emergency response scenarios, as the linear models are only able to accommodate gently-sloping terrain, and are predicated on several simplifying approximations which can be difficult to justify a priori. Of the various approaches used in diagnostic modeling, that based on the calculus of variations appears to be the most objective, in that it introduces the fewest number of arbitrary parameters. The strengths and weaknesses of models in this category, as they relate to the activities of Sandia's Nuclear Emergency Support Team (NEST), are further highlighted.
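As a point of reference, the calculus-of-variations approach favored here is commonly formulated as minimizing a weighted departure from the interpolated observed wind field subject to the continuity constraint; the functional below is a representative textbook (Sasaki-type) form, not necessarily the exact one used by any specific model reviewed in the report:

\[ J(u,v,w,\lambda) = \int_V \left[ \alpha_1^2 (u-u_0)^2 + \alpha_1^2 (v-v_0)^2 + \alpha_2^2 (w-w_0)^2 + \lambda \left( \frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} + \frac{\partial w}{\partial z} \right) \right] dV, \]

where \((u_0, v_0, w_0)\) is the interpolated observed wind, \(\lambda\) is a Lagrange multiplier, and \(\alpha_1, \alpha_2\) weight horizontal versus vertical adjustments. Setting the first variation to zero gives, for example, \(u = u_0 + (1/2\alpha_1^2)\,\partial\lambda/\partial x\), and substituting the adjusted components into the continuity equation yields a single elliptic equation for \(\lambda\), which is what a mass-consistent model actually solves.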
The purpose of this biological assessment is to review the proposed continued operation of Sandia National Laboratories, California (SNL/CA) in sufficient detail to determine to what extent the proposed action may affect the species listed below. This assessment is prepared in accordance with Section 7 of the Endangered Species Act [16 U.S.C. 1536 (c)].
We propose a novel algorithm based on Principal Component Analysis (PCA). First, we present an interesting approximation of PCA using Gram-Schmidt orthonormalization. Next, we combine our approximation with the kernel functions from Support Vector Machines (SVMs) to provide a nonlinear generalization of PCA. After benchmarking our algorithm in the linear case, we explore its use in both the linear and nonlinear cases. We include applications to face data analysis, handwritten digit recognition, and fluid flow.
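The Gram-Schmidt approximation itself is not spelled out in the abstract, so the sketch below shows only the standard kernel-PCA construction on which the nonlinear generalization builds (double-center the kernel matrix, eigendecompose, project); the RBF kernel and its parameters are illustrative choices.

```python
import numpy as np

def rbf(x, y, gamma=0.5):
    """Illustrative Gaussian (RBF) kernel."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_pca(X, kernel=rbf, n_components=2):
    """Standard kernel PCA: double-center the kernel matrix, eigendecompose,
    and project the training data onto the leading components."""
    n = len(X)
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one          # centering in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))  # projected coordinates

X = np.random.default_rng(0).normal(size=(50, 3))
print(kernel_pca(X).shape)   # (50, 2)
```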
The long-range objective of this study was to develop chemically assisted technologies for removing heels from tanks. In FY 01, the first two steps toward this objective were taken: (1) catalogue the occurrence and nature of tank heels and assess which materials are available for study, and (2) develop methods for synthesizing non-radioactive surrogate heel materials for use in testing potential removal technologies. The chief finding of Task 1 was that the existence of ''heels'' depends on the definition used. Hard materials that would be almost impossible to remove by sluicing are all but absent from the records of both Savannah River and Hanford. Historical usage suggests that the term ''heel'' may also apply to chunky, granular, or semi-solid pasty accumulations. These materials are documented and may also be difficult to remove by conventional sluicing technologies. Such heels may be composed of normal sludge components, dominantly iron and aluminum hydroxides, or they may result from added materials which were not part of the normal fuel reprocessing operations: Portland cement, diatomaceous earth, sand and soil, and spent zeolite ion exchange ''resins''. The occurrence and chemistry of the most notable ''heel'', that of the zeolite mass in Tank 19F at Savannah River, are reviewed in some detail. Secondly, no clear correlation was found between high tank temperatures and difficulties encountered in removing materials from a tank at a later date; nor did the sludges from these tanks give any indication of being particularly solid. Experimental studies to develop synthetic heel materials were carried out using a number of different approaches. For normal sludge materials, settling, even when assisted by a centrifuge, proved ineffective. The same result was obtained from drying sludge samples. Even exposing sludges to a molten salt melt at 233 C only produced a fine powder, rather than a resilient ceramic that resisted disaggregation. A cohesive material, however, was produced by wicking the pore fluid out of a sludge gel (into packed diatomaceous earth), while simultaneously applying pressure to compact the sludge as it dehydrated. Osmotic gradients could provide the same function as the capillary forces provided by the diatomaceous earth sorbent placed in contact with the sludge. Tests on the anomalous materials added to the tanks all indicated potential problems. Hard granules, and maybe chunks, may be encountered where Portland cement was added to a tank. Sand, spent zeolite resin, and diatomaceous earth will all react with the tank fluids to produce a sodalite/cancrinite material. The degree of reaction determines whether the grains become cemented together. SRS activities showed that heels formed when spent zeolites were added to tanks can be readily dislodged, and it is expected that heels from sand would possess equal or less cohesion. Diatomaceous earth may form more resilient crusts or masses. To summarize, the existence of ''hard'' heels has yet to be documented. A broader definition suggests inclusion of poorly cohesive cancrinite-cemented masses and dense paste-like accumulations of abnormally compacted ''normal'' sludges. Chemical treatments to remove these materials must focus on agents that are active against aluminosilicates and hydrous oxides of iron and aluminum. Exploiting the high pore-water content of these materials may provide a second avenue for dislodging such accumulations.
Techniques were developed to produce synthetic sludges on which various removal technologies could be tried.
As part of the U.S. Department of Energy's Wind Partnerships for Advanced Component Technologies (WindPACT) program, Global Energy Concepts LLC (GEC) is performing a study concerning innovations in materials, processes, and structural configurations for application to wind turbine blades in the multi-megawatt range. The project team for this work includes experts in all areas of wind turbine blade design, analysis, manufacture, and testing. Constraints to cost-effective scaling-up of the current commercial blade designs and manufacturing methods are identified, including self-gravity loads, transportation, and environmental considerations. A trade-off study is performed to evaluate the incremental changes in blade cost, weight, and stiffness for a wide range of composite materials, fabric types, and manufacturing processes. Fiberglass/carbon fiber hybrid blades are identified as having a promising combination of cost, weight, stiffness, and fatigue resistance. Vacuum-assisted resin transfer molding, resin film infusion, and pre-impregnated materials are identified as having benefits in reduced volatile emissions, higher fiber content, and improved laminate quality relative to the baseline wet lay-up process. Alternative structural designs are identified, including jointed configurations to facilitate transportation. Based on the results to date, recommendations are made for further evaluation and testing under this study to verify the predicted material and structural performance.
Curie Point pyrolysis-gas chromatography was investigated for use as a tool for characterization of aged ammonium perchlorate based composite propellants (1). Successful application of the technique will support the surveillance program for the Explosives Materials and Subsystems Department (1). Propellant samples were prepared by separating the propellant into reacted (oxidized) and unreacted zones. The experimental design included the determination of system reliability, followed by reproducibility, sample preparation, and analysis of pyrolysis products. Polystyrene was used to verify the reliability of the system and showed good reproducibility. Application of the technique showed high variation in the data. Modifications to sample preparation did not enhance the reproducibility. It was determined that the high concentration of ammonium perchlorate in the propellant matrix was compromising the repeatability of the analysis.
The WIPP Case Study describes the compliance monitoring program, record keeping requirements, and passive institutional controls that are used to help ensure the Waste Isolation Pilot Plant (WIPP) will safely contain radioactive waste and indicate the dangers and location of the wastes. The radioactive components in the waste are regulated by the U.S. Environmental Protection Agency (EPA) while the hazardous components in the waste are regulated by the New Mexico Environment Department (NMED). This paper addresses monitoring relating to radionuclide containment performance, passive institutional controls, and record keeping over a 10,000-year time frame. Monitoring relating to the hazardous components and the associated regulator are not addressed in this paper. The WIPP containment performance is mandated by release limits set by regulation. Regulations also require the radioactive waste containment performance of the WIPP to be predicted by a ''Performance Assessment.'' The EPA did not base the acceptance of the WIPP solely on predicted containment but included additional assurance measures. One such assurance measure is monitoring, which may be defined as the on-going measurement of conditions in and around the repository. This case study describes the evolution of the WIPP monitoring program as the WIPP project progressed through the planning, site characterization, regulatory promulgation, and eventual operational stages that spanned a period of over 25 years. Included are discussions of the regulatory requirements for monitoring, selection of monitoring parameters, trigger values used to identify unexpected conditions, assessment of monitoring data against the trigger values, and plans for post-closure monitoring. The EPA established the requirements for Passive Institutional Controls (PICs) for disposal sites. The requirements state that a disposal site must be designated by the most permanent markers, records, and other passive institutional controls practicable to indicate the dangers of the wastes and their location. The PIC Task Force assessed the effectiveness of PICs in deterring inadvertent human intrusion and developed a conceptual design for permanently marking the Waste Isolation Pilot Plant (WIPP), establishing records, and identifying other practicable controls to indicate the dangers of the wastes and their location. The marking system should provide information regarding the location, design, contents, and hazards associated with WIPP. This paper discusses these controls, including markers, records, archives, and government ownership and land-use restrictions.
This document contains an updated list of common acronyms, initialisms, and abbreviations used at Sandia. It will be published in an electronic format only. It can be retrieved from HTTPS://wfsprod01.sandia.gov/groups/srn-uscitizens/documents/document/wfs048643.pdf.
This report summarizes an investigation of the use of high-gain Photo-Conductive Semiconductor Switch (PCSS) technology for a deployable impulse source. This includes a discussion of viability, packaging, and antennas. High gain GaAs PCSS-based designs offer potential advantages in terms of compactness, repetition rate, and cost.
The mathematical description of acoustic wave propagation within a time- and space-varying, and moving, linear viscous fluid is formulated as a system of coupled linear equations. This system is rigorously developed from fundamental principles of continuum mechanics (conservation of mass, balance of linear and angular momentum, balance of entropy) and various constitutive relations (for stress, entropy production, and entropy conduction) by linearizing all expressions with respect to the small-amplitude acoustic wavefield variables. A significant simplification arises if the fluid medium is neither viscous nor heat conducting (i.e., an ideal fluid). In this case the mathematical system can be reduced to a set of five, coupled, first-order partial differential equations. Coefficients in the systems depend on various mechanical and thermodynamic properties of the ambient medium that supports acoustic wave propagation. These material properties cannot all be arbitrarily specified, but must satisfy another system of nonlinear expressions characterizing the dynamic behavior of the background medium. Dramatic simplifications in both systems occur if the ambient medium is simultaneously adiabatic and stationary.
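As an illustration of the simplest limit mentioned at the end of the abstract, for a homogeneous, stationary, adiabatic ambient fluid the ideal-fluid system collapses to the familiar linear acoustics equations (this special case is my own shorthand, not the report's general moving-medium system):

\[ \frac{\partial p'}{\partial t} + \rho_0 c_0^2\, \nabla \cdot \mathbf{u}' = 0, \qquad \rho_0 \frac{\partial \mathbf{u}'}{\partial t} + \nabla p' = 0, \]

which combine to give the scalar wave equation \(\partial_t^2 p' = c_0^2 \nabla^2 p'\) for the acoustic pressure perturbation.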
Borehole radar systems can provide essential subsurface structural information for environmental evaluation, geotechnical analysis, or energy exploration. Sandia developed a prototype continuous-wave Borehole Radar (BHR) in 1996, and development of a practical tool has been continuing at a Russian institute under a Sandia contract. The BHR field experiments, which were planned for the summer of 2001 in Russia, provided a unique opportunity to evaluate the latest Sandia algorithms with actual field data. A new three-dimensional code was developed to enable the analysis of BHR data on modest-sized desktop workstations. The code is based on the staggered grid, finite difference technique, and eliminates 55% of the massive storage associated with solving the system of finite-difference linear equations. The code was used to forward-model the Russian site geometry and placement of artificial targets to anticipate any problems that might arise when the data was received. Technical software and equipment problems in the Russian field tests, conducted in August 2001, invalidated all but one of the data sets. However, more field tests with improved equipment and software are planned for 2002, and analysis of that data will be presented in a future report.
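The three-dimensional Sandia code is not reproduced here; the one-dimensional Python sketch below only illustrates the staggered-grid (Yee-type) finite-difference update, in which the two field components live on grids offset by half a cell and are advanced on alternating half steps.

```python
import numpy as np

def fdtd_1d(nx=400, nt=800, courant=0.5):
    """1-D staggered-grid finite-difference time-domain sketch: the electric and
    magnetic fields sit on grids offset by half a cell and leapfrog in time."""
    ez = np.zeros(nx)          # E-field at integer nodes
    hy = np.zeros(nx - 1)      # H-field at half-integer nodes
    for n in range(nt):
        hy += courant * np.diff(ez)              # update H from the spatial difference of E
        ez[1:-1] += courant * np.diff(hy)        # update E from the spatial difference of H
        ez[nx // 2] += np.exp(-((n - 30.0) / 10.0) ** 2)   # soft Gaussian source
    return ez

print(np.abs(fdtd_1d()).max())
```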
The Gulf of Mexico (GoM) is the most active deepwater region in the world and provides some of the greatest challenges in scope and opportunity for the oil and gas industry. The complex geologic settings and significant water and reservoir depths necessitate high development costs, in addition to requiring innovative technology. The investment costs are substantial: because of the extreme water depths (up to 8000 feet) and considerable reservoir depths (to 30,000 feet below mudline), the cost of drilling a single well can be upwards of 50 to 100 million dollars. Central, therefore, to successful economic exploitation are developments with a minimum number of wells combined with a well service lifetime of twenty to thirty years. Many of the wells that are planned for the most significant developments will penetrate thick salt formations, and the combined drilling costs for these fields are estimated in the tens of billions of dollars. In May 2001, Sandia National Laboratories initiated a Joint Industry Project focused on the identification, quantification, and mitigation of potential well integrity issues associated with sub-salt and near-salt deepwater GoM reservoirs. The project is jointly funded by the DOE (Natural Gas and Oil Technology Partnership) and nine oil companies (BHP Billiton Petroleum, BP, ChevronTexaco, Conoco, ExxonMobil, Halliburton, Kerr-McGee, Phillips Petroleum, and Shell). This report provides an assessment of the state of the art of salt mechanics and identifies potential well integrity issues relevant to deepwater GoM field developments. Salt deformation is discussed and a deformation mechanism map is provided for salt. A bounding steady-state strain rate contour map is constructed for deepwater GoM field developments, and the critical issue of constraint in the subsurface, and the resultant necessity for numerical analyses, is discussed.
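For orientation, steady-state salt creep is commonly described in the rock mechanics literature by a power-law relation of the form

\[ \dot{\varepsilon}_{ss} = A\,\sigma^{\,n} \exp\!\left(-\frac{Q}{RT}\right), \]

where \(\sigma\) is the deviatoric stress, \(n\) is typically on the order of 3 to 5 for salt, \(Q\) is an activation energy, \(R\) is the gas constant, and \(T\) is absolute temperature. This generic form is cited only to indicate why strain-rate contour maps depend so strongly on stress and temperature; the specific constitutive model used in the report may differ.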
This document describes the 2002 SNL Accelerated Strategic Computing Initiative (ASCI) Applications Software Quality Engineering (SQE) Assessment and the assessment results. The primary purpose of the assessment was to establish the current state of software engineering practices within the SNL ASCI Applications Program.
The Materials Chemistry Department 1846 has developed a lab-scale chem-prep process for the synthesis of PNZT 95/5, a ferroelectric material that is used in neutron generator power supplies. This process (Sandia Process, or SP) has been successfully transferred to and scaled by Department 14192 (Ceramics and Glass Department), (Transferred Sandia Process, or TSP), to meet the future supply needs of Sandia for its neutron generator production responsibilities. In going from the development-size SP batch (1.6 kg/batch) to the production-scale TSP powder batch size (10 kg/batch), it was important that it be determined if the scaling process caused any ''performance-critical'' changes in the PNZT 95/5 being produced. One area where a difference was found was in the particle size distributions of the calcined PNZT powders. Documented in this SAND report are the results of an experimental study to determine the origin of the differences in the particle size distribution of the SP and TSP powders.