For several years Phillips Petroleum Company has been waterflooding portions of the Ekofisk Field reservoir for purposes of enhanced oil recovery. Boreholes drilled in waterflooded portions of the reservoir have encountered poor core recoveries and highly fractured rock (poor core recoveries and highly fractured zones were not uncommon in the Ekofisk reservoir before waterflooding, however). Results of laboratory compression tests designed to simulate production-related compaction and subsequent waterflooding indicate that injection pressures currently used to inject seawater into the reservoir are high enough to induce shear failure in high porosity reservoir chalks. A model of chalk deformation explains brittle failure of chalk that has been subjected to stresses well in excess of yield stress.
The Sandia National Laboratories Pre-Tiger Team Self-Assessment Report contains an introduction that describes the three sites, in Albuquerque, New Mexico; Kauai, Hawaii; and Tonopah, Nevada, and the activities conducted at each. The self-assessment was performed from October 1990 through December 1990. The paper discusses key findings and root causes associated with problem areas; the environmental protection assessment with respect to the Clean Air Act, the Clean Water Act, the Comprehensive Environmental Response, Compensation, and Liability Act and its Superfund amendments, the Resource Conservation and Recovery Act, and other regulatory documents; the safety and health assessment with respect to organization administration, quality assurance, maintenance, training, emergency preparedness, nuclear criticality safety, security/safety interface, transportation, radiation protection, occupational safety, and associated regulations; and the management practices assessment. 5 figs. (MHB)
Large two- and three-dimensional simulations of shock wave physics problems constitute a major expense in ongoing research efforts at Sandia National Laboratories. Massively parallel computing may provide a solution. A simplified version of the production hydrocode CTH, in current use at Sandia National Laboratories, has been successfully developed for the Connection Machine. The parallel version, named PCTH, solves problems in multi-fluid shock wave physics. The development of the Connection Machine code is described and initial performance statistics are presented. These are compared with similar results for the CRAY Y-MP and nCUBE2. 7 refs., 3 figs., 1 tab.
The durability to thermal fatigue of a carbon-carbon composite (Aerolor A05) X-point divertor dump plate was evaluated for the Joint European Torus (JET) at Sandia's Plasma Material Test Facility. Of primary interest was the effect of thermal cycling on the carbon-carbon threads of the bolted attachment scheme for the Aerolor X-point divertor. This report describes the testing performed at the Ion Beam Test System and the test and analysis results obtained in support of this effort. After completing 1000 thermal cycles, during which the surface temperature of the 8 cm by 8 cm by 2.3 cm Aerolor tile reached 2200°C in a 10 s, 500 W/cm² pulse, the tile survived without any noticeable damage. Post-test inspection of the carbon-carbon threads showed only minor wear and no signs of significant damage. Thermal modeling of the test article using the ABAQUS finite element code agreed very well with experimental results. The thermal creep experienced by the M-12 stainless steel bolt during ion beam testing is not expected to occur during normal operations in JET because of the longer cycle times between thermal events. Finite element analysis indicates that the longer cycle times at JET will keep the peak temperatures in the vicinity of the bolt and bolt insert below the level at which thermal creep would occur. An additional margin of safety could be obtained by using Inconel or Nimonic fasteners. Overall, the performance of the bolted divertor design under thermal fatigue was acceptable. 12 figs., 2 tabs.
An electromagnetic measurement system (EMMS) was designed and constructed to provide essential data on the electromagnetic compatibility (EMC) of modern weapons carried on military aircraft. This system measures the equivalent plane wave electric and magnetic fields impinging on a weapon's exterior surface arising from electromagnetic radiators on board the host or nearby aircraft. To relate practical sensor responses to specified equivalent plane wave EMC field levels, a modern weapon shape, instrumented with local skin-current sensors, was used as the primary sensor element; at lower frequencies it responds as a simple dipole antenna. At higher frequencies, the locally induced currents can be related to the incident fields by simple scattering theory. Finally, an error analysis that catalogs all measurement path elements was performed to provide an error bound on the equivalent free-field electric field measurements reported by the EMMS. 6 refs., 9 figs.
A Low Altitude High Speed Cargo (LAHSC) parachute is being developed for deployment at velocities up to 250 knots at 300 ft altitude. The LAHSC parachute will decelerate a load and turn it over to a 40 to 60 ft/sec vertical velocity at first vertical, at approximately 30 ft AGL. The acceleration limit is 5 g's. Cargo extraction by the main chute will be necessary. A single parachute will be used for a 7500 lb load, and clusters will be used for larger loads. The 64-gore, 70-ft-dia parachute has a ring-slot/solid construction with a flare at the skirt to aid inflation. This paper describes the parachute, the design process, and the testing to date. Model parachutes have been tested in wind tunnels and in free flight. A single full-scale parachute has been tested at low speeds with conventional load extraction, and with a vertical trajectory at deployment. 5 refs., 18 figs., 3 tabs.
High-speed water entry is a very complex, dynamic process. As a first attempt at modeling the process, a numerical solution was developed at Sandia National Laboratories for predicting the forces and moments acting on a body with a steady supercavity, that is, a cavity that extends beyond the base of the body. The solution is limited to supercavities on slender, axisymmetric bodies at small angles of attack. Limited data were available with which to benchmark the axial force predictions at zero angle of attack, and even fewer data were available with which to benchmark the pitching moment and normal force predictions at nonzero angles of attack. A water tunnel test was conducted to obtain force and moment data on a slender shape. This test produced limited data because of waterproofing problems with the balance. A new balance was designed and a second water tunnel test was conducted at Tracor Hydronautics, Inc. This paper describes the numerical solution, the experimental equipment and test procedures, and the results of the second test. 8 refs., 11 figs.
Micromachining is a rapidly growing field that allows the fabrication of extremely small sensors and actuators using many of the techniques common to microelectronics. Two methods are commonly used: bulk micromachining, which involves the sculpting of single-crystal silicon, and surface micromachining, which uses etched thin films that have been deposited on the substrate. Sensors are the primary commercial application, but microactuators are being actively researched at several laboratories and universities. Sandia National Laboratories is pursuing applications of both bulk and surface micromachining for silicon microsensors, microactuators, and high-performance silicon packages for microelectronics. 3 figs.
This work extends two previous analytical studies of a technique for generating high-frequency, high-amplitude vibration environments. These environments are created using a device, attached to a common vibration exciter, that permits multiple metal-on-metal impacts to drive a test surface. The earlier analytical studies predicted that test environments with energy content exceeding 10 kHz could be achieved using sinusoidal and random shaker excitations, and that chaotic vibrations yielding random-like test environments could be generated from sinusoidal inputs. In this study, a much simplified version of the proposed system was fabricated and tested in the laboratory. Experimental measurements demonstrate that even this simplified system, utilizing a single impacting object, can generate environments on the test surface with significant frequency content in excess of 40 kHz. Results for sinusoidal shaker inputs tuned to create chaotic impact response are shown along with the responses due to random vibration shaker inputs. The experiments and results are discussed. 4 refs., 5 figs.
Sandia National Laboratories has used pool fires for over thirty years to subject military components, weapon mockups, and hazardous material shipping containers to postulated transportation accident environments. Most of the tests have been performed in either open pools or wind-shielded facilities with little control of visible smoke emissions. Because of increased sensitivity to environmental issues, and because wind has the largest uncontrollable effect on the thermal environment in open pool fires, enclosed test facilities with reduced visible emissions have been developed. The facilities are basically water-cooled enclosures fitted with controlled air supply systems and high-temperature afterburners. The purpose of this paper is to present our experience with both open and enclosed fires. The first section reviews the fire test facilities. The following section presents the mathematical model behind our approach to characterizing the fire environment. The last section compares data from open and enclosed fires.
Materials science research programs at Sandia National Laboratories are briefly presented. Significant accomplishments include: preparation of Tl superconductors under equilibrium conditions, development of a force-feedback sensor for the interfacial force microscope, a predictive model of hydrogen interactions in silicon dioxide on silicon, layer-by-layer sputtering of Si (001), oscillatory As₄ surface reaction rates during molecular beam epitaxy of AlAs, GaAs and InAs, the effects of interfacial strain on the band offsets of lattice-matched III-V semiconductors, a new mechanism for surface diffusion, solid solution effects in Tl-containing superconductors, record-high superconducting transitions for organic materials, atomic vibrations in boron carbides, and a method for studying radical/surface reactions in chemical vapor deposition (CVD).
Recent government actions to eliminate chlorofluorocarbons (CFCs) and chlorinated hydrocarbons (CHCs) from the industrial environment require the evaluation of new cleaning solvents and processes. High-reliability printed wiring board (PWB) assemblies require cleaning to remove process materials that could lead to corrosion or degradation of the electrical performance of the boards. In the past, CFCs have been used extensively for PWB cleaning. Concerns about CFC emissions and their effect on ozone depletion in the atmosphere, greater demands on cleaning systems, and the availability of alternative cleaning methods are requiring manufacturers of electronic assemblies to reconsider their choice of cleaning methods. We will review some of the presently available cleaning solvents and discuss the results of our work using a terpene-based cleaner. 5 refs., 4 figs.
The Recirculating Linear Accelerator (RLA) is returning to operation with a new electron beam injector and a modified accelerating cavity. Upon completion of our experimental program the RLA will capture the injected beam on an IFR guiding plasma channel in either a spiral or a closed racetrack drift tube. The relativistic beam will be efficiently recirculated for up to four passes through two or more accelerating cavities, in phase with the ringing cavity voltage waveforms, and thereby increased in energy to 10 MeV before being extracted. The inductively isolated four-stage injector was designed to produce beam parameters of 4 MeV, 10--20 kA, and 40--55 ns FWHM. The three-line radial cavity is being modified to improve the 1-MV accelerating voltage pulse shape while an advanced cavity design study is in progress. The actual versus predicted pulsed-power performance of the RLA injector and cavity and the associated driving hardware will be discussed in this paper.
Last year at the HP82000 Users Group Meeting, Sandia National Laboratories gave a presentation on IDDQ testing. This year, we will present some advances on this testing, including DUT board fixturing, external DC PMU measurement, and automatic IDD-All circuit calibration. This paper is geared more toward implementation than theory, with results presented from Sandia tests. After a brief summary of IDDQ theory and testing concepts, we describe how the break (hold-state) vector and data formatting present a test vector generation concern for the HP82000. We then discuss fixturing of the DUT board for both types of IDDQ measurement, and how the continuity test and test vector generation must be taken into account. Results of a test including continuity, IDD-All, and IDDQ value measurements will be shown. Next, measurement of low current using an external PMU is discussed, including noise considerations, implementation, and some test results showing nA-range measurements. We then present a method for automatic calibration of the IDD-All analog comparator circuit using RM BASIC on the HP82000, with implementation and measurement results. Finally, future directions for research in this area will be explored. 14 refs., 16 figs.
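As a hedged illustration of the underlying test concept (not the HP82000 procedure described in the paper), the Python sketch below screens a device by measuring the quiescent supply current at a set of hold-state vectors and comparing each reading against a defect threshold; the function names and the 1 μA limit are assumptions made for the example.

# Hypothetical IDDQ screening sketch: after each break (hold-state) vector,
# the settled quiescent supply current is measured and compared to a limit.

def iddq_screen(measure_idd, break_vectors, limit_amps=1e-6):
    """measure_idd: callable returning settled supply current (A) for a vector.
    break_vectors: vector identifiers at which the device is held static.
    limit_amps: pass/fail threshold; real limits come from device characterization."""
    results = {}
    for vec in break_vectors:
        idd = measure_idd(vec)                      # static supply current at this vector
        results[vec] = (idd, idd <= limit_amps)     # True means PASS
    return results

if __name__ == "__main__":
    import random
    # stand-in measurement: nA-range readings for good parts, uA-range for a defect
    fake_measure = lambda vec: random.choice([5e-9, 20e-9, 3e-6])
    for vec, (idd, ok) in iddq_screen(fake_measure, ["V12", "V57", "V203"]).items():
        print(f"{vec}: {idd:.1e} A  {'PASS' if ok else 'FAIL'}")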
Proceedings - IEEE International Conference on Robotics and Automation
Hwang, Yong K.
Path planning among movable obstacles is a practical problem that is in need of a solution. An efficient heuristic algorithm is presented that uses a generate-and-test paradigm: a good candidate path is hypothesized by a global planner and subsequently verified by a local planner. In the process of formalizing the problem, a technique for modeling object interactions through contact is presented. The algorithm has been tested on a variety of examples, and was able to generate solutions within 10 s on a 17-MIPS Sun Sparc workstation.
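The control structure of such a generate-and-test planner can be sketched as follows; the global_candidate_paths and locally_feasible routines are hypothetical placeholders for the paper's global and local planners, and the search budget is an arbitrary assumption.

# Minimal generate-and-test planning loop: hypothesize candidate paths globally,
# then verify (and possibly repair) each candidate with a local planner.

def plan_among_movable_obstacles(start, goal, global_candidate_paths,
                                 locally_feasible, max_candidates=50):
    for i, path in enumerate(global_candidate_paths(start, goal)):
        if i >= max_candidates:
            break                                   # stop after a fixed search budget
        ok, repaired_path = locally_feasible(path)  # local planner may push movable obstacles
        if ok:
            return repaired_path                    # first verified candidate wins
    return None                                     # no feasible path found within budget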
CEPXS/ONELD is the only discrete ordinates code capable of modeling the fully coupled electron-photon cascade at high energies. Quantities related to the particle flux, such as dose and charge deposition, can readily be obtained. This deterministic code is much faster than comparable Monte Carlo codes. The unique adjoint transport capability of CEPXS/ONELD also enables response functions to be readily calculated. Version 2.0 of the CEPXS/ONELD code package has been designed to allow users who are not experts in discrete ordinates methods to fully exploit the code's capabilities. 14 refs., 15 figs.
With the current trends toward miniaturization, high performance, high quality, and cost competitiveness, the electrodeposition process has become an important manufacturing technology in many new microelectronic applications. Gold electrodeposition plays an increasing role in processes that require this noble metal. Added to these trends is the continuing and increasing emphasis on manufacturing processes that are less damaging to the environment and potentially less hazardous to the operator and to personnel in the vicinity of the operation. The present standard gold plating solutions are based on cyanide salts and are considered acutely hazardous. The trend away from their use is gaining momentum as new non-hazardous gold plating solutions, and manufacturing processes making use of them, are developed. 2 refs.
A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This report provides detailed guidance on input and output procedures for the computer codes recommended for use in the methodology. Seven sample problems are provided for various aspects of a performance assessment analysis of a simple hypothetical conceptual model. When combined, these sample problems demonstrate how the methodology is used to produce a dose history for the site under normal conditions and to analyze an intruder scenario. 20 refs., 26 figs., 4 tabs.
Results of calculations performed with MELCOR and HECTR in support of the NUREG-1150 study are presented in this report. The analyses examined a wide range of issues and included integral calculations covering entire accident sequences, as well as calculations addressing specific issues that could affect several accident sequences. The results of the analyses for Grand Gulf, Peach Bottom, LaSalle, and Sequoyah are described, and the major conclusions are summarized.
Electromagnetic coupling to electronic components or subsystems is a concern in modern system design. Undesired coupling can cause interference or, in the extreme, system upset. Characterizing the coupling is an important step toward understanding the limitations on system performance. Often the approach taken is to shield the electronic equipment inside some kind of enclosure. However, there are usually inadvertent cracks or bowing at mechanical interfaces, and these gaps act as slot apertures. An equivalent antenna/local transmission line model for narrow slot apertures with depth, including losses, has been developed. It may be applied to tortuous paths and hence may be used to model practical situations. This model was previously verified by measuring the coupling through narrow slot apertures of varying width and depth. Those measurements were performed for brass slots radiating into a half-space, and the results were in good agreement with the model of Warne and Chen. The models, as well as the measurements, showed that for very narrow slots the wall loss becomes dominant; the inclusion of loss is therefore important in making realistic coupling estimates for practical configurations. This paper presents results showing the effects of varying conductivity and surface preparation for half-space coupling, as well as different loadings of the narrow slot apertures. The coupling through narrow slot apertures having depth was measured for a variety of resonant cavity loadings. The loadings were chosen such that the cavity resonant frequencies were above, near, and below the resonant peak of the half-space coupling curve. Measurements were made in the 2--4 GHz band with vertical polarization. 3 refs., 6 figs., 1 tab.
The solidification behavior of Custom Age 625 PLUS® is examined using an integrated analytical approach. Like that of its predecessors, Alloys 625 and 718, the solidification behavior of this new alloy is dominated by the presence and segregation of Nb, which gives rise to a γ/Laves terminal solidification constituent. 8 refs., 5 figs., 2 tabs.
Iterative, annual performance-assessment calculations are being performed for the Waste Isolation Pilot Plant (WIPP), a planned underground repository in southeastern New Mexico, USA, for the disposal of transuranic waste. The performance-assessment calculations estimate the long-term radionuclide releases from the disposal system to the accessible environment. Because direct experimental data in some areas are presently of insufficient quantity to form the basis for the required distributions, expert judgment was used to estimate the concentrations of specific radionuclides in a brine exiting a repository room or drift as it migrates up an intruding borehole, and also the distribution coefficients that describe the retardation of radionuclides in the overlying Culebra Dolomite. The variables representing these concentrations and coefficients were shown by 1990 sensitivity analyses to be among the parameters making the greatest contribution to the uncertainty in WIPP performance-assessment predictions. Utilizing available information, the experts (one panel addressed concentrations and a second panel addressed retardation) developed an understanding of the problem and were formally elicited to obtain probability distributions that characterize the uncertainty in fixed, but unknown, quantities. The probability distributions developed by the experts are being incorporated into the 1991 performance-assessment calculations. 16 refs., 4 tabs.
Segmentation is the process of separating objects of interest from their background or from other objects in an image. Without a suitable segmentation scheme, it is very difficult to detect contraband in X-ray images. In this paper, a Probabilistic Relaxation Labeling (PRL) segmentation scheme is presented and compared with other segmentation methods. PRL segmentation is an iterative algorithm that labels each pixel in an image by cooperative use of two information sources: the pixel's own label probability and the degree of support for that probability provided by the neighboring pixels. The practical implementation and results of PRL segmentation on X-ray baggage images are also discussed and compared with other segmentation methods. 13 refs., 12 figs.
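A minimal sketch of a relaxation-labeling update of this kind is shown below for the two-label (object/background) case; it uses a generic textbook-style neighbor-support rule rather than the specific compatibilities used in the paper.

import numpy as np

# Each pixel's object probability is repeatedly adjusted by the support of its
# 4-connected neighbors, then renormalized so the two label probabilities sum to 1.

def relax_labels(p_obj, compatibility=1.0, n_iter=10):
    """p_obj: 2-D array of initial object probabilities in [0, 1]."""
    p = p_obj.astype(float).copy()
    for _ in range(n_iter):
        padded = np.pad(p, 1, mode="edge")                      # edge-padded neighborhood
        nbr = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
               padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0      # mean neighbor probability
        q_obj = compatibility * (2.0 * nbr - 1.0)               # support in [-1, 1]
        num_obj = p * (1.0 + q_obj)                             # raise where neighbors agree
        num_bg = (1.0 - p) * (1.0 - q_obj)                      # background gets the complement
        p = num_obj / (num_obj + num_bg + 1e-12)                # per-pixel renormalization
    return p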
This investigation describes how a statistically designed experiment can be used to characterize the relationship between a fundamental material property, such as the glass transition temperature, Tg, and various processing parameters, e.g., composition, cure time, and temperature. To illustrate, formulation weighing errors can have a dramatic effect on thermal, mechanical, and electrical material properties. The glass transition temperature was selected for monitoring because it represents the material's state of cure and is relatively easy to determine. Specifically, EPON 828 systems cured with diethanolamine (DEA) and with Shell Z were investigated, plus a Shell Z formulation that employed aluminum oxide as a filler. This investigation showed that Tg changed very little with cure temperature in the DEA system compared to the Shell Z system, and the latter appeared to display synergistic effects not seen in the DEA system. In the filled formulation, loading level had very little effect on Tg. The significance of this study is that the relationship between Tg and the composition and processing factors can be used to help diagnose the cause of misprocessed material. 2 refs., 11 figs., 3 tabs.
The detonability of hydrogen-air-diluent mixtures was investigated experimentally in the 0.43 m diameter, 13.1 m long Heated Detonation Tube (HDT) for the effects of variations in hydrogen and diluent concentration, initial pressure, and initial temperature. The data were correlated using a ZND chemical kinetics model. The detonation limits in the HDT were obtained experimentally for lean and rich hydrogen-air mixtures and for stoichiometric hydrogen-air-steam mixtures. The addition of a diluent, such as steam or carbon dioxide, increases the detonation cell width for all mixtures. In general, an increase in the initial pressure or temperature produces a decrease in the cell width. In the HDT, the detonable range of hydrogen in a hydrogen-air mixture initially at 1 atm pressure is between 11.6 percent and 74.9 percent for mixtures at 20°C, and between 9.4 percent and 76.9 percent for mixtures at 100°C. The detonation limit is between 38.8 percent and 40.5 percent steam for a stoichiometric hydrogen-air-steam mixture initially at 100°C and 1 atm. The detonation limit is between 29.6 percent and 31.9 percent steam for a stoichiometric hydrogen-air-steam mixture for the case where hydrogen and steam are added to air initially at 20°C and 1 atm, resulting in a final predetonation mixture temperature and pressure of approximately 100°C and 2.6 atm, respectively.
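For context on these limits, the short calculation below checks the stoichiometric hydrogen fraction in dry air (about 29.5 vol%), assuming 20.9 vol% O2 in air; the arithmetic is illustrative only and is not taken from the report.

# 2 H2 + O2 -> 2 H2O, so a stoichiometric mixture needs two moles of H2 per mole of O2.
o2_in_air = 0.209                        # assumed mole fraction of O2 in dry air
h2_per_mol_air = 2.0 * o2_in_air
x_h2 = h2_per_mol_air / (1.0 + h2_per_mol_air)
print(f"stoichiometric H2 fraction in air ~ {100 * x_h2:.1f} vol%")   # ~29.5%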
Currently, over 90% of the world's large-scale solar electric energy is generated with concentrating solar thermal power plants. Such plants have the potential to meet many of the world's future energy needs. Research efforts are generally focused on generating electricity, though a variety of other applications are being pursued. Today, the technology for using solar thermal energy is well developed, cost competitive, and in many cases, ready for widespread application. The current state of each of the solar thermal technologies and their applications is reviewed, and recommendations for increasing their use are presented. The technologies reviewed in detail are: parabolic trough systems, central tower systems, and parabolic dish systems. 20 refs., 1 fig., 1 tab.
TRANSNET is a compilation of risk and systems analysis codes, routing and cost models and related data that address hazardous and radioactive materials transportation. TRANSNET is the acronym assigned to this system of models and associated data which reside on a dedicated MicroVAX 3800. After obtaining a password, users may access TRANSNET with a modem-equipped personal computer. TRANSNET was developed by Sandia National Laboratories (SNL) under the sponsorship of the United States Department of Energy (DOE) Office of Defense Programs (subsequently reorganized to the Office of Environmental Restoration and Waste Management). The goals of the TRANSNET system are to speed transfer of technology and data to qualified users by permitting access to the most comprehensive and up-to-date transportation risk and systems analysis models and associated databases. 13 refs.
Relational databases have many advantages over former hierarchical and network systems -- the most important advantage is their ease of modification. This leads designers to a new approach, which we at our company are finding very useful in building an information system of corporate-wide shared data. This approach is a phased bottom-up design and application development which is supported by an information modeling method called NIAM (Nijssen's Information Analysis Method). NIAM is not well known in the USA, but is widely used in Europe. An introduction to the NIAM approach and its advantages will be followed by examples of models and their corresponding relational database designs that have been developed in step-wise fashion at our company. Since NIAM algorithms yield tables in fifth normal form, our relational systems are implemented for optimum update capabilities and enforceable referential integrity. 4 refs., 6 figs.
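As a hedged illustration of the database-enforced referential integrity these designs rely on, the sketch below builds two invented tables (not taken from the paper) and shows the database itself, rather than application code, rejecting an orphan row.

import sqlite3

# Two hypothetical corporate tables with a foreign-key constraint between them.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")        # enable constraint enforcement in SQLite
conn.executescript("""
CREATE TABLE department (
    dept_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL UNIQUE
);
CREATE TABLE employee (
    emp_id  INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    dept_id INTEGER NOT NULL REFERENCES department(dept_id)
);
""")
conn.execute("INSERT INTO department VALUES (1, 'Engineering')")
conn.execute("INSERT INTO employee VALUES (10, 'A. Smith', 1)")

# A row referencing a nonexistent department is rejected by the database itself.
try:
    conn.execute("INSERT INTO employee VALUES (11, 'B. Jones', 99)")
except sqlite3.IntegrityError as err:
    print("rejected:", err)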
In FY90, important milestones from past Administrative Information Systems (AIS) plans were realized. The first phase of the Payroll migration was implemented early in the year. This event signified the completion of a major migration milestone and the transition of the Laboratory Information Systems (LIS) machine to a production environment. The Access Clearance System (ACS) and several early deliverables from other migration projects were also implemented during the year. FY91 promises to be another challenging year for those involved with administrative information systems. Aggressive schedules are in effect for the migration projects; the Financial Migration, Human Resources (HR) Migration, and Integrated Procurement System Replacement (IPS/R) efforts will deliver major system components this year. The administrative computing consolidation is underway and will be completed early in FY91. Consolidating computing hardware resources will provide adequate resources and better systems support for the entire AIS community.
In this paper, we review correlation filters as an approach to pattern recognition, with special emphasis on the consequences of normalizing the correlation to achieve intensity invariance. Intensity invariance is effected by using the Cauchy-Schwarz inequality to normalize the correlation integral. We discuss the implications of this criterion for the application of correlation filters to the pattern recognition problem. It is shown that normalized phase-only and synthetic discriminant functions do not provide the recognition/discrimination obtained with the classical matched filter. 34 refs., 5 figs.
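A discrete, digital analogue of the normalization can be sketched as follows; this illustrates the Cauchy-Schwarz bound on a normalized correlation output and is not the optical correlator architecture considered in the paper.

import numpy as np
from scipy.signal import fftconvolve

# The raw correlation at each shift is divided by the local image energy under the
# filter support times the filter energy, so by the Cauchy-Schwarz inequality the
# output lies in [0, 1] and reaches 1 only where the image patch matches the filter
# up to a scale factor (hence the intensity invariance).

def normalized_correlation(image, filt, eps=1e-12):
    raw = fftconvolve(image, filt[::-1, ::-1], mode="same")              # cross-correlation
    local_energy = fftconvolve(image**2, np.ones_like(filt), mode="same")  # energy under support
    return raw**2 / (local_energy * np.sum(filt**2) + eps)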
In keeping with the philosophy of the external events analyses for NUREG-1150, which are intended to be smart probabilistic risk assessments (PRAs) making full use of all insights gained during the past 10 years of development in risk assessment methodologies, the Savannah River K-Reactor fire analysis was performed using newly developed and simplified methods. These methods have been under development at Sandia National Laboratories under sponsorship of the Nuclear Regulatory Commission (NRC) Division of Risk Assessment as part of the Dependent Failure Methodology Development Program. A detailed screening analysis showed that most plant areas had a negligible contribution to fire-induced core damage frequency. Detailed analysis of the fire risk resulted in a total (mean) core damage frequency of 1.35E-7 per year. 18 refs., 12 figs., 17 tabs.
The project comprises the development of concentrating solar collectors, heliostats and dishes, and the development of optical materials. Because the solar concentrator represents 40 to 60% of the cost of a solar thermal electric system, the continued development of high-performance concentrators is very important to the commercial viability of these systems. The project is currently testing two large-area heliostats, the SPECO 200 m² heliostat and the ATS 150 m² heliostat, and is also trying to reduce heliostat cost through the development of stretched-membrane heliostats. Stretched-membrane heliostats are made by attaching thin metal membranes to the two sides of a circular metal ring. A slight vacuum in the plenum between the two membranes is used to focus the heliostat. The optical surface is provided by a silver-acrylic film, ECP 305. A prototype 100 m² commercial unit has been built and is currently being tested. Parabolic dish concentrators are under development for use on dish-Stirling electric systems. The state-of-the-art dish is the McDAC/SCE faceted glass concentrator. Because of the success of stretched-membrane technology for heliostats, the project applied the technology to parabolic dish development and is currently designing a near-term, faceted, stretched-membrane dish. The current thrust of the program in optical materials development is a low-cost, high-performance silver-acrylic film. 3M's ECP 305 has demonstrated substantial improvement over previous films in corrosion resistance and lifetime. An experimental film, developed at SERI, shows promise for further improving the lifetime of ECP 305. The project is currently investigating solutions to the problem of separation between the silver and acrylic layers of the film in the presence of water.
A technique to localize errors between two modal models is presented. Mode shape differences are calculated from each model, and a global comparison of the ratios of the corresponding differences is used to identify the physical locations on the structure where stiffness differences exist between the two models. Some of the strengths and limitations of the technique are illustrated using the mode shapes of two similar finite element models with a known stiffness difference. The technique is then applied to a two-link robot arm for which a finite element model exists and a modal test has been conducted. The results of the error localization aid the selection of physical parameters to be updated in the finite element model. Sensitivity methods are used to correlate the finite element model to the modal test, and the results of the correlation are presented. 5 refs., 3 figs., 3 tabs.
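One plausible reading of this procedure is sketched below: within each model the mode-shape difference across each element is formed, and elements whose between-model ratio of these differences departs from the global trend are flagged. The function, its inputs, and the tolerance are assumptions made for illustration, not the authors' exact formulation.

import numpy as np

def localize_stiffness_error(phi_a, phi_b, connectivity, tol=0.1):
    """phi_a, phi_b: (n_dof, n_modes) mode-shape matrices for the two models.
    connectivity: list of (i, j) DOF pairs defining each element."""
    ratios = []
    for e, (i, j) in enumerate(connectivity):
        d_a = phi_a[i] - phi_a[j]                         # per-mode element deformation, model A
        d_b = phi_b[i] - phi_b[j]                         # same element, model B
        ratios.append((e, np.median(np.abs(d_a) / (np.abs(d_b) + 1e-12))))
    global_ratio = np.median([r for _, r in ratios])       # global trend across all elements
    # flag elements whose ratio departs from the global trend by more than tol
    return [e for e, r in ratios if abs(r - global_ratio) > tol * global_ratio]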
Sandia National Laboratories is conducting performance assessments for the United States Department of Energy to use in evaluating compliance of the Waste Isolation Pilot Plant with EPA 40 CFR 191, Subpart B. Performance assessment is an iterative process that will lead to final compliance evaluation in 1994 or later. Monte Carlo simulations examine modeling system sensitivity to the probability of intrusion and uncertainty in the transport model for the overlying water-bearing unit. Simulations of two-phase (gas and brine) flow indicate gas generation may substantially reduce brine saturation in the waste, limiting radionuclide transport. All results are preliminary and are not suitable for evaluating compliance. Results suggest, however, that compliance can be achieved.
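The flavor of such a Monte Carlo sensitivity calculation is sketched below with a toy surrogate model and invented parameter ranges; none of the distributions or results correspond to actual WIPP analyses.

import numpy as np

rng = np.random.default_rng(0)

# Stand-in consequence model: release scales with intrusion likelihood and
# inversely with a transport retardation factor. Purely illustrative.
def consequence_model(p_intrusion, retardation):
    return p_intrusion / retardation

n_samples = 1000
p_intrusion = rng.uniform(1e-4, 1e-2, n_samples)                 # assumed range, illustrative
retardation = rng.lognormal(mean=2.0, sigma=0.5, size=n_samples) # assumed distribution
releases = consequence_model(p_intrusion, retardation)

print("median normalized release:", np.median(releases))
print("95th percentile:", np.percentile(releases, 95))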
This paper will provide an overview of cleaning qualifications used in a variety of industries, from small-scale manufacturers of precision-machined products to large-scale manufacturers of electronics (printed wiring boards and surface mount technology) and microelectronics. Cleanliness testing techniques used in the production of precision-machined products will be described. The ongoing DOD program to obtain high-reliability electronics, through the use of military specifications for cleaning and cleanliness levels, will be reviewed. In addition, the continually changing cleanroom/materials standards of the microelectronics industry will be discussed. Finally, we will speculate on the role that new and improved analytical techniques and sensor technologies will play in the factories of the future. 4 refs., 1 tab.
Ray paths and focal lengths are derived to fourth order for a nuclear-reactor wall-pumped gas laser. Ray paths in the laser gain cell are shown to be nearly random for a long gain region. Focal lengths calculated from the ray paths exiting the laser are shown to oscillate between ±∞ during pumping. The use of stimulated Brillouin scattering as a means for beam clean-up is discussed, with the conclusion that the phase-conjugated beam would cycle on and off as the ray paths and focal lengths oscillate between extremes. The parameters determining this cycling effect and its characteristics are also derived. 17 refs., 11 figs., 1 tab.
MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. The entire spectrum of severe accident phenomena, including reactor coolant system and containment thermal-hydraulic response, core heatup, degradation and relocation, and fission product release and transport, is treated in MELCOR in a unified framework for both boiling water reactors and pressurized water reactors. MELCOR has been especially designed to facilitate sensitivity and uncertainty analyses. Its current uses include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This report is a summary of MELCOR 1.8.0, the code version released in March 1989. Condensed information is presented on its developmental history, structure, modeling features and capabilities, verification and validation, and quality assurance. Detailed documentation on these aspects of MELCOR, including users' guides, reference manuals, programmers' guides, and assessment and application reports, is available in draft form and is distributed to MELCOR users.
Experimental tests on the Annular Core Research Reactor have confirmed that the 'Three-Bean-Salad' control algorithm, based on the Pontryagin maximum principle, can change the power of a nuclear reactor over many decades with a very fast startup rate and minimal overshoot. The paper describes the results of simulations and of operations up to 25 MW and 87 decades per minute. 3 refs., 4 figs., 1 tab.
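For context on the startup-rate figure, power on a stable period tau grows as P(t) = P0 exp(t/tau), so the startup rate in decades per minute is 60/(tau ln 10), roughly 26.06/tau with tau in seconds. The short sketch below evaluates this relation for a few assumed periods and is not part of the control algorithm itself.

import math

def startup_rate_dpm(period_s):
    """Decades per minute for an exponential power rise on a stable period (seconds)."""
    return 60.0 / (period_s * math.log(10.0))

for tau in (10.0, 1.0, 0.3):                       # example periods, assumed values
    print(f"period {tau:>4} s  ->  {startup_rate_dpm(tau):6.1f} decades/min")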
The spherical element computer code DMC (Distinct Motion Code), used to model rock motion resulting from blasting, has been enhanced to allow routine computer simulations of bench blasting. The enhancements required for bench blast simulation include: (1) modifying the gas flow portion of DMC, (2) adding a new explosive gas equation-of-state capability, (3) modifying the porosity calculation, and (4) accounting for blastwell spacing parallel to the face. A parametric study performed with DMC shows logical variation of the face velocity as burden, spacing, blastwell diameter, and explosive type are varied. These additions represent a significant advance in the capability of DMC, which will not only aid in understanding the physics involved in blasting but will also make DMC a blast design tool. 8 refs., 7 figs., 1 tab.
Nd:YAG laser cleaning of metal oxides from 304L stainless steel surfaces has been characterized. Thin chromium oxide films can be completely removed from the surface using a single 10 ns pulse of laser radiation with an average surface irradiance greater than 120 MW/cm². Laser etching of thicker iron oxide films exhibits a self-limiting effect that prevents overetching into the stainless steel substrate. 8 figs.
Electron-beam (EBeam) melting furnaces are routinely used to minimize the occurrence of second-phase particles in the processing of segregation-sensitive alloys. As one part of the process, a circulating electron beam impinges on the surface of a crucible melt pool to help control the shape of the solidification front below. By modeling melt pool hydrodynamics, heat transfer, and the shape of solidification boundaries, we plan to optimize the dwell pattern of the beam so that the material solidifies with a composition as spatially homogeneous as possible. Both two- and three-dimensional models are being pursued with FIDAP 5.02, the former serving as a test bed for various degrees of model sophistication. A heat flux distribution is specified on the top of the domain to simulate the EBeam dwell pattern. In two dimensions it is found that an inertially driven recirculation in the melt pool interacts with a counter-rotating buoyancy-driven recirculation, and that both recirculations are influenced heavily by surface tension gradients on the melt-pool surface. In three dimensions the inertial cell decays quickly with distance from the position of the inlet stream, causing the fluid to precess around the crucible. Ingot macrosegregation patterns for a U-6 wt.% Nb alloy are calculated with the Flemings-Mehrabian equation of solute redistribution; the sensitivity of these patterns to the EBeam dwell pattern is explored.
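A much-simplified stand-in for the solute-redistribution step is sketched below using the classical Scheil equation, C_s = k C0 (1 - f_s)^(k-1), which omits the shrinkage-driven flow terms that the Flemings-Mehrabian formulation adds; C0 matches the nominal 6 wt% Nb, while the partition coefficient k is an assumed illustrative value, not a property taken from the paper.

import numpy as np

def scheil_profile(c0=6.0, k=0.5, n=50):
    """Local solid composition (wt% Nb) versus fraction solid under Scheil assumptions."""
    fs = np.linspace(0.0, 0.99, n)               # fraction solid
    cs = k * c0 * (1.0 - fs) ** (k - 1.0)        # composition of solid forming at each fs
    return fs, cs

fs, cs = scheil_profile()
print(f"first solid: {cs[0]:.2f} wt% Nb, at 99% solid: {cs[-1]:.2f} wt% Nb")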
Proceedings - IEEE International Conference on Robotics and Automation
Novak, J.L.; Wiczer, J.J.
A high-resolution capacitive image sensing technique for measuring edge and surface profiles during manufacturing processes has been invented. A prototype device utilizing this technique consists of two 500-μm-diameter electrodes fabricated on a printed circuit board with a 250-μm gap between them. As the device is mechanically scanned over the workpiece, spatial variations in the edge or surface being measured interfere with an electric field imposed between the electrodes, altering the mutual capacitance. The sensor functions as a near-field proximity sensor producing range images of surface imperfections. This sensor has been used in applications requiring a preview image of burrs on the edge of a machined part and in other processes requiring an inspection image after automated deburring operations.
Proceedings - IEEE International Conference on Robotics and Automation
Stansfield, S.A.
A series of haptic exploratory procedures (EPs) implemented for a multifingered, articulated, sensate robot hand is discussed. These EPs are designed to extract specific tactile and kinesthetic information from an object via their purposive invocation by an intelligent robotic system. Taken together, they form an active robotic touch perception system. This system utilizes a PUMA 560 robot arm, a JPL/Stanford robot hand with joint torque sensing in the fingers, a wrist force/torque sensor, and a 256-element spatially resolved fingertip tactile array. The EPs are described, and experimental results are given.
The LIFE2 computer code is a fatigue/fracture analysis code specialized to the analysis of wind turbine components. The numerical formulation of the code uses a series of cycle count matrices to describe the cyclic stress states imposed upon the turbine. In this formulation, each stress cycle is counted or 'binned' according to the magnitude of its mean stress and alternating stress components and by the operating condition of the turbine. A set of numerical algorithms has been incorporated into the LIFE2 code. These algorithms determine the cycle count matrices for a turbine component using stress-time histories of the imposed stress states. This paper describes the design decisions that were made and explains the implementation of these algorithms in Fortran 77. 7 refs., 7 figs.
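The binning step can be sketched as follows, assuming the individual cycles have already been extracted from the stress-time history (for example by rainflow counting); this is illustrative Python, not the LIFE2 Fortran 77 implementation.

import numpy as np

def bin_cycles(cycles, mean_edges, alt_edges):
    """Accumulate cycles into a count matrix indexed by mean- and alternating-stress bins.

    cycles: iterable of (mean_stress, alternating_stress, count) tuples for one
    operating condition; in practice one matrix is kept per operating condition."""
    matrix = np.zeros((len(mean_edges) - 1, len(alt_edges) - 1))
    for mean, alt, count in cycles:
        i = np.searchsorted(mean_edges, mean, side="right") - 1   # mean-stress bin
        j = np.searchsorted(alt_edges, alt, side="right") - 1     # alternating-stress bin
        if 0 <= i < matrix.shape[0] and 0 <= j < matrix.shape[1]:
            matrix[i, j] += count
    return matrix

# Example: three cycles binned on a coarse 4x4 grid (stress units arbitrary)
edges = np.linspace(0.0, 100.0, 5)
print(bin_cycles([(10.0, 35.0, 1), (60.0, 80.0, 2), (40.0, 5.0, 1)], edges, edges))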
The CONTAIN quality assurance program follows a strict set of procedures designed to ensure the integrity of the code, to avoid errors in the code, and to prolong the life of the code. The code itself is maintained under a code-configuration control system that provides a historical record of changes. All changes are incorporated using an update processor that allows separate identification of the improvements made to each successive code version. Code modifications and improvements are formally reviewed and checked. An exhaustive, multilevel test program validates the theory and implementation of all code changes through assessment calculations that compare the code-predicted results to standard handbooks of idealized test cases. A document trail and archive establish the problems solved by the software, the verification and validation of the software, software changes and subsequent reverification and revalidation, and the tracking of software problems and the actions taken to resolve them. This document describes the CONTAIN quality assurance procedures in detail.
The 10-MWe Solar One Pilot Plant was the world's largest solar central receiver power plant. During its power production years it delivered over 37,000 MWh (net) to the utility grid. In this type of electric power generating plant, large sun-tracking mirrors called heliostats reflect and concentrate sunlight onto a receiver mounted on top of a tower. The receiver transforms the solar energy into thermal energy that heats water, turning it into superheated steam that drives a turbine to generate electricity. The Solar One Pilot Plant successfully demonstrated the feasibility of generating electricity with a solar central receiver power plant. During the initial 2 years of plant testing and the subsequent 4 years of operation as a power plant, a great deal of data was collected relating to the efficiency and reliability of the plant's various systems. This paper summarizes these statistics and compares them to goals developed by the US Department of Energy. Based on this comparison, improvements in the design and operation of future central receiver plants are recommended. Research at Sandia National Laboratories and in the US utility industry suggests that the next generation of central receiver power plants will use a molten salt heat transfer fluid rather than water/steam. Sandia has recently completed the development of the hardware needed in a molten salt power plant. Use of this new technology is expected to solve many of the performance problems encountered at Solar One. Projections for the energy costs from these future central receiver plants are also presented. For reference, these projections are compared to the current energy costs from the SEGS parabolic trough plants now operating in Southern California.
The results of the first year of an evaluation of charge controllers for stand-alone photovoltaic (PV) systems are presented. The objectives of the test program are to positively influence the development of battery charge controllers for stand-alone PV applications and to develop design and application criteria that will improve PV system reliability and battery performance. Future goals are to expand the evaluation program to include various battery technologies and controller algorithms. Also, the information is being communicated to manufacturers to aid in the design of more effective and reliable charge controllers for PV systems. Eight different models of small (nominal 10 amp) charge controllers are being subjected to a comprehensive evaluation. These evaluations include operational tests in identical stand-alone PV systems and environmental and electrical cycling tests. Selected custom tests are also performed on the controllers to determine the response to transients, installation requirements and system design compatibilities. Data presented in this paper include measured electrical characteristics of the controllers, temperature effects on set points, and operational performance in PV systems both in the lab and in the field. A comparison is presented for four different charge controller algorithms which include array-shunt, series-interrupting, series-linear constant-voltage and series-linear-multistep constant-current. 9 refs., 11 figs., 2 tabs.
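As an illustration of one of these algorithms, the sketch below implements generic series-interrupting control logic with regulation and reconnect setpoints; the voltage values are assumptions, not settings measured in the test program.

# The array is switched off when battery voltage reaches the regulation setpoint and
# reconnected when it falls back to the array-reconnect voltage (simple hysteresis).

class SeriesInterruptingController:
    def __init__(self, v_regulation=14.4, v_reconnect=13.2):
        self.v_regulation = v_regulation   # disconnect threshold (V), assumed value
        self.v_reconnect = v_reconnect     # reconnect threshold (V), assumed value
        self.charging = True

    def update(self, battery_voltage):
        """Return True if the array should be connected to the battery."""
        if self.charging and battery_voltage >= self.v_regulation:
            self.charging = False          # open the series switch
        elif not self.charging and battery_voltage <= self.v_reconnect:
            self.charging = True           # close the series switch
        return self.charging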
SAS software is being used to analyze product test data stored in an INGRES relational database. The database has been implemented at Allied-Signal in Kansas City on a Digital Equipment Corporation (DEC) VAX computer. The INGRES application development has been a joint project between Sandia National Laboratories and Allied-Signal. Application screens have been developed so that the user can query the database for selected data. Fourth-generation language procedures are used to retrieve all data requested. FORTRAN and VAX/VMS DCL (DIGITAL Command Language) procedures are invoked from the application to create SAS data sets and dynamically build SAS programs that are executed to build custom reports or graphically display the retrieved test data along with control and specification limits. A retrieval screen has also been developed that invokes SAS software to calculate the mean and standard deviation of the retrieved data. These parameters are passed back into the application for display and may then be used as an aid in setting new control limits for future test runs. Screens have been developed to provide an interface for the user to select from a library of SAS programs, edit the selected program, and run the program with a user-defined SAS data set as input. This paper will give a brief description of the application screens and provide details of how information is passed between the application and the SAS programs.
I was asked to write a database application that would be user friendly, to the extent that a minimum amount of learning would be required of the user to run the application, yet flexible enough to gather the data in various combinations. Writing 'SELECT' or 'RETRIEVE' queries required too much initial training, and hard-coding queries into the application meant the users could not pick columns or create constraints. I decided to compromise somewhat, requiring my users to learn how to manipulate VIFRED menus; by doing so they can pick any combination of columns for output, select any column variable to sort on, and impose simple yet practical constraints on the data, all of this at run time. This handout contains copies of the VIFRED menus, the help message for imposing constraints, output from sample retrievals, descriptions of the relational tables needed to implement the methodology, and the computer coding of the actual retrieval construction.
Burnup credit is the application of the effects of fuel burnup to nuclear criticality design. When burnup credit is considered in the design of storage facilities and transportation casks for spent fuel, the objectives are to reduce the requirements for storage space and to increase the payload of casks with acceptable nuclear criticality safety margins. The spent-fuel carrying capacities of previous-generation transport casks have been limited primarily by requirements to remove heat and/or to provide shielding. Shielding and heat transfer requirements for casks designed to transport older spent fuel with longer decay times are reduced significantly. Thus a considerable weight margin is available to the designer for increasing the payload capacity. One method to achieve an increase in capacity is to reduce fuel assembly spacing. The amount of reduction in assembly spacing is limited by criticality and fuel support structural concerns. The optimum fuel assembly spacing provides the maximum cask loading within a basket that has adequate criticality control and sufficient structural integrity for regulatory accident scenarios. The incorporation of burnup credit in cask designs could result in considerable benefits in the transport of spent fuel. The acceptance of burnup credit for the design of transport casks depends on the resolution of system safety issues and the uncertainties that affect the determination of criticality safety margins. The remainder of this report will examine these issues and the integrated approach under way to resolve them. 20 refs., 2 figs.