Publications

Micro-radiosurgery: a new concept for radiotherapy based upon low energy, ion-induced nuclear reactions

Nuclear Inst. and Methods in Physics Research, B

Horn, Kevin M.

Traditionally, proton radiotherapy has required the use of high energy proton beams (50-200 MeV) which can penetrate into a patient's body to the site of a tumor that is to be destroyed through irradiation. However, substantial damage is still done to healthy tissue along the path of the incident proton beam, as much as 30% of that done at the tumor site. We propose a new concept for the production and delivery of energetic protons for use in medical radiotherapy, based upon the fact that low energy, ion-induced nuclear reactions can produce radiation products suitable for use in radiotherapy applications. By employing specially fabricated "conduit needles" to deliver beams of energetic ions to selected target materials plugging the end of the needle, ion beam-induced nuclear reactions can be generated at the needle tip, emitting reaction-specific radiation products directly at the tumor site. In this paper, we show that the 13.6 MeV protons produced by the d(3He, p)4He nuclear reaction can deliver a lethal dose (7 krad) of radiation to a 4.4 mm diameter sphere of tissue in only 30 s using a 1 μA, 800 keV 3He ion beam. If also proven clinically feasible, the use of low energy, ion-induced nuclear reactions would allow the utilization of relatively inexpensive, compact, low energy ion accelerators for proton radiotherapy and minimize unintended radiation damage to healthy tissue by providing much greater precision in controlling the irradiated volume. © 1993.
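
As a rough plausibility check on the quoted figures, the dose arithmetic can be worked directly. The sketch below is not from the paper; it assumes soft tissue of unit density and that every reaction proton deposits its full 13.6 MeV inside the treated sphere.

```cpp
// Back-of-envelope check of the quoted micro-radiosurgery dose numbers.
// Assumptions (not from the paper): tissue density 1 g/cm^3; each reaction
// proton deposits its full 13.6 MeV inside the 4.4 mm treated sphere.
#include <cmath>
#include <cstdio>

int main() {
    const double pi          = 3.141592653589793;
    const double dose_Gy     = 70.0;               // 7 krad = 70 Gy
    const double radius_m    = 2.2e-3;             // 4.4 mm diameter sphere
    const double rho         = 1000.0;             // kg/m^3, ~soft tissue
    const double e_proton_J  = 13.6e6 * 1.602e-19; // 13.6 MeV in joules
    const double time_s      = 30.0;               // quoted irradiation time
    const double beam_ions_s = 1.0e-6 / 1.602e-19; // 1 uA of singly charged 3He

    double mass_kg  = rho * 4.0 / 3.0 * pi * std::pow(radius_m, 3);
    double energy_J = dose_Gy * mass_kg;                // dose = energy / mass
    double protons  = energy_J / e_proton_J;            // protons needed for dose
    double yield    = protons / (beam_ions_s * time_s); // protons per beam ion

    std::printf("tissue mass      : %.3g kg\n", mass_kg);      // ~4.5e-5 kg
    std::printf("energy required  : %.3g J\n", energy_J);      // ~3.1e-3 J
    std::printf("protons required : %.3g\n", protons);         // ~1.4e9
    std::printf("implied yield    : %.2g p per ion\n", yield); // ~8e-6
}
```

The implied thick-target yield of a few protons per million incident ions is the quantity the reaction data in the paper would have to support for the quoted 30 s treatment time to hold.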

Performance of the CPG 7.5-kWe Dish-Stirling system

Bean, J.R.; Diver, R.B.

Through the Dish-Stirling Joint Venture Program (JVP) sponsored by the US Department of Energy (DOE), Cummins Power Generation, Inc. (CPG) and Sandia National Laboratories (SNL) have entered into a joint venture to develop and commercialize economically competitive dish-Stirling systems for remote power applications. The $14 million JVP is being conducted in three phases over a 3 1/2-year period in accordance with the Cummins Total Quality System (TQS) for new product development. The JVP is being funded equally by CPG, including its industrial partners, and the DOE. In June 1992, a "concept validation" (prototype) 5-kWe dish-Stirling system became operational at the CPG test site in Abilene, TX, and on January 1, 1993, the program advanced to phase 2. On the basis of the performance of the 5-kWe system, a decision was made to increase the rated system output to 7.5-kWe. The CPG system uses advanced components that have the potential for low cost and reliable operation, but which also have technical risks. In this paper, the status of the advanced components and results from system integration testing are presented and discussed. Performance results from system testing of the 5-kWe prototype along with phase 2 goals for the 7.5-kWe system are also discussed.

A ten year review of performance of photovoltaic systems

Rosenthal, A.L.; Durand, S.J.; Thomas, M.G.

This paper presents data compiled by the Photovoltaic Design Assistance Center at Sandia National Laboratories from more than eighty field tests performed at over thirty-five photovoltaic systems in the United States during the last ten years. The recorded performance histories, failure rates, and degradation of post-Block IV modules and balance-of-system (BOS) components are described in detail.

Slim-hole drilling for geothermal exploration

Finger, John T.

Drilling production-size holes for geothermal exploration puts a large expense at the beginning of the project, and thus requires a long period of debt service before those costs can be recaptured from power sales. If a reservoir can be adequately defined and proved by drilling smaller, cheaper slim holes, production well drilling can be delayed until the power plant is under construction, saving years of interest payments. In the broadest terms, this project's objective is to demonstrate that a geothermal reservoir can be identified and evaluated with data collected in slim holes. We have assembled a coordinated working group, including personnel from Sandia, Lawrence Berkeley Lab, University of Utah Research Institute, US Geological Survey, independent consultants, and geothermal operators, to focus on the development of this project. This group is involved to a greater or lesser extent in all decisions affecting the direction of the research. Specific tasks being pursued include: correlation of fluid flow and injection tests between slim holes and production-size wells; transfer of slim-hole exploration drilling and reservoir assessment to industry so that slim-hole drilling becomes an accepted method for geothermal exploration; development and validation of a coupled wellbore-reservoir flow simulator which can be used for reservoir evaluation from slim-hole flow data; collection of applicable data from commercial wells in existing geothermal fields; and drilling of at least one new slim hole and its use to evaluate a geothermal reservoir.

C++ as a language for object-oriented numerics

Budge, Kent G.

C++ is commonly described as an object-oriented programming language because of its strong support for classes with multiple inheritance and polymorphism. However, for a growing community of numerical programmers, an equally important feature of C++ is its support for operator overloading on abstract data types. The authors choose to call the resulting style of programming object-oriented numerics. They believe that much of object-oriented numerics is orthogonal to conventional object-oriented programming. As a case study, they discuss two strong shock physics codes written in C++ that they are currently developing. These codes use both polymorphic classes (typical of traditional object-oriented programming) and abstract data types with overloaded operators (typical of object-oriented numerics). They believe that C++ translators can generate efficient code for many numerical objects. However, for the important case of smart arrays (which are used to represent matrices and the fields found in partial differential equations), fundamental difficulties remain. The authors discuss the two most important of these, namely the aliasing ambiguity and the proliferation of temporaries, and present some possible solutions.
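
A minimal sketch of the programming style the authors call object-oriented numerics, with a deliberately naive overloaded operator+ so the temporary-proliferation problem they raise is visible. The Vec class is illustrative only, not code from the shock physics codes discussed here.

```cpp
#include <cstddef>
#include <vector>

// Illustrative "smart array" with an overloaded element-wise operator.
class Vec {
    std::vector<double> d;
public:
    explicit Vec(std::size_t n, double v = 0.0) : d(n, v) {}
    std::size_t size() const { return d.size(); }
    double&       operator[](std::size_t i)       { return d[i]; }
    const double& operator[](std::size_t i) const { return d[i]; }

    // Naive element-wise addition: allocates and fills a brand-new Vec.
    friend Vec operator+(const Vec& a, const Vec& b) {
        Vec r(a.size());
        for (std::size_t i = 0; i < a.size(); ++i) r[i] = a[i] + b[i];
        return r;
    }
};

int main() {
    Vec a(1000, 1.0), b(1000, 2.0), c(1000, 3.0);
    // Each '+' below builds a full-size temporary, so this one statement
    // makes two passes over memory plus an extra allocation -- the
    // "proliferation of temporaries" problem the paper discusses.
    Vec s = a + b + c;
    (void)s;
    return 0;
}
```

Expression templates, which fuse such operator chains into a single loop at compile time, later became a standard remedy for exactly this pattern.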

A perspective on AVS in an engineering sciences environment

Glass, Micheal W.

At Sandia National Laboratories, the Engineering Sciences Center has made a commitment to integrate the Application Visualization System (AVS) into our computing environment as the primary tool for scientific visualization. AVS will be used on an everyday basis by a broad spectrum of users, ranging from the occasional computer user to AVS module developers. Additionally, AVS will be used to visualize structured grid, unstructured grid, gridless, 1D, 2D, 3D, steady-state, transient, computational, and experimental data. The following is one user's perspective on how AVS meets this task. Several examples of how AVS is currently being utilized are given, along with some future directions.

Environmentally conscious manufacturing life cycle analysis

Watkins, R.D.; Baca, A.

Sandia National Laboratories and the Allied Signal-Kansas City Plant (AS-KCP) are engaged in a program called the Integrated Manufacturing and Design Initiative, or IMDI. The focus of IMDI is "to develop and implement concurrent engineering processes for the realization of weapon components." An explicit part of each of the activities within IMDI is an increased concern for environmental impacts associated with design, and a desire to minimize those impacts through the implementation of Environmentally Conscious Manufacturing, or ECM. These same concerns and desires are shared within the Department of Energy's Manufacturing Complex, and are gaining strong support throughout US industrial sectors as well. Therefore, the development and application of an environmental life cycle analysis framework, the thrust of this specific effort, is most consistent not only with the overall objectives of IMDI, but with those of DOE and private industry.

GREPOS: A GENESIS database repositioning program

Sjaardema, Gregory D.

GREPOS is a mesh utility program that repositions or modifies the configuration of a two-dimensional or three-dimensional mesh. GREPOS can be used to change the orientation and size of a two-dimensional or three-dimensional mesh; change the material block, nodeset, and sideset IDs; or "explode" the mesh to facilitate viewing of the various parts of the model. GREPOS also updates the EXODUS quality assurance and information records to help track the codes and files used to generate the mesh. GREPOS reads and writes two-dimensional and three-dimensional mesh databases in the GENESIS database format; therefore, it is compatible with the preprocessing, postprocessing, and analysis codes in the Sandia National Laboratories Engineering Analysis Code Access System (SEACAS).

Condensed phase thermochemistry of reactor core debris

Powers, Dana A.

This paper discusses a nonideal solution model of the metallic phases of reactor core debris. The metal phase model is based on the Kohler equation for a 37-component system. The binary subsystems are assumed to have subregular interactions. The model is parameterized by comparison to available data and by estimating subregular interactions using the methods developed by Miedema et al. The model is shown to predict phase separation in the metallic phase of core debris. The model also predicts reduced chemical activities of zirconium and tellurium in the metal phase. A model of the oxide phase of core debris is described briefly. The model treats the oxide phase as an associated solution. The chemical activities of solution components are determined by the existence and interactions of species formed from the components.

NaK pool-boiler bench-scale receiver durability test: Test design and initial results

Andraka, Charles E.

Pool-boiler reflux receivers have been considered as an alternative to heat pipes for the input of concentrated solar energy to Stirling-cycle engines in dish-Stirling electric generation systems. Pool boilers offer simplicity in design and fabrication. Pool-boiler solar receiver operation has been demonstrated for short periods of time. However, in order to generate cost-effective electricity, the receiver must operate without significant maintenance for the entire system life. At least one theory explaining incipient-boiling behavior of alkali metals indicates that favorable start-up behavior should deteriorate over time. Many factors affect the stability and start-up behavior of the boiling system. Therefore, it is necessary to simulate the full-scale design in every detail as much as possible, including flux levels, materials, and operating cycles. On-sun testing is impractical due to the limited test time available. No boiling system has been demonstrated with the current porous boiling enhancement surface and materials for a significant period of time. A test vessel was constructed with a Friction Coatings Inc. porous boiling enhancement surface. The vessel is heated with a quartz lamp array providing about 92 W/cm² peak incident thermal flux. The vessel is charged with NaK-78, which is liquid at room temperature. This allows the elimination of costly electric preheating, both in this test and on full-scale receivers. The vessel is fabricated from Haynes 230 alloy, selected for its high-temperature strength and oxidation resistance. The vessel operates at 750 °C around the clock, with a 1/2-hour shutdown cycle to ambient every 8 hours. Temperature data are continually collected. The test design and initial (first 2500 hours and 300 start-ups) test data are presented here. The test is designed to operate for 10,000 hours and will be complete in the spring of 1994.

Propagation of transient signals from a spherical source in a half-space with surface layers

Norwood, F.R.

The title problem is of particular interest for the analysis of seismic signals arising from underground nuclear explosions. Previous attempts at the solution have indicated that, although cylindrical symmetry exists, conventional methods cannot be applied because of the existence of plane and spherical boundaries. The present paper develops a ray-grouping technique for finding the solution to the title problem. This technique allows the separation of the problem into a series of canonical problems. Each such problem deals with a given boundary condition (e.g., continuity conditions at a material interface). Using this technique, one may follow waves along ray paths. It is easy to identify, after n reflections, (a) rays which arrive simultaneously at a given point and (b) the terms in the solution which need to be included at a given time. It is important to note that a cylindrical coordinate system is not employed, even though the problem is axially symmetric. Instead, the equations are carefully transformed making it possible to use a Cartesian coordinate system. This results in a spectral representation of the solution in terms of algebraic expressions in lieu of Bessel functions.

Thermal coating development for impulse drying

Journal of Thermal Spray Technology

Smith, Mark F.

A plasma-sprayed coating has been developed for the heated surface of rolls used in a new energy-efficient paper drying process, known as "Impulse Drying," which could save the US paper industry an estimated $800 million annually in reduced energy costs. Because impulse drying rolls operate at substantially higher surface temperatures than conventional drying rolls, the thermal properties of the roll surface must be carefully tailored to control energy transfer to the paper and thus prevent sheet delamination or other undesirable effects. To meet this requirement, a plasma-sprayed thermal barrier coating has been developed to control thermal mass, heat transfer, and steam infiltration. A coated test platen significantly outperformed a comparable uncoated steel platen in preliminary experiments with a heavyweight grade of paper on a laboratory-scale impulse drying simulator. Based on these results, the coating was then tested on the roll of a pilot-scale impulse dryer. Compared to conventional wet pressing, linerboard that was impulse dried with the coated test roll showed marked improvements in water removal as well as improved physical properties, such as density and specific elastic modulus. The successful prototype coating design has three plasma-sprayed layers that are deposited sequentially: a nickel alloy bond coat, a thick, 17% porous zirconia thermal barrier, and a thin, 5 to 7% porous zirconia top coat. © 1993 ASM International.

Particle interactions in concentrated suspensions

Mondy, Lisa A.

An overview is presented of research that focuses on slow flows of suspensions in which colloidal and inertial effects are negligibly small. We describe nuclear magnetic resonance imaging experiments to quantitatively measure particle migration occurring in concentrated suspensions undergoing a flow with a nonuniform shear rate. These experiments address the issue of how the flow field affects the microstructure of suspensions. In order to understand the local viscosity in a suspension with such a flow-induced, spatially varying concentration, one must know how the viscosity of a homogeneous suspension depends on such variables as solids concentration and particle orientation. We suggest the technique of falling ball viscometry, using small balls, as a method to determine the effective viscosity of a suspension without affecting the original microstructure significantly. We also describe data from experiments in which the detailed fluctuations of a falling ball's velocity indicate the noncontinuum nature of the suspension and may lead to more insights into the effects of suspension microstructure on macroscopic properties. Finally, we briefly describe other experiments that can be performed in quiescent suspensions (in contrast to the use of conventional shear rotational viscometers) in order to learn more about boundary effects in concentrated suspensions.
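
The falling-ball idea rests on Stokes' law for creeping flow: a sphere falling at terminal velocity v implies an effective viscosity eta = 2 r² (rho_s - rho_f) g / (9 v). The sketch below applies that relation directly; it ignores wall and inertia corrections, and the numbers are hypothetical, not data from the experiments described above.

```cpp
#include <cstdio>

// Effective viscosity from a falling-ball measurement via Stokes' law.
// Valid only for creeping flow (Reynolds number << 1) far from walls.
double stokes_viscosity(double radius_m, double rho_sphere, double rho_fluid,
                        double terminal_velocity) {
    const double g = 9.81; // m/s^2
    return 2.0 * radius_m * radius_m * (rho_sphere - rho_fluid) * g
           / (9.0 * terminal_velocity);
}

int main() {
    // Hypothetical example: a 0.5 mm steel ball falling at 1 mm/s through
    // a suspension of density 1200 kg/m^3.
    double eta = stokes_viscosity(0.5e-3, 7800.0, 1200.0, 1.0e-3);
    std::printf("effective viscosity: %.2f Pa.s\n", eta); // ~3.60 Pa.s
}
```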

Rock mass mechanical property estimations for the Yucca Mountain Site Characterization Project

Bauer, Stephen J.

Rock mass mechanical properties are important in the design of drifts and ramps. These properties are used in evaluations of the impacts of thermomechanical loading of potential host rock within the Yucca Mountain Site Characterization Project. Representative intact rock and joint mechanical properties were selected for welded and nonwelded tuffs from the currently available data sources. Rock mass qualities were then estimated using both the Norwegian Geotechnical Institute (Q) and Geomechanics Rating (RMR) systems. Rock mass mechanical properties were developed based on estimates of rock mass quality, the current knowledge of intact properties, and fracture/joint characteristics. Empirical relationships developed to correlate the rock mass quality indices and the rock mass mechanical properties were then used to estimate the range of rock mass mechanical properties.

Review of radionuclide source terms used for performance-assessment analyses; Yucca Mountain Site Characterization Project

Barnard, R.

Two aspects of the radionuclide source terms used for total-system performance assessment (TSPA) analyses have been reviewed. First, a detailed radionuclide inventory (i.e., one in which the reactor type, decay, and burnup are specified) is compared with the standard source-term inventory used in prior analyses. The latter assumes a fixed ratio of pressurized-water reactor (PWR) to boiling-water reactor (BWR) spent fuel, at specific amounts of burnup and at 10-year decay. TSPA analyses have been used to compare the simplified source term with the detailed one. The TSPA-91 analyses did not show a significant difference between the source terms. Second, the radionuclides used in source terms for TSPA aqueous-transport analyses have been reviewed to select ones that are representative of the entire inventory. It is recommended that two actinide decay chains be included (the 4n+2 "uranium" and 4n+3 "actinium" decay series), since these include several radionuclides that have potentially important release and dose characteristics. In addition, several fission products are recommended for the same reason. The choice of radionuclides should be influenced by other parameter assumptions, such as the solubility and retardation of the radionuclides.
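
The two recommended chains can be identified by mass number alone: alpha decay changes A by 4 and beta decay leaves it unchanged, so A mod 4 is invariant along each natural decay series. The uranium series (headed by U-238) is the 4n+2 family and the actinium series (headed by U-235) is the 4n+3 family. A trivial classifier:

```cpp
#include <cstdio>

// Classify a nuclide's natural decay series from its mass number A.
// A mod 4 is invariant under alpha (A -> A-4) and beta (A unchanged) decay.
const char* decay_series(int A) {
    switch (A % 4) {
        case 0:  return "thorium series (4n, headed by Th-232)";
        case 1:  return "neptunium series (4n+1, headed by Np-237)";
        case 2:  return "uranium series (4n+2, headed by U-238)";
        default: return "actinium series (4n+3, headed by U-235)";
    }
}

int main() {
    const int examples[] = {238, 234, 226, 210, 235, 231}; // U, Th, Ra, Pb, U, Pa
    for (int a : examples)
        std::printf("A = %d -> %s\n", a, decay_series(a));
}
```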

Permeability and hydraulic diffusivity of Waste Isolation Pilot Plant repository salt inferred from small-scale brine inflow experiments

Mctigue, D.F.

Brine seepage to 17 boreholes in salt at the Waste Isolation Pilot Plant (WIPP) facility horizon has been monitored for several years. A simple model for one-dimensional, radial, Darcy flow due to relaxation of ambient pore-water pressure is applied to analyze the field data. Fits of the model response to the data yield estimates of two parameters that characterize the magnitude of the flow and the time scale over which it evolves. With further assumptions, these parameters are related to the permeability and the hydraulic diffusivity of the salt. For those data that are consistent with the model prediction, estimated permeabilities are typically 10⁻²² to 10⁻²¹ m². The relatively small range of inferred permeabilities reflects the observation that the measured seepage fluxes are fairly consistent from hole to hole, of the order of 10⁻¹⁰ m/s. Estimated diffusivities are typically 10⁻¹⁰ to 10⁻⁸ m²/s. The greater scatter in inferred hydraulic diffusivities is due to the difficulty of matching the idealized model history to the observed evolution of the flows. The data obtained from several of the monitored holes are not consistent with the simple model adopted here; material properties could not be inferred in these cases.
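
Darcy's law ties these numbers together: the flux is q = (k/mu) dP/dr, so one can back out the pore-pressure gradient the measured seepage implies for the inferred permeabilities. The sketch below is a rough consistency check; the brine viscosity is an assumed value, not a figure from the paper.

```cpp
#include <cstdio>

// Darcy's law consistency check: what pore-pressure gradient does the
// measured brine flux imply for the inferred permeability range?
//   q = (k / mu) dP/dr   =>   dP/dr = q mu / k
int main() {
    const double q  = 1.0e-10; // m/s, typical measured seepage flux (from paper)
    const double mu = 1.6e-3;  // Pa.s, assumed brine viscosity (not from paper)

    for (double k : {1.0e-21, 1.0e-22}) {  // m^2, inferred permeability range
        double grad = q * mu / k;          // Pa/m
        std::printf("k = %.0e m^2 -> dP/dr = %.1e Pa/m (%.1f MPa over 1 cm)\n",
                    k, grad, grad * 0.01 / 1.0e6);
    }
}
```

Gradients this steep are confined to the immediate vicinity of the borehole, where the pressure relief is concentrated; the check simply shows that megapascal-scale pore pressures are enough to drive the observed seepage through rock this tight.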

Core based stress measurements: A guide to their application. Topical report, July 1991--June 1993

Warpinski, Norman R.

This report is a summary of and guide to core-based stress measurements. It covers anelastic strain recovery, circumferential velocity anisotropy, differential strain curve analysis, differential wave velocity analysis, petrographic examination of microcracks, overcoring of archived core, measurements of the Kaiser effect, strength anisotropy tests, and analysis of coring-induced fractures. The report begins with a discussion of the stored energy within rocks, its release during coring, and the subsequent formation of relaxation microcracks. The interrogation or monitoring of these microcracks forms the basis for most of the core-based techniques (except for the coring-induced fractures). Problems that can arise due to coring or fabric are also presented. Coring-induced fractures are discussed in some detail, with the emphasis placed on petal (and petal-centerline) fractures and scribe-knife fractures. For each technique, a short description of the physics and the analysis procedures is given. In addition, several example applications have been selected (where available) to illustrate pertinent effects. This report is intended to be a guide to the proper application and diagnosis of core-based stress measurement procedures.

Status of lost circulation research

Glowka, D.A.; Schafer, D.M.; Wright, E.K.; Whitlow, G.L.; Bates, C.W.

This paper describes progress made in the Lost Circulation Technology Development Program over the period March 1992–April 1993. The program is sponsored at Sandia National Laboratories by the US Department of Energy, Geothermal Division. The goal of the program is to develop technology to reduce lost circulation costs associated with geothermal drilling by 30–50%.

ITS Version 3.0: The Integrated TIGER Series of coupled electron/photon Monte Carlo transport codes

Halbleib, J.A.; Kensek, R.P.; Valdez, G.D.; Mehlhorn, T.A.; Seltzer, S.M.; Berger, M.J.

ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields. It combines operational simplicity and physical accuracy in order to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Flexibility of construction permits tailoring of the codes to specific applications and extension of code capabilities to more complex applications through simple update procedures.

Environmentally assisted cracking of nickel anode substrates in Li/SOCl₂ cells: An engineering approach

Cieslak, Wendy R.

Intergranular environmentally assisted cracking (EAC) of Ni anode substrates is likely to occur in a large proportion of Li/SOCl₂ cells, but it is not generally detected because in the majority of cases it does not lead to catastrophic failure. However, EAC could become a problem for applications requiring continuous power with high reliability for 10–15 years. In the present work, we determine why simple galvanic-couple constant-strain tests do not produce cracking, and introduce a constant-strain test that does produce cracking. The objective of this investigation is to determine the stress threshold for cracking as a function of Ni composition and microstructure.

Current development in selected stress and thermal analysis software interfaces with PRO-ENGINEER

Schulze, J.

Ever since PRO-ENGINEER became a dominant CAD package available to the public, some of us have been saying, "Gee, if only I could export my geometry to a stress analysis program without having to recreate any of the details already created, wouldn't that be spectacular?" Well, much to the credit of the major stress and thermal analysis software vendors, some of them have been listening to design engineers like me badger them to furnish a seamless interface between PRO-ENGINEER and their stress analysis programs. The down side is that a lot of problems still exist with most of the vendors and their interfaces. I want to discuss the interfaces that I feel are currently "state of the art," how they are developing, and the prospects for finally arriving at a transparent procedure that an engineer at a workstation can utilize in his or her design process. In years past, engineers would develop a design, and changes would evolve based on intuition or somebody else's critical evaluation. Then the design would be forwarded to the production group, or the stress analysis group, for further evaluation and analysis. Maybe data from a preliminary prototype would be collected and an evaluation report made. All of this took time and increased the cost of the item to be manufactured. Today, the engineer must assume responsibility for design and functional capability early in the design process, if for no other reason than the costs associated with diverse channels of critiquing. For that reason, one place to enhance the design process is the ability to do preliminary stress and thermal analysis during the initial design phase. This is both cost and time effective. But, as I am sure you are aware, this has been easier said than done.

A visualization environment for supercomputing-based applications in computational mechanics

Pavlakos, Constantine

In this paper, we characterize a visualization environment that has been designed and prototyped for a large community of scientists and engineers, with an emphasis on supercomputing-based computational mechanics. The proposed environment makes use of a visualization server concept to provide effective, interactive visualization to the user's desktop. Benefits of using the visualization server approach are discussed. Some thoughts regarding desirable features for visualization server hardware architectures are also addressed. A brief discussion of the software environment is included. The paper concludes by summarizing certain observations we have made regarding the implementation of such visualization environments.

A simplified model of aerosol removal by containment sprays

Powers, Dana A.

Spray systems in nuclear reactor containments are described. The scrubbing of aerosols from containment atmospheres by spray droplets is discussed. Uncertainties are identified in the prediction of spray performance when the sprays are used as a means for decontaminating containment atmospheres. A mechanistic model based on current knowledge of the physical phenomena involved in spray performance is developed. With this model, a quantitative uncertainty analysis of spray performance is conducted using a Monte Carlo method to sample 20 uncertain quantities related to phenomena of spray droplet behavior as well as the initial and boundary conditions expected to be associated with severe reactor accidents. Results of the uncertainty analysis are used to construct simplified expressions for spray decontamination coefficients. Two variables that affect aerosol capture by water droplets are not treated as uncertain: (1) Q, the spray water flux into the containment, and (2) H, the total fall distance of spray droplets. The choice of values for these variables is left to the user, since they are plant and accident specific; also, they can usually be ascertained with some degree of certainty. The spray decontamination coefficients are found to be sufficiently dependent on the extent of decontamination that the fraction of the initial aerosol remaining in the atmosphere, m_f, is explicitly treated in the simplified expressions. The simplified expressions for the spray decontamination coefficient are given. Parametric values for these expressions are found for the median, 10th percentile, and 90th percentile values of the uncertainty distribution for the spray decontamination coefficient. Examples are given to illustrate the utility of the simplified expressions in predicting spray decontamination of an aerosol-laden atmosphere.
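
The analysis pattern described here, sample the uncertain inputs, run the model, and report the median and 10th/90th percentile outputs, is straightforward to sketch. The two-parameter toy model below merely stands in for the actual spray-physics model, which the abstract does not give; the distributions and the formula are hypothetical.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

// Generic Monte Carlo uncertainty sketch in the spirit of the analysis
// described above. The real study sampled ~20 uncertain quantities; this
// stand-in samples 2 from log-uniform priors and reports percentiles.
int main() {
    std::mt19937 rng(12345);
    std::uniform_real_distribution<double> log_a(std::log(0.1), std::log(10.0));
    std::uniform_real_distribution<double> log_b(std::log(1.0), std::log(100.0));

    const int n = 10000;
    std::vector<double> out(n);
    for (int i = 0; i < n; ++i) {
        double a = std::exp(log_a(rng));
        double b = std::exp(log_b(rng));
        out[i] = a * std::sqrt(b);  // placeholder model, not the spray model
    }
    std::sort(out.begin(), out.end());
    std::printf("10th pct: %.3g   median: %.3g   90th pct: %.3g\n",
                out[n / 10], out[n / 2], out[9 * n / 10]);
}
```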

Nuclear weapon reliability evaluation methodology

Wright, D.L.

This document provides an overview of the activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

Bench-scale screening tests for a boiling sodium-potassium alloy solar receiver

Moreno, James B.

Bench-scale tests were carried out in support of the design of a second-generation 75-kWt reflux pool-boiler solar receiver. The receiver will be made from Haynes Alloy 230 and will contain the sodium-potassium alloy NaK-78. The bench-scale tests used quartz-lamp-heated boilers to screen candidate boiling-stabilization materials and methods at temperatures up to 750 °C. Candidates that provided stable boiling were tested for hot-restart behavior. Poor stability was obtained with single 1/4-inch diameter patches of powdered metal hot-press-sintered onto the wetted side of the heat-input area. Laser-drilled and electric-discharge-machined cavities in the heated surface also performed poorly. Small additions of xenon and heated-surface tilt out of the vertical dramatically improved poor boiling stability; additions of helium or oxygen did not. The most stable boiling was obtained when the entire heat-input area was covered by a powdered-metal coating. The effect of heated-area size was assessed for one coating: at low incident fluxes, when even this coating performed poorly, increasing the heated-area size markedly improved boiling stability. Good hot-restart behavior was not observed with any candidate, although results were significantly better with added xenon in a boiler shortened from 3 to 2 feet. In addition to the screening tests, flash-radiography imaging of metal-vapor bubbles during boiling was attempted. Contrary to the Cole-Rohsenow correlation, these bubble-size estimates did not vary with pressure; instead they were constant, consistent with the only other alkali-metal measurements, but about half their size.

Experimental investigation of pressure and blockage effects on combustion limits in H₂-air-steam mixtures

Sherman, M.P.

Experiments with hydrogen-air-steam mixtures, such as those found within a containment system following a reactor accident, were conducted in the Heated Detonation Tube (43 cm diameter and 12 m long) to determine the region of benign combustion, i.e., the region between the flammability limits and the deflagration-to-detonation transition limits. Obstacles were used to accelerate the flame; these included 30% blockage ratio annular rings, and alternating rings and disks of 60% blockage ratio. The initial conditions were 110 °C and one or three atmospheres pressure. A benign burning region exists for rich mixtures, but it is generally smaller than for lean mixtures. Effects of the different obstacles and of the different pressures are discussed.

Downsizing a database platform for increased performance and decreased costs

Miller, M.M.; Tolendino, L.F.

Technological advances in the world of microcomputers have brought forth affordable systems and powerful software that can compete with the more traditional world of minicomputers. This paper describes an effort at Sandia National Laboratories to decrease operational and maintenance costs and increase performance by moving a database system from a minicomputer to a microcomputer.

A methodology for the evaluation of the turbine jet engine fragment threat to generic air transportable containers

Harding, David C.

Uncontained, high-energy gas turbine engine fragments are a potential threat to air-transportable containers carried aboard jet aircraft. The threat to a generic example container is evaluated by probability analyses and penetration testing to demonstrate the methodology to be used in the evaluation of a specific container/aircraft/engine combination. Fragment/container impact probability is the product of the uncontained fragment release rate and the geometric probability that a container is in the path of this fragment. The probability of a high-energy rotor burst fragment from four generic aircraft engines striking one of the containment vessels aboard a transport aircraft is approximately 1.2 × 10⁻⁹ strikes/hour. Finite element penetration analyses and tests can be performed to identify specific fragments which have the potential to penetrate a generic or specific containment vessel. The relatively low probability of engine fragment/container impacts is primarily due to the low release rate of uncontained, hazardous jet engine fragments.

Stockpile Transition Enabling Program (STEP): Process and project requirements

Ma, Kwok-Kee

The Stockpile Transition Enabling Program (STEP) is aimed at identifying weapon components suitable for use in more than one weapon and at qualifying the components so identified for multiple use. Work includes identifying the means to maintain the manufacturing capability for these items. This document provides the participants in STEP a common, consistent understanding of the process and requirements. The STEP objectives are presented and the activities are outlined. The STEP project selections are based on customer needs, product applicability, and the maturity of the technology used. A formal project selection process is described and the selection criteria are defined. The concept of "production readiness" is introduced, along with a summary of the project requirements and deliverables to demonstrate production readiness.

MELCOR 1.8.1 calculations of ISP31: The CORA-13 experiment

Gross, Robert J.

The MELCOR code was used to simulate one of GRS's (a reactor research group in Germany) core degradation experiments conducted in the CORA out-of-pile test facility. This test, designated CORA-13, was selected as one of the International Standard Problems, Number ISP31, by the Organization for Economic Cooperation and Development. In this blind calculation, only initial and boundary conditions were provided. The experiment consisted of a small core bundle of twenty-five PWR fuel elements that was electrically heated to temperatures greater than 2,800 K. The experiment comprised three phases: a 3,000-second gas preheat phase, an 1,870-second transient phase, and a 180-second water quench phase. MELCOR predictions are compared both to the experimental data and to eight other ISP31 submittals. Temperatures of various components, energy balance, zircaloy oxidation, and core blockage are examined. Up to the point where oxidation was significant, MELCOR temperatures agreed very well with the experiment, usually to within 50 K. MELCOR predicted oxidation to occur about 100 seconds earlier and at a faster rate than the experimental data. The large oxidation spike that occurred during quench was not predicted. However, the experiment produced 210 grams of hydrogen, while MELCOR predicted 184 grams, which was one of the closest integral predictions among the nine submittals. Core blockage was of the right magnitude; however, material collected on the lower grid spacer in the experiment at an axial location of 450 mm, while in MELCOR the material collected at the 50 to 150 mm location. In general, compared to the other submittals, the MELCOR calculation was superior.

Characterization of the Facility for Atmospheric Corrosion Testing (FACT) at Sandia

Greenholt, Charles J.

The capability to perform atmospheric corrosion testing of materials and components now exists at Sandia resulting from the installation of a system called the Facility for Atmospheric Corrosion Testing (FACT). This report details the design, equipment, operation, maintenance, and future modifications of the system. This report also presents some representative data acquired from testing copper in environments generated by the FACT.

The unique signal concept for detonation safety in nuclear weapons

Hoover, Marcey L.

The purpose of a unique signal (UQS) in a nuclear weapon system is to provide an unambiguous communication of intent to detonate from the UQS information input source device to a stronglink safety device in the weapon in a manner that is highly unlikely to be duplicated or simulated in normal environments and in a broad range of ill-defined abnormal environments. This report presents safety considerations for the design and implementation of UQSs in the context of the overall safety system.

Preliminary performance assessment of the Greater Confinement Disposal facility at the Nevada Test Site. Volume 3: Supporting details

Price, Laura L.

The Department of Energy's Nevada Operations Office (DOE/NV) has disposed of a small quantity of transuranic waste at the Greater Confinement Disposal facility in Area 5 of the Nevada Test Site. In 1989, DOE/NV contracted with Sandia National Laboratories to perform a preliminary performance assessment of this disposal facility. This preliminary performance assessment consisted of analyses designed to assess the likelihood of complying with Environmental Protection Agency standards for the disposal of transuranic waste, high-level waste, and spent fuel. The preliminary nature of this study meant that no other regulatory standards were considered and the analyses were conducted with specific limitations. The procedure for the preliminary performance assessment consisted of (1) collecting information about the site, (2) developing models based on this information, (3) implementing these models in computer codes, (4) performing the analyses using the computer codes, and (5) performing sensitivity analyses to determine the more important variables. Based on the results of the analyses, it appears that the Greater Confinement Disposal facility will most likely comply with the Environmental Protection Agency's standards for the disposal of transuranic waste. The results of the sensitivity analyses are being used to guide site characterization activities related to the next iteration of performance assessment analyses for the Greater Confinement Disposal facility.

Insights into the behavior of nuclear power plant containments during severe accidents

Ludwigsen, John S.

The containment building surrounding a nuclear reactor offers the last barrier to the release of radioactive materials from a severe accident into the environment. The loading environment of the containment under severe accident conditions may include pressures and temperatures much greater than design values. Investigations into the performance of containments subjected to ultimate (failure) pressure and temperature conditions have been conducted over the last several years through a program administered by the Nuclear Regulatory Commission (NRC). These NRC-sponsored investigations are discussed here. Reviewed are the results of large-scale experiments on reinforced concrete, prestressed concrete, and steel containment models pressurized to failure. In conjunction with these major tests, separate-effects testing has been performed on many of the critical containment components, that is, aged and unaged seals, a personnel air lock, and electrical penetration assemblies subjected to elevated temperature and pressure. An objective of the NRC program is to gain an understanding of the behavior of typical existing and planned containment designs subjected to postulated severe accident conditions. This understanding has led to the development of experimentally verified analytical tools that can be applied to accurately predict ultimate containment capacities, which is useful in developing severe accident mitigation schemes. Finally, speculation on the response of containments subjected to severe accident conditions is presented.

Object-oriented DFD models to present the functional and behavioral views

Maxted, A.

An object-oriented methodology is presented that is based on two sets of Data Flow Diagrams (DFDs): one for the functional view, and one for the behavioral view. The functional view presents the information flow between shared objects. These objects map to the classes identified in the structural view (e.g., the Information Model). The behavioral view presents the flow of information between control components and relates these components to their state models. Components appearing in multiple views provide a bridge between the views. The top-down hierarchical nature of the DFDs provides a needed overview, or road map, through the software system.

A proposal for reverse engineering CASE tools to support new software development

Maxted, A.

Current CASE technology provides sophisticated diagramming tools to generate a software design. The design, stored internal to the CASE tool, is bridged to the code via code generators. There are several limitations to this technique: (1) the portability of the design is limited to the portability of the CASE tools, and (2) the code generators offer a clumsy link between design and code. The CASE tool, though valuable during design, becomes a hindrance during implementation. Frustration frequently causes the CASE tool to be abandoned during implementation, permanently severing the link between design and code. Current CASE tools store the design in a CASE-internal structure, from which code is generated. The technique presented herein suggests that CASE tools store the system knowledge directly in code. The CASE support then switches from an emphasis on code generators to employing state-of-the-art reverse engineering techniques for document generation. Graphical and textual descriptions of each software component (e.g., an Ada package) may be generated via reverse engineering techniques from the code. These reverse-engineered descriptions can be merged with system overview diagrams to form a top-level design document. The resulting document can readily reflect changes to the software components by automatically generating new component descriptions for the changed components. The proposed auto-documentation technique facilitates the document upgrade task at later stages of development (e.g., design, implementation, and delivery) by using the component code as the source of the component descriptions. The CASE technique presented herein is a unique application of reverse engineering techniques to new software systems. This technique contrasts with more traditional CASE auto code generation techniques.

Optical diagnostic instrument for monitoring etch uniformity during plasma etching of polysilicon in a chlorine-helium plasma

Hareland, W.A.

Nonuniform etching is a serious problem in plasma processing of semiconductor materials and has important consequences for the quality and yield of microelectronic components. In many plasmas, etching occurs at a faster rate near the periphery of the wafer, resulting in nonuniform removal of specific materials over the wafer surface. The purpose of this research was to investigate in situ optical diagnostic techniques for monitoring etch uniformity during plasma processing of microelectronic components. We measured 2-D images of atomic chlorine emission at 726 nm in a chlorine-helium plasma during plasma etching of polysilicon in a parallel-plate plasma etching reactor. The 3-D distribution of atomic chlorine was determined by Abel inversion of the plasma image. The experimental results showed that the chlorine atomic emission intensity is at a maximum near the outer radius of the plasma and decreases toward the center. Likewise, the actual etch rate, as determined by profilometry on the processed wafer, was approximately 20% greater near the edge of the wafer than at its center. There was a direct correlation between the atomic chlorine emission intensity and the etch rate of polysilicon over the wafer surface. Based on these analyses, 3-D imaging would be a useful diagnostic technique for in situ monitoring of etch uniformity on wafers.
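
Abel inversion recovers the radial profile f(r) of an axisymmetric source from its line-of-sight projection. A common discrete form is onion peeling: treat the plasma as concentric uniform shells, so each chord measurement is a triangular linear combination of shell emissivities, and solve from the outside in. A minimal sketch, not the authors' implementation:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Onion-peeling Abel inversion: recover shell emissivities f[j] from
// line-integrated intensities I[i] at chord heights y_i = i*dr.
// Assumes axisymmetry and piecewise-constant emissivity per shell.
std::vector<double> abel_onion_peel(const std::vector<double>& I, double dr) {
    const int n = static_cast<int>(I.size());
    auto chord = [dr](int i, int j) {  // path length of chord i through shell j
        double y = i * dr, r0 = j * dr, r1 = (j + 1) * dr;
        double a = (r1 > y) ? std::sqrt(r1 * r1 - y * y) : 0.0;
        double b = (r0 > y) ? std::sqrt(r0 * r0 - y * y) : 0.0;
        return 2.0 * (a - b);
    };
    std::vector<double> f(n, 0.0);
    for (int i = n - 1; i >= 0; --i) {  // peel from the outermost shell inward
        double s = I[i];
        for (int j = i + 1; j < n; ++j) s -= chord(i, j) * f[j];
        f[i] = s / chord(i, i);
    }
    return f;
}

int main() {
    // Synthetic test: forward-project a parabolic profile, then invert it.
    const int n = 50; const double dr = 1.0 / n;
    std::vector<double> f_true(n), I(n, 0.0);
    for (int j = 0; j < n; ++j) { double r = (j + 0.5) * dr; f_true[j] = 1.0 - r * r; }
    for (int i = 0; i < n; ++i)
        for (int j = i; j < n; ++j) {
            double y = i * dr, r0 = j * dr, r1 = (j + 1) * dr;
            double a = (r1 > y) ? std::sqrt(r1 * r1 - y * y) : 0.0;
            double b = (r0 > y) ? std::sqrt(r0 * r0 - y * y) : 0.0;
            I[i] += 2.0 * (a - b) * f_true[j];
        }
    std::vector<double> f = abel_onion_peel(I, dr);
    std::printf("center shell: true %.4f, recovered %.4f\n", f_true[0], f[0]);
}
```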

Surface acoustic wave sensing of VOCs in harsh chemical environments

Pfeifer, Kent B.

The measurement of VOC concentrations in harsh chemical and physical environments is a formidable task. A surface acoustic wave (SAW) sensor has been designed for this purpose, and its construction and testing are described in this paper. Included is a detailed description of the design elements specific to operation in 300 °C steam and HCl environments, including temperature control, gas handling, and signal processing component descriptions. In addition, laboratory temperature stability was studied and a minimum detection limit was defined for operation in industrial environments. Finally, a description of field tests performed on steam reforming equipment at Synthetica Technologies Inc. of Richmond, CA is given, including a report on the destruction efficiency of CCl₄ in the Synthetica moving-bed evaporator. Design improvements based on the field tests are proposed.

MELCOR 1.8.1 assessment: PNL Ice Condenser Aerosol Experiments

Gross, Robert J.

The MELCOR code was used to simulate PNL's Ice Condenser Experiments 11-6 and 16-11. In these experiments, ZnS was injected into a mixing chamber, and the combined steam/air/aerosol mixture flowed into an ice condenser which was 14.7 m tall. Experiment 11-6 was a low-flow test; Experiment 16-11 was a high-flow test. Temperatures in the ice condenser region and particle retention were measured in these tests. MELCOR predictions compared very well to the experimental data. The MELCOR calculations were also compared to CONTAIN code calculations for the same tests. A number of sensitivity studies were performed. It was found that the simulation time step, aerosol parameters such as the number of MAEROS components and sections used and the particle density, and ice condenser parameters such as the energy capacity of the ice, the ice heat transfer coefficient multiplier, and the ice heat structure characteristic length could all affect the results. Thermal/hydraulic parameters such as control volume equilibrium assumptions, flow loss coefficients, and the bubble rise model were found to affect the results less significantly. MELCOR results were not machine dependent for this problem.

Proposal for a numerical array library (Revised)

Budge, Kent G.

One of the most widely recognized inadequacies of C is its low-level treatment of arrays. Arrays are not first-class objects in C; an array name in an expression almost always decays into a pointer to the underlying type. This is unfortunate, especially since an increasing number of high-performance computers are optimized for calculations involving arrays of numbers. On such machines, double[] may be regarded as an intrinsic data type comparable to double or int and quite distinct from double*. This weakness of C is acknowledged in the ARM, where it is suggested that the inadequacies of the C array can be overcome in C++ by wrapping it in a class that supplies dynamic memory management, bounds checking, operator syntax, and other useful features. Such "smart arrays" can in fact supply the same functionality as the first-class arrays found in other high-level, general-purpose programming languages. Unfortunately, they are expensive in both time and memory and make poor use of advanced floating-point architectures. Is there a better solution? The most obvious solution is to make arrays first-class objects and add the functionality mentioned in the previous paragraph. However, this would destroy C compatibility and significantly alter the C++ language. Major conflicts with existing practice would seem inevitable. I propose instead that numerical array classes be adopted as part of the C++ standard library. These classes will have the functionality appropriate for the intrinsic arrays found on most high-performance computers, and the compilers written for these computers will be free to implement them as built-in classes. On other platforms, these classes may be defined normally, and will provide users with basic array functionality without imposing an excessive burden on the implementor.
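
The kind of interface this proposal describes is close to what eventually entered the standard library as std::valarray: element-wise operators, reductions over array expressions, and semantics that leave implementations free to optimize aggressively. A brief usage sketch:

```cpp
#include <cstdio>
#include <valarray>

// std::valarray: the standard library's numeric array class, close in
// spirit to the proposal above.
int main() {
    std::valarray<double> x = {1.0, 2.0, 3.0, 4.0};
    std::valarray<double> y = {4.0, 3.0, 2.0, 1.0};

    std::valarray<double> z = 2.0 * x + y;    // element-wise, array-at-a-time
    double dot = (x * y).sum();               // reduction over an array expression

    for (double v : z) std::printf("%g ", v); // prints: 6 7 8 9
    std::printf("\ndot = %g\n", dot);         // prints: dot = 20
}
```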

Preliminary Nuclear Safety Assessment of the NEPST (Topaz II) Space Reactor Program

Marshall, Albert C.

The United States (US) Strategic Defense Initiative Organization (SDIO) decided to investigate the possibility of launching a Russian Topaz II space nuclear power system. A preliminary nuclear safety assessment was conducted to determine whether or not a space mission could be conducted safely and within budget constraints. As part of this assessment, a safety policy and safety functional requirements were developed to guide both the safety assessment and future Topaz II activities. A review of the Russian flight safety program was conducted and documented. Our preliminary nuclear safety assessment included a number of deterministic analyses, such as: neutronic analysis of normal and accident configurations, an evaluation of temperature coefficients of reactivity, a reentry and disposal analysis, an analysis of postulated launch abort impact accidents, and an analysis of postulated propellant fire and explosion accidents. Based on the assessment to date, it appears that it will be possible to safely launch the Topaz II system in the US with a modification to preclude water-flooded criticality. A full-scale safety program is now underway.

Modeling of the second stage of the STAR 1.125 inch two-stage gas gun

Longcope, Donald B.

The second stage of the Shock Technology and Applied Research (STAR) facility two-stage light gas gun at Sandia National Laboratories has been modeled to better assess its safety during operation and to determine the significance of various parameters to its performance. The piston motion and loading of the acceleration reservoir (AR), the structural response of AR, and the projectile motion are determined. The piston is represented as an incompressible fluid while the AR is modeled with the ABAQUS finite element structural analysis code. Model results are compared with a measured profile of AR diameter growth for a test at maximum conditions and with projectile exit velocities for a group of tests. Changes in the piston density and in the break diaphragm opening pressure are shown to significantly affect the AR loading and the projectile final velocity.

Message passing in PUMA

Wheat, S.R.

This paper provides an overview of the message passing primitives provided by PUMA (Performance-oriented, User-managed Messaging Architecture). Message passing in PUMA is based on the concept of a portal--an opening in the address space of an application process. Once an application process has established a portal, other processes can write values into the memory associated with the portal using a simple send operation. Because messages are written directly into the address space of the receiving process, there is no need to buffer messages in the PUMA kernel. This simplifies the design of the kernel, increasing its reliability and portability. Moreover, because messages are mapped directly into the address space of the application process, the application can manage the messages that it receives without needing direct support from the kernel.

A massively parallel adaptive finite element method with dynamic load balancing

Wheat, S.R.

We construct massively parallel, adaptive finite element methods for the solution of hyperbolic conservation laws in one and two dimensions. Spatial discretization is performed by a discontinuous Galerkin finite element method using a basis of piecewise Legendre polynomials. Temporal discretization utilizes a Runge-Kutta method. Dissipative fluxes and projection limiting prevent oscillations near solution discontinuities. The resulting method is of high order and may be parallelized efficiently on MIMD computers. We demonstrate parallel efficiency through computations on a 1024-processor nCUBE/2 hypercube. We also present results using adaptive p-refinement to reduce the computational cost of the method. We describe tiling, a dynamic, element-based data migration system. Tiling dynamically maintains global load balance in the adaptive method by overlapping neighborhoods of processors, where each neighborhood performs local load balancing. We demonstrate the effectiveness of the dynamic load balancing with adaptive p-refinement examples.
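
The piecewise Legendre basis mentioned above is built from the standard three-term recurrence (n+1) P_{n+1}(x) = (2n+1) x P_n(x) - n P_{n-1}(x); a small evaluation routine of the kind any such DG code needs:

```cpp
#include <cstdio>
#include <vector>

// Evaluate Legendre polynomials P_0..P_N at x in [-1, 1] via the
// three-term recurrence. These are the modal basis functions on each
// element in a discontinuous Galerkin method of the type described above.
std::vector<double> legendre(int N, double x) {
    std::vector<double> P(N + 1);
    P[0] = 1.0;
    if (N >= 1) P[1] = x;
    for (int n = 1; n < N; ++n)
        P[n + 1] = ((2 * n + 1) * x * P[n] - n * P[n - 1]) / (n + 1);
    return P;
}

int main() {
    for (double x : {-1.0, 0.0, 0.5, 1.0}) {
        std::printf("x = %4.1f :", x);
        for (double p : legendre(4, x)) std::printf(" %8.4f", p);
        std::printf("\n");
    }
}
```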

Design and testing of planar magnetic micromotors fabricated by deep x-ray lithography and electroplating

Karnowsky, M.

The successful design and testing of a three-phase planar integrated magnetic micromotor is presented. Fabrication is based on a modified deep X-ray lithography and electroplating, or LIGA, process. Maximum rotational speeds of 33,000 rpm are obtained in air with a rotor diameter of 285 μm and do not change when the motor is operated in vacuum. Real-time rotor response is obtained with an integrated shaft encoder. Long lifetime is evidenced by testing to over 5 × 10⁷ rotation cycles without changes in performance. Projected speeds of the present motor configuration are in the vicinity of 100 krpm and are limited by torque ripple. Higher speeds, which are attractive for sensor applications, require constant-torque characteristic excitation, as is evidenced by ultracentrifuge and gyroscope design. Further understanding of electroplated magnetic material properties will drive these performance improvements.

Visualization for applications in shock physics

Pavlakos, Constantine

This case study presents work being done to provide visualization capabilities for a family of codes at Sandia in the area of shock physics. The codes, CTH and Parallel CTH, are running in traditional supercomputing as well as massively parallel environments. These are Eulerian codes which produce data on structured grids. Data sets can be large, so managing large data is a priority. A supercomputing-based distributed visualization environment has been implemented to support such applications. This environment, which is based in New Mexico, is also accessible from our branch site in California via a long-haul FDDI/ATM link. Functionality includes the ability to track ongoing simulations. A custom visualization file format has been developed to provide efficient, interactive access to result data. Visualization capabilities are based on the commercially available AVS software. A few example results are presented, along with a brief discussion of future work.

Comparison of analytic Whipple bumper shield ballistic limits with CTH simulations

Hertel, Eugene S.

A series of CTH simulations was conducted to assess the feasibility of using the hydrodynamic code to model debris cloud formation and to predict any damage due to the subsequent loading on rear structures. Six axisymmetric simulations and one 3-dimensional simulation were conducted for spherical projectiles impacting Whipple bumper shields. The projectile diameters were chosen to correlate with two well-known analytic expressions for the ballistic limit of a Whipple bumper shield. It has been demonstrated that CTH can be used to simulate the debris cloud formation, the propagation of the debris across a void region, and the secondary impact of the debris against a structure. In addition, the results from the CTH simulations were compared to the analytic estimates of the ballistic limit. At impact velocities of 10 km/s or less, the CTH-predicted ballistic limit lies between the two analytic estimates. However, for impact velocities greater than 10 km/s, CTH simulations predicted a ballistic limit larger than both analytic estimates. The differences at high velocities are not well understood. Structural failure at late times due to the time-integrated loading of a very diffuse debris cloud has not been considered in the CTH model. In addition, the analytic predictions are extrapolated from relatively low velocity data, and the extrapolation technique may not be valid. The discrepancy between the two techniques should be investigated further.

A multiphase model for shock-induced flow in low density foam

Baer, M.R.

A multiphase mixture model is applied to describe shock-induced flow in deformable low-density foam. This model includes interphase drag and heat transfer, and all phases are treated as compressible. Volume fraction is represented as an independent kinematic variable, and the entropy inequality suggests a thermodynamically admissible evolutionary equation to describe rate-dependent compaction. This multiphase model has been applied to shock tube experiments conducted by B. W. Skews and colleagues in the study of normal shock impingement on a wall-supported low-density porous layer. Numerical solution of the multiphase flow equations employs a high-resolution adaptive finite element method which accurately resolves contact surfaces and shock interactions. Additional studies are presented in an investigation of the effect of initial gas pressure in the foam layer, the shock interaction with multiple layers of foam, and the shock-induced flow in an unsupported foam layer.

Quality and ES&H Self-Appraisal Program at the Center for Applied Physics, Engineering and Testing

Zawadzkas, Gerald A.

This report describes the Quality and ES&H Self-Appraisal Program at the Center for Applied Physics, Engineering and Testing, 9300, and explains how the program promotes good "Conduct of Operations" throughout the center and helps line managers improve efficiency and maintain a safe work environment. The program provides a means to identify and remove hazards and to ensure workers are following correct and safe procedures; but, most importantly, 9300's Self-Appraisal Program uses DOE's "Conduct of Operations" and "Quality Assurance" guidelines to evaluate managers' policies and decisions. The idea is to draw attention to areas for improvement in ES&H while focusing on how well the organization's processes and programs are doing. A copy of the Administrative Procedure which establishes and defines the program, as well as samples of a Self-Appraisal Report and a Manager's Response to the Self-Appraisal Report, are provided as appendixes.

Center of trace algorithms for extracting digitized waveforms from two-dimensional images

Lee, J.W.

A class of recording instruments records high-frequency signals as a two-dimensional image rather than converting the analog signal directly to digital output. This report explores the task of reducing the two-dimensional trace to a uniformly sampled waveform that best represents the signal characteristics. Many recorders provide algorithms for locating the center of trace. The author discusses these algorithms and alternative algorithms, comparing their effectiveness.
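
The simplest member of this family is the intensity-weighted centroid: for each image column, take the first moment of pixel intensity as the trace center, reducing the two-dimensional trace to one sample per column. A minimal sketch of that variant (illustrative, not the report's code):

```cpp
#include <cstdio>
#include <vector>

// Reduce a 2-D trace image to a waveform: for each column, take the
// intensity-weighted centroid of the pixel values as the trace center.
// Columns with no signal above threshold yield -1 so gaps are detectable.
std::vector<double> center_of_trace(const std::vector<std::vector<double>>& img,
                                    double threshold = 0.0) {
    const std::size_t rows = img.size();
    const std::size_t cols = rows ? img[0].size() : 0;
    std::vector<double> trace(cols, -1.0);
    for (std::size_t c = 0; c < cols; ++c) {
        double weight = 0.0, moment = 0.0;
        for (std::size_t r = 0; r < rows; ++r) {
            double v = img[r][c];
            if (v > threshold) { weight += v; moment += v * static_cast<double>(r); }
        }
        if (weight > 0.0) trace[c] = moment / weight; // row coordinate of center
    }
    return trace;
}

int main() {
    std::vector<std::vector<double>> img = {{0, 1, 0}, {2, 3, 0}, {0, 1, 4}};
    for (double t : center_of_trace(img)) std::printf("%.2f ", t); // 1.00 1.00 2.00
    std::printf("\n");
}
```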
