Publications

Results 94401–94425 of 96,771

The National Center for Advanced Information Components Manufacturing: Program update

Jorgensen, J.L.

The National Center for Advanced Information Components Manufacturing (NCAICM) was established by congressional appropriation in the FY93 Defense Appropriation Bill. The Center, located at Sandia National Laboratories in Albuquerque, NM, is funded through the Advanced Research Projects Agency (ARPA). The technical focus of NCAICM is emissive flat panel displays and associated microelectronics, specifically targeting manufacturing issues such as materials, processes, equipment, and software tools. This Center is a new avenue of collaboration between ARPA and the Department of Energy (DOE). It will help the government meet its obligation to develop dual-use capabilities for the defense and civilian sectors of the economy and provide a new method for cooperation and collaboration between the federal government and American industry. In particular, one of NCAICM's goals is to provide industry access to the broad resource base available at three DOE Defense Programs laboratories -- Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory.

Merging photovoltaic hardware development with hybrid applications in the USA

Bower, Ward I.

The use of multi-source power systems, or "hybrids," is one of the fastest growing, potentially significant markets for photovoltaic (PV) system technology today. Cost-effective applications include remote facility power, remote area power supplies, remote home and village power, and power for dedicated electrical loads such as communications systems. This market sector is anticipated to be one of the most important growth opportunities for PV over the next five years. The US Department of Energy (USDOE) and Sandia National Laboratories (SNL) are currently engaged in an effort to accelerate the adoption of market-driven PV hybrid power systems and to effectively integrate PV with other energy sources. This paper provides details of this development effort and of ongoing hybrid activities in the United States; hybrid systems are its primary focus.

Chemometric analysis of IR external reflection spectra for quantitative determination of BPSG thin films

Haaland, David M.

Infrared (IR) reflection spectroscopy has been shown to be useful for making rapid and nondestructive quantitative determinations of B and P contents and film thickness for borophosphosilicate glass (BPSG) thin films on silicon monitor wafers. Preliminary data also show that similarly precise determinations can be made for BPSG films on device wafers.
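
As a toy illustration of the calibration-then-prediction pattern behind such determinations, the sketch below estimates pure-component spectra from simulated calibration wafers and then inverts a new spectrum for its analyte contents. All numbers are synthetic, and ordinary least squares stands in here for the chemometric methods (such as PLS) actually used in this line of work.

```python
# Classical least-squares calibration sketch (illustrative data only).
import numpy as np

rng = np.random.default_rng(0)
pure = rng.uniform(0.1, 1.0, size=(2, 12))         # "true" pure spectra: 2 analytes, 12 bands
conc_cal = rng.uniform(0.5, 5.0, size=(20, 2))     # known B, P contents on 20 wafers
spectra_cal = conc_cal @ pure + 0.005 * rng.normal(size=(20, 12))  # measured spectra

# Step 1: estimate the pure-component spectra from the calibration set.
K, *_ = np.linalg.lstsq(conc_cal, spectra_cal, rcond=None)   # 2 x 12

# Step 2: determine unknown contents from a newly measured spectrum.
spectrum_new = np.array([2.0, 3.0]) @ pure                   # wafer with 2.0 / 3.0 units
conc_new, *_ = np.linalg.lstsq(K.T, spectrum_new, rcond=None)
```

Both least-squares steps are overdetermined, which is what makes the determination rapid and nondestructive: one reflection spectrum suffices per wafer once the calibration is in hand.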

Chemical vapor deposition and characterization of tungsten boron alloy films

Smith, Patrick A.

A low pressure chemical vapor deposition (LPCVD) process for depositing W{sub X}B{sub (1-X)} films from WF{sub 6} and B{sub 2}H{sub 6} is described. The depositions were performed in a cold wall reactor on 6 in. Si wafers at 400 °C. During deposition, pressure was maintained at a fixed level in the range of 200 to 260 mTorr. The WF{sub 6}/B{sub 2}H{sub 6} ratio was varied from 0.05 to 1.07. The carrier gas was either 100 sccm of Ar with an overall gas flow of 308 to 591 sccm, or 2000 sccm of Ar and 2000 sccm of H{sub 2} with an overall gas flow of 4213 to 4452 sccm. Two stable deposition regions were found, separated by an unstable region that produced non-uniform films. The B-rich films produced in one of the stable deposition regions had W concentrations of 30 at.% and resistivities between 200 and 300 {mu}ohm{center_dot}cm. The W-rich films produced in the other stable deposition region had W concentrations of 80 at.% and resistivities of 100 {mu}ohm{center_dot}cm. As-deposited films had densities similar to bulk material of similar stoichiometry. Barrier properties of the films against diffusion of Cu to 700 °C in vacuum were measured by 4-point probe. Annealing was also carried out to 900 °C in order to determine the phases formed as the films crystallize. These studies indicate that W{sub X}B{sub (1-X)} films may be useful barriers in ULSI metallization applications.

A general purpose I{sub DDQ} measurement circuit

Righter, A.W.

A relatively high-speed I{sub DDQ} measurement circuit called QuiC-Mon is described. Depending upon IC settling times, upper measurement rates range from 50 kHz to 250 kHz at 100 nA resolution. It provides an inexpensive solution for fast, sensitive I{sub DDQ} measurements in CMOS IC wafer probe or packaged part production testing.

Primary Standards Laboratory report 1st half 1993

Levy, W.G.T.

Sandia National Laboratories operates the Primary Standards Laboratory for the Department of Energy, Albuquerque Operations Office (DOE/AL). This report summarizes metrology activities that received emphasis in the first half of 1993 and provides information pertinent to the operation of the DOE/AL system-wide Standards and Calibration Program.

Defense programs: A Sandia weapon review bulletin

Floyd, H.L.; Goetsch, B.; Doran, L.

Sandia's mission to explore technology that enhances US nuclear weapons capabilities has been the primary impetus for the development of a class of inertial measurement units not available commercially. The newest member of the family is the Ring Laser Gyro Assembly (RLGA). The product of a five-year joint effort by Sandia and Honeywell's Space and Strategic Systems Operation, the RLGA is a small, one-nautical-mile-per-hour-class inertial measurement unit that consumes only 16 watts - attributes that are important to a guidance and control capability for new or existing weapons. These same attributes led the Central Inertial Guidance Test Facility at Holloman Air Force Base to select the RLGA for their newest test instrumentation pod. The RLGA sensor assembly is composed of three Honeywell ring laser gyroscopes and three Sundstrand Data Control accelerometers that are selected from three types according to the user's acceleration range and accuracy needs.

Geotechnology for low permeability gas reservoirs; [Progress report], April 1, 1992--September 30, 1993

Northrop, David A.

The objectives of this program are (1) to use and refine a basinal analysis methodology for natural fracture exploration and exploitation, and (2) to determine the important characteristics of natural fracture systems for their use in completion, stimulation, and production operations. Continuing work on this project has demonstrated that natural fracture systems and their flow characteristics can be defined by a thorough study of well and outcrop data within a basin. Outcrop data provide key information on fracture sets and lithologic controls, but some fracture sets found in the outcrop may not exist at depth. Well log and core data provide the important reservoir information needed to obtain the correct synthesis of the fracture data. In situ stress information is then linked with the natural fracture studies to define permeability anisotropy and stimulation effectiveness. All of these elements require field data and, in the cases of logs, core, and well test data, the cooperation of an operator.

The phase diagrams and doped-hole segregation in La{sub 2}CuO{sub 4+{delta}} and La{sub 2-x}Sr{sub x}CuO{sub 4+{delta}} (x {le} 0.15, {delta} {le} 0.12)

Schirber, James E.

The magnetic and structural phase diagrams of the La{sub 2}CuO{sub 4+{delta}} system and the La{sub 2-x}Sr{sub x}CuO{sub 4+{delta}} system are reviewed, with emphasis on recent results obtained from magnetic and structural neutron diffraction, thermogravimetric analysis, iodometric titration, magnetic susceptibility {chi}(T), and {sup 139}La nuclear quadrupole resonance (NQR) measurements.

Microscopic study of local structure and charge distribution in metallic La{sub 2}CuO{sub 4+{delta}}

Schirber, James E.

The authors employ NMR and NQR spectroscopy as probes of local structure and charge environments in metallic La{sub 2}CuO{sub 4+{delta}} ({Tc} = 38 K). They discuss the effect of annealing the sample at various temperatures T{sub a} ({Tc} < T{sub a} < 300 K) on the superconducting {Tc}. The dependence of {Tc} on annealing indicates that annealing allows the development of structural order which is important for {Tc}. The {sup 139}La quadrupole frequency, {nu}{sub Q}, is smaller than in undoped materials. This is unexpected and may indicate a smaller charge on the apex oxygen in the doped material and thus a different distribution of charge between the La-O layer and the planes. The further, rapid decrease in {nu}{sub Q} just above {Tc} indicates that temperature-dependent charge redistribution is occurring. The presence of doped holes induces a distribution of displacements of the apex oxygen off of the vertical La-Cu bond axis. These vary from zero to the value observed in lightly doped (antiferromagnetic) La{sub 2}CuO{sub 4+{delta}}. These measurements demonstrate a striking degree of inhomogeneity in the crystal structure of the La-O layer. Copper NQR spectroscopy shows that there are two distinct copper sites in the CuO{sub 2} planes and thus that either the structure or the charge distribution in the planes is inhomogeneous as well. These inhomogeneities are the intrinsic response of the crystal to doped holes; they are not the result of distortions of the lattice due to the presence of interstitial oxygen atoms.

An assessment of testing requirement impacts on nuclear thermal propulsion ground test facility design

Shipers, Larry R.

Programs to develop solid core nuclear thermal propulsion (NTP) systems have been under way at the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), and the Department of Energy (DOE). These programs have recognized the need for a new ground test facility to support development of NTP systems. However, the different military and civilian applications have led to different ground test facility requirements. DOE, in its role as landlord and operator of the proposed research reactor test facilities, has initiated an effort to explore opportunities for a common ground test facility to meet both DoD and NASA needs. The baseline design and operating limits of the proposed DoD NTP ground test facility are described. The NASA ground test facility requirements are reviewed and their potential impact on the DoD facility baseline is discussed.

Communication on the Paragon

Mccurley, K.S.

In this note the authors describe the results of some tests of the message-passing performance of the Intel Paragon. These tests have been carried out under both the Intel-supplied OSF/1 operating system with an NX library, and also under an operating system called SUNMOS (Sandia UNM Operating System). For comparison with the previous generation of Intel machines, they have also included the results on the Intel Touchstone Delta. The source code used for these tests is identical for all systems. As a result of these tests, the authors can conclude that SUNMOS demonstrates that the Intel Paragon hardware is capable of very high bandwidth communication, and that the message coprocessor on Paragon nodes can be used to give quite respectable latencies. Further tuning can be expected to yield even better performance.
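
Latency and bandwidth figures like those described above typically come from the classic ping-pong test: bounce a message between two processes, time many round trips, and derive latency from small messages and bandwidth from large ones. The sketch below shows the method using a Python multiprocessing pipe as a stand-in for the Paragon's NX/SUNMOS message passing; the absolute figures it prints reflect the host machine, not the Paragon.

```python
# Ping-pong message-passing benchmark sketch (method illustration only).
import time
from multiprocessing import Process, Pipe

def echo(conn, reps):
    # Remote side: bounce every message straight back to the sender.
    for _ in range(reps):
        conn.send_bytes(conn.recv_bytes())

def ping_pong(msg_size, reps=200):
    """Return (one-way time in seconds, bandwidth in bytes/second)."""
    parent, child = Pipe()
    p = Process(target=echo, args=(child, reps))
    p.start()
    payload = b"x" * msg_size
    t0 = time.perf_counter()
    for _ in range(reps):
        parent.send_bytes(payload)
        parent.recv_bytes()
    elapsed = time.perf_counter() - t0
    p.join()
    one_way = elapsed / (2 * reps)        # each rep is two one-way trips
    return one_way, msg_size / one_way

if __name__ == "__main__":
    latency, _ = ping_pong(1)             # tiny message isolates latency
    _, bw = ping_pong(1 << 16)            # large message exposes bandwidth
    print(f"latency ~{latency * 1e6:.1f} us, bandwidth ~{bw / 1e6:.1f} MB/s")
```

Running the same source unchanged on each system, as the authors did, is what makes the OSF/1-versus-SUNMOS comparison meaningful.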

Summary of uncertainty analysis of dispersion and deposition modules of the MACCS and COSYMA consequence codes: A joint USNRC/CEC study

Harper, Frederick T.

This paper briefly describes an ongoing project designed to assess the uncertainty in offsite radiological consequence calculations of hypothetical accidents in commercial nuclear power plants. This project is supported jointly by the Commission of European Communities (CEC) and the US Nuclear Regulatory Commission (USNRC). Both commissions have expressed an interest in assessing the uncertainty in consequence calculations used for risk assessments and regulatory purposes.

Interactive graphical model building using telepresence and virtual reality

Stansfield, S.

This paper presents a prototype system developed at Sandia National Laboratories to create and verify computer-generated graphical models of remote physical environments. The goal of the system is to create an interface between an operator and a computer vision system so that graphical models can be created interactively. Virtual reality and telepresence are used to allow interaction between the operator, computer, and remote environment. A stereo view of the remote environment is produced by two CCD cameras. The cameras are mounted on a three degree-of-freedom platform which is slaved to a mechanically-tracked, stereoscopic viewing device. This gives the operator a sense of immersion in the physical environment. The stereo video is enhanced by overlaying the graphical model onto it. Overlay of the graphical model onto the stereo video allows visual verification of graphical models. Creation of a graphical model is accomplished by allowing the operator to assist the computer in modeling. The operator controls a 3-D cursor to mark objects to be modeled. The computer then automatically extracts positional and geometric information about the object and creates the graphical model.

Identification of components to optimize improvement in system reliability

Campbell, James E.

The fields of reliability analysis and risk assessment have grown dramatically since the 1970s. There are now bodies of literature and standard practices which cover quantitative aspects of system analysis such as failure rate and repair models, fault and event tree generation, minimal cut sets, classical and Bayesian analysis of reliability, component and system testing techniques, decomposition methods, etc. In spite of the growth in the sophistication of reliability models, however, little has been done to integrate optimization models within a reliability analysis framework. That is, often reliability models focus on characterization of system structure in terms of topology and failure/availability characteristics of components. A number of approaches have been proposed to help identify the components of a system that have the largest influence on overall system reliability. While this may help rank order the components, it does not necessarily help a system design team identify which components they should improve to optimize overall reliability (it may be cheaper and more effective to focus on improving two or three components of smaller importance than one component of larger importance). In this paper, we present an optimization model that identifies the components to be improved to maximize the increase in system MTBF, subject to a fixed budget constraint. A dual formulation of the model is to minimize cost, subject to achieving a certain level of system reliability.
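
For a series system, where the system failure rate is the sum of the component failure rates (so system MTBF is the reciprocal of that sum), the budget-constrained selection problem can be sketched as a small exhaustive search. The component names, costs, and failure rates below are invented for illustration; note that the affordable pair of cheaper improvements beats the single most expensive one, which is exactly the point made above.

```python
# Budget-constrained component-improvement selection for a series system.
from itertools import combinations

def best_upgrades(base_rates, upgrades, budget):
    """base_rates: {component: failure rate}; upgrades: {component:
    (cost, improved failure rate)}. Returns (best MTBF, chosen set)."""
    names = list(upgrades)
    best = (1.0 / sum(base_rates.values()), frozenset())  # no-upgrade baseline
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            if sum(upgrades[n][0] for n in subset) > budget:
                continue                      # unaffordable combination
            rates = dict(base_rates)
            for n in subset:
                rates[n] = upgrades[n][1]     # apply the improvement
            mtbf = 1.0 / sum(rates.values())
            if mtbf > best[0]:
                best = (mtbf, frozenset(subset))
    return best

base = {"pump": 1e-4, "valve": 5e-5, "sensor": 2e-4}          # failures/hour
opts = {"pump": (30.0, 5e-5), "valve": (10.0, 2e-5), "sensor": (25.0, 8e-5)}
mtbf, chosen = best_upgrades(base, opts, budget=40.0)
# improving valve + sensor (cost 35) beats the expensive pump upgrade alone
```

For realistically large systems the exhaustive loop would be replaced by the paper's optimization formulation, but the objective and constraint are the same.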

Coherent sampling of multiple branch event tree questions

Payne Jr., A.C.; Wyss, G.D.

In the detailed phenomenological event trees used in recent Level III PRA analyses, questions arise about the possible outcomes of events for which the underlying physics is not well understood and where the initial and boundary conditions are uncertain. Examples of the types of events being analyzed are: What is the containment failure mode? Is there a large in-vessel steam explosion? How much H{sub 2}, CO, and CO{sub 2} are produced during core-concrete interactions? The outcomes of each of these questions must be defined based on an understanding of the basic physics of the phenomena and the level of detail of the probabilistic analysis. Many of these phenomena have never occurred, since severe reactor accidents are extremely rare events. The only information we have about these phenomena comes from four basic sources: general theoretical knowledge, limited experimental results, a few actual events, and various models of the phenomena. All of these phenomena have significant uncertainty arising from three basic sources: level of detail, initial and boundary conditions, and lack of knowledge. Since it is not possible to conduct enough full scale tests to generate a set of "objective" relative frequencies, the probabilities will have to be "subjective" and generated based on expert knowledge. In assessing the conditional probabilities of the various possible outcomes of an event during an accident, the expert must amalgamate his knowledge with the level of detail being used in the PRA analysis to generate a set of probabilities for the defined set of outcomes. It is often convenient for an expert to formulate his opinion in terms of expecting to see n{sub i} occurrences of outcome E{sub i} in N occurrences of event E. The order of the outcomes is typically not important because the individual trials are viewed as being independent of one another.
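
The elicitation scheme described in the last two sentences can be sketched directly: the expert's counts n{sub i} out of N become subjective branch probabilities, from which independent outcomes can be drawn. The outcome labels and counts below are invented for illustration.

```python
# Turning expert counts ("n_i occurrences of E_i in N events") into
# branch probabilities, then sampling independent outcomes from them.
import random

def branch_probabilities(counts):
    """counts: {outcome: expected occurrences out of N trials}."""
    total = sum(counts.values())
    return {outcome: n / total for outcome, n in counts.items()}

def sample_outcomes(counts, trials, seed=0):
    """Draw independent, identically distributed outcomes consistent
    with the expert's stated counts (order of draws is irrelevant)."""
    rng = random.Random(seed)
    outcomes = list(counts)
    weights = [counts[o] for o in outcomes]
    return [rng.choices(outcomes, weights=weights)[0] for _ in range(trials)]

# e.g. an expert expects, in 20 severe accidents: 12 with no containment
# failure, 6 late failures, 2 early failures (hypothetical numbers).
expert = {"no failure": 12, "late failure": 6, "early failure": 2}
probs = branch_probabilities(expert)      # 0.6 / 0.3 / 0.1
draws = sample_outcomes(expert, trials=1000, seed=42)
```

Because each draw is independent and the order carries no information, the scheme matches the multinomial view of the event-tree branch described above.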

Multiple weight stepwise regression

Campbell, James E.

In many science and engineering applications, there is an interest in predicting the outputs of a process for given levels of inputs. In order to develop a model, one could run the process (or a simulation of the process) at a number of points (a point being one run at one set of possible input values) and observe the values of the outputs at those points. These observations can be used to predict the values of the outputs for other values of the inputs. Since the outputs are a function of the inputs, we can generate a surface in the space of possible inputs and outputs. This surface is called a response surface. In some cases, collecting the data needed to generate a response surface can be very expensive. Thus, in these cases, there is a powerful incentive to minimize the sample size while building better response surfaces. One such case is the semiconductor equipment manufacturing industry. Semiconductor manufacturing equipment is complex and expensive. Depending upon the type of equipment, the number of control parameters may range from 10 to 30, with perhaps 5 to 10 being important. Since a single run can cost hundreds or thousands of dollars, it is very important to have efficient methods for building response surfaces. A current approach to this problem is to do the experiment in two stages. First, a traditional design (such as a fractional factorial) is used to screen variables. After deciding which variables are significant, additional runs of the experiment are conducted. The original runs and the new runs are used to build a model with the significant variables. However, the original (screening) runs are not as helpful for building the model as some other points might have been. This paper presents a point selection scheme that is more efficient than traditional designs.
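
A plain forward-stepwise selection loop, sketched below, shows the generic screening idea that such work builds on (this is ordinary stepwise regression, not the paper's multiple-weight scheme or its point-selection method). At each step the candidate input that most reduces the residual sum of squares of a least-squares fit is added to the model.

```python
# Forward-stepwise variable screening on a synthetic response surface.
import numpy as np

def fit_rss(A, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((A @ coef - y) ** 2))

def forward_stepwise(X, y, max_vars):
    """Greedily add the input whose inclusion most reduces the RSS."""
    chosen = []
    for _ in range(max_vars):
        scored = []
        for j in range(X.shape[1]):
            if j in chosen:
                continue
            A = np.column_stack([np.ones(len(y))] + [X[:, k] for k in chosen + [j]])
            scored.append((fit_rss(A, y), j))
        chosen.append(min(scored)[1])       # keep the best new variable
    return chosen

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 6))                # 6 candidate control parameters
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.normal(size=60)
selected = forward_stepwise(X, y, max_vars=2)   # should recover inputs 0 and 3
```

The inefficiency the paper addresses is visible even here: the 60 screening runs identify the important inputs but are not chosen to be the most informative points for the final response-surface fit.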

MACCS version 1.5.11.1: A maintenance release of the code

Chanin, D.; Foster, J.; Rollstin, J.; Miller, L.

A new version of the MACCS code (version 1.5.11.1) has been developed by Sandia National Laboratories under sponsorship of the US Nuclear Regulatory Commission. MACCS was developed to support evaluations of the off-site consequences of hypothetical severe accidents at commercial power plants. MACCS is the only current public domain code in the US that embodies all of the following modeling capabilities: (1) weather sampling using a year of recorded weather data; (2) mitigative actions such as evacuation, sheltering, relocation, decontamination, and interdiction; (3) economic costs of mitigative actions; (4) cloudshine, groundshine, and inhalation pathways as well as food and water ingestion; (5) calculation of both individual and societal doses to various organs; and (6) calculation of both acute (nonstochastic) and latent (stochastic) health effects and risks of health effects. All of the consequence measures may be generated in the form of a complementary cumulative distribution function (CCDF). The current version implements a revised cancer model consistent with recent reports such as BEIR V and ICRP 60. In addition, a number of error corrections and portability enhancements have been implemented. This report describes only the changes made in creating the new version. Users of the code will need to obtain the code's original documentation, NUREG/CR-4691.
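
The CCDF form mentioned above is simple to state: for each consequence magnitude x, it gives the probability that the consequence equals or exceeds x. A minimal sketch from sampled values (the numbers are illustrative, not MACCS output):

```python
# Complementary cumulative distribution function from sampled consequences.
def ccdf(samples):
    """Return sorted (value, P[consequence >= value]) pairs, assigning
    each of the n samples equal probability 1/n."""
    ordered = sorted(samples)
    n = len(ordered)
    return [(x, (n - i) / n) for i, x in enumerate(ordered)]

doses = [0.2, 1.5, 0.7, 3.1, 0.9]    # e.g. population dose samples (units arbitrary)
curve = ccdf(doses)
# the smallest sample is equalled or exceeded with probability 1,
# the largest with probability 1/n
```

In a weather-sampling code each sample would carry its own weather-sequence probability rather than a uniform 1/n weight, but the exceedance-curve construction is the same.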

Holographic interferometry: A user's guide

Griggs, D.

This manual describes the procedures and components necessary to produce a holographic interferogram of a flow field in the Sandia National Laboratories hypersonic wind tunnel. In contrast to classical interferometry, holographic interferometry records the amplitude and phase distribution of a lightwave passing through the flow field at some instant of time. This information can then be reconstructed outside the wind tunnel for visual analysis and digital processing, yielding precise characterizations of aerodynamic phenomena. The reconstruction and subsequent hologram image storage process is discussed, with particular attention paid to the digital image processor and the data reduction technique.

Technical requirements for the actinide source-term waste test program

Phillips, Mark L.

This document defines the technical requirements for a test program designed to measure time-dependent concentrations of actinide elements from contact-handled transuranic (CH TRU) waste immersed in brines similar to those found in the underground workings of the Waste Isolation Pilot Plant (WIPP). This test program will determine the influences of TRU waste constituents on the concentrations of dissolved and suspended actinides relevant to the performance of the WIPP. These influences (which include pH, Eh, complexing agents, sorbent phases, and colloidal particles) can affect solubilities and colloidal mobilization of actinides. The test concept involves fully inundating several TRU waste types with simulated WIPP brines in sealed containers and monitoring the concentrations of actinide species in the leachate as a function of time. The results from this program will be used to test numeric models of actinide concentrations derived from laboratory studies. The model is required for WIPP performance assessment with respect to the Environmental Protection Agency's 40 CFR Part 191B.

Characteristics and development report for the T1576 power supply and the MC3935 battery

Butler, Paul C.

This report describes the requirements, designs, performance, and development histories for the T1576 power supply and the MC3935 rechargeable battery. These devices are used to power Permissive Action Link (PAL) ground controllers. The T1576 consists of a stainless steel container, one SA3553 connector, and one MC3935 battery. The MC3935 is a vented nickel/cadmium battery with 24 cells connected in series. It was designed to deliver 5.5 ampere-hours at 25 °C and the one-hour rate, with a nominal voltage of 28 V. The battery was designed to operate for 5 years or 500 full charge/discharge cycles. The power supply is expected to last indefinitely with replacement batteries and hardware.

MELCOR 1.8.2 Assessment: IET direct containment heating tests

Kmetyk, Lubomyra N.

MELCOR is a fully integrated, engineering-level computer code, being developed at Sandia National Laboratories for the USNRC, that models the entire spectrum of severe accident phenomena in a unified framework for both BWRs and PWRs. As part of an ongoing assessment program, the MELCOR computer code has been used to analyze several of the IET direct containment heating experiments done at 1:10 linear scale in the Surtsey test facility at Sandia and at 1:40 linear scale in the corium-water thermal interactions (CWTI) COREXIT test facility at Argonne National Laboratory. These MELCOR calculations were done as an open post-test study, with both the experimental data and CONTAIN results available to guide the selection of code input. Basecase MELCOR results are compared to test data in order to evaluate the new HPME DCH model recently added in MELCOR version 1.8.2. The effects of various user-input parameters in the HPME model, which define both the initial debris source and the subsequent debris interaction, were investigated in sensitivity studies. In addition, several other non-default input modeling changes involving other MELCOR code packages were required in our IET assessment analyses in order to reproduce the observed experiment behavior. Several calculations were done to identify whether any numeric effects exist in our DCH IET assessment analyses.

User's guide for the frequency domain algorithms in the LIFE2 fatigue analysis code

Sutherland, Herbert J.

The LIFE2 computer code is a fatigue/fracture analysis code that is specialized to the analysis of wind turbine components. The numerical formulation of the code uses a series of cycle count matrices to describe the cyclic stress states imposed upon the turbine. However, many structural analysis techniques yield frequency-domain stress spectra and a large body of experimental loads (stress) data is reported in the frequency domain. To permit the analysis of this class of data, a Fourier analysis is used to transform a frequency-domain spectrum to an equivalent time series suitable for rainflow counting by other modules in the code. This paper describes the algorithms incorporated into the code and their numerical implementation. Example problems are used to illustrate typical inputs and outputs.
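
One common way to carry out the transformation described above is to assign each spectral line a random phase and inverse-FFT to a real time series whose Fourier magnitudes match the input spectrum; the result can then be rainflow counted. The sketch below is a generic version of that idea, not LIFE2's exact algorithm, and the amplitudes are illustrative stress values.

```python
# Frequency-domain stress spectrum -> equivalent time series via random
# phases and an inverse real FFT (generic sketch, illustrative numbers).
import numpy as np

def spectrum_to_time_series(amplitudes, seed=0):
    """amplitudes: one-sided spectral amplitudes for harmonics 1..m of
    some fundamental frequency. Returns a real time series whose Fourier
    magnitudes reproduce the input amplitudes."""
    rng = np.random.default_rng(seed)
    m = len(amplitudes)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=m)   # random phase per line
    n = 2 * (m + 1)                                  # time samples per period
    spec = np.zeros(m + 2, dtype=complex)            # rfft layout: DC..Nyquist
    # scale by n/2 so a unit amplitude yields a unit-amplitude cosine
    spec[1:m + 1] = amplitudes * np.exp(1j * phases) * (n / 2)
    return np.fft.irfft(spec, n=n)

series = spectrum_to_time_series(np.array([10.0, 4.0, 1.5]))  # stress units
```

The random phases mean each call (with a different seed) yields a different but statistically equivalent stress history, which is why such realizations are suitable inputs for the rainflow-counting modules mentioned above.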
