Publications


Modeling an optical micromachine probe

Mittas, Anthony

Silicon micromachines are fabricated using Surface Micro-Machining (SMM) techniques. Silicon micromachines include engines that consist of orthogonally oriented linear comb drive actuators mechanically connected to a rotating gear. These gears are as small as 50 µm in diameter and can be driven at rotation rates exceeding 300,000 rpm. Measuring and analyzing microengine performance is fundamental to micromachine development and system applications. Optical techniques offer the potential for measuring the long-term statistical performance data and transient responses needed to optimize designs and manufacturing techniques. The authors describe the modeling of an optical probe developed at Sandia National Laboratories. Experimental data will be compared with output from the model.


Beyond pretty pictures: Quantifying porous media properties and transport processes using transmission and emission CT

Lucero, Daniel A.

While gaining increasing interest, the use of Computerized Tomography (CT) in porous media studies has been limited by the availability of quantitative methods of analysis. Three methods are presented for the analysis of CT data and applied to images obtained from gamma transmission and gamma emission systems. The first utilizes measurement statistics and image histograms to provide exact estimates of multiple-component volume contents. An improved thresholding technique in the second method allows identification of individual voxel composition; the threshold utilizes error statistics to eliminate the arbitrary nature of current methods. In the third procedure, emission tomography images of solute transport provide in-situ measures of transport in fractured media. Application of each method is demonstrated on samples of the Culebra Dolomite of the Rustler Formation, New Mexico. Dolomite cores were collected by horizontal drilling at a depth of 218 m in the air intake shaft of the Waste Isolation Pilot Plant located near Carlsbad, New Mexico.


VSHOT measurement uncertainty and sensitivity study

Jones, Scott A.

The Video Scanning Hartmann Optical Tester (VSHOT) is a slope-measuring tool for large, imprecise reflectors. It is a laser ray-trace device developed to measure the optical quality of point-focus solar concentrating mirrors. A unique tool was needed because of the diverse geometry and very large size of solar concentrators, plus their large optical errors. To study the accuracy of the VSHOT as well as its sensitivity to changes in test setup variables, a series of experiments was performed with a very precise, astronomical-grade mirror. The slope errors of the reference mirror were much smaller than the resolution of the VSHOT, so that any measured slope errors were caused by the instrument itself rather than the mirror. The VSHOT exceeded its accuracy goals by achieving about ±0.5% (68% confidence) error in the determination of focal length and ±0.1 mrad (68% confidence) error in the determination of RMS slope error. Displacement of the test mirror from the optical axis was the largest source of measured error.


Virtual Tower

Wayne, R.A.

The primary responsibility of an intrusion detection system (IDS) operator is to monitor the system, assess alarms, and summon and coordinate the response team when a threat is acknowledged. The tools currently provided to the operator are somewhat limited: monitors must be switched, keystrokes must be entered to call up intrusion sensor data, and communication with the response force must be maintained. The Virtual Tower is an operator interface assembled from low-cost commercial-off-the-shelf hardware and software; it enables large amounts of data to be displayed in a virtual manner that provides instant recognition for the operator and increases assessment accuracy in alarm annunciator and control systems. This is accomplished by correlating and fusing the data into a 360-degree visual representation that employs color, auxiliary attributes, video, and directional audio to prompt the operator. The Virtual Tower would be a valuable low-cost enhancement to existing systems.


An introduction to the architectural surety program

Matalucci, R.V.

This paper provides a summary introduction to the nationally emerging area of Architectural and Infrastructure Surety that is under development at Sandia National Laboratories. This program area, addressing technology requirements at the national level, includes four major elements: education, research, development, and application. It involves a risk management approach to solving problems of the as-built environment through the application of security, safety, and reliability principles developed in the nuclear weapons programs of the Department of Energy. The changing responsibilities of engineering design professionals are addressed in light of the increased public awareness of structural and facility system vulnerabilities to threats from malevolent, normal, and abnormal environments. A brief discussion is presented of the education and technology outreach programs initiated through an infrastructure surety graduate Civil Engineering Department course taught at the University of New Mexico and through the architectural surety workshops and conferences already held and planned for the future. A summary description is also presented of selected technologies with strong potential for application to specific national architectural and infrastructure surety concerns. These technologies include super-computational modeling and structural simulations, window glass fragmentation modeling, risk management procedures, instrumentation and health monitoring systems, and three-dimensional CAD virtual reality visualization techniques.


Application of spatial and angular domain based parallelism to a discrete ordinates formulation with unstructured spatial discretization

Burns, Shawn P.

A parallel discrete ordinate formulation employing a general, unstructured finite element spatial discretization is presented for steady, gray, nonscattering radiative heat transport within a participating medium. The formulation is based on the first-order form of the Boltzmann transport equation and allows for any combination of spatial and angular domain-based parallelism. The formulation is tested on a massively parallel, distributed memory architecture using a standard three-dimensional benchmark calculation. The results show that the formulation presented provides better parallel performance and accuracy than the author's previously published work. The ultimate objective of both the current and previous efforts is to develop a computationally efficient radiative transport model for use in large-scale numerical fire simulations.
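
For orientation, the first-order (discrete ordinates) transport equation for a gray, nonscattering, emitting-absorbing medium is commonly written as below; the notation (intensity I, absorption coefficient κ, blackbody intensity I_b) is standard and is supplied for context rather than quoted from the paper:

    \hat{\Omega}_m \cdot \nabla I_m(\mathbf{r}) + \kappa(\mathbf{r})\, I_m(\mathbf{r}) = \kappa(\mathbf{r})\, I_b(\mathbf{r}), \qquad I_b = \frac{\sigma T^4}{\pi},

with one such equation solved over the spatial mesh for each discrete ordinate direction \hat{\Omega}_m.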


Risk management for buildings -- Has the time come?

Berry, Dennis L.

There are both incentives and challenges for applying formal risk management processes to buildings and other structures, including bridges, highways, dams, stadiums, shopping centers, and private dwellings. Based on an assessment of several issues, the authors conclude that for certain types of buildings and structures the time has come for the use of a formal risk-management approach, including probabilistic risk assessment methods, to help identify dominant risks to public health, safety, and security and to help manage these risks in a cost-effective manner.


A thin-foil Faraday collector as a radiation-hard, high fluence charged particle spectrometer

Barbour, J.C.

The authors have developed a radiation-hard, charged particle spectrometer consisting of thin parallel conducting foils as current collectors. Prototype detectors have been tested in accelerator bombardments and at the fusion plasma facilities TFTR and JET. In the case of the accelerator bombardments, a detector consisting of 6 Al foils, each about 6 µm thick, demonstrated an energy resolution of about 7% for 7 MeV alpha particles. The prototype tested immediately outside TFTR demonstrated the expected insensitivity to moderately high levels of fast neutrons and hard gamma rays. The prototype tested inside JET similarly indicated operational capability at elevated temperatures as a lost alpha particle detector for d-t tokamak fusion plasmas. The robustness and moderately good energy resolution of these detectors should permit their application to tasks such as the first-wall measurement of lost alpha particles from tokamak fusion plasmas, the real-time measurement of light ion fission fragments from fission reactor experiments, and the in-beam measurement of accelerator beam energies as a control diagnostic.


Contents and structure of the SME digital signature buffer

Tarman, Thomas D.

This contribution proposes additional text for Section 7.1.5.5 of [1] which defines the contents of the digital signature buffer for each relevant flow in the Two-Way and Three-Way Security Message Exchange Protocols. This is clearly an interoperability issue because these signature buffers must be constructed identically at the sender (signature generator) and receiver (signature validator) in order for the protocols to proceed correctly. Sections 2 and 3 of this contribution are intended to be placed in Section 7.1.5.5 of [1]. In addition, text is proposed in Motion 2 of Section 4 of this contribution which clarifies the scope of encryption of the Confidential Section, which is defined in Section 7.1.4 of [1].


An information model based weld schedule database

Kleban, Stephen

As part of a computerized system (SmartWeld) developed at Sandia National Laboratories to facilitate agile manufacturing of welded assemblies, a weld schedule database (WSDB) was also developed. SmartWeld's overall goals are to shorten the design-to-product time frame and to promote right-the-first-time weldment design and manufacture by providing welding process selection guidance to component designers. The associated WSDB evolved into a substantial subproject by itself. At first, it was thought that the database would store perhaps 50 parameters about a weld schedule. This was a woeful underestimate: the current WSDB has over 500 parameters defined in 73 tables. This includes data about the weld, the piece parts involved, the piece part geometry, and great detail about the schedule and intervals involved in performing the weld. This complex database was built using information modeling techniques. Information modeling is a process that creates a model of objects and their roles for a given domain (i.e., welding). The Natural-Language Information Analysis Methodology (NIAM) technique was used, which is characterized by: (1) elementary facts being stated in natural language by the welding expert, (2) determinism (the resulting model is provably repeatable, i.e., it gives the same answer every time), and (3) extensibility (the model can be added to without changing existing structure). The information model produced a highly normalized relational schema that was translated to the Oracle™ Relational Database Management System for implementation.


Equilibrium characteristics of tartrate and EDTA-based electroless copper deposition baths

Chen, Ken S.

Electroless deposition of copper is being used for a variety of applications, one of them being the development of seed metallic layers on non-metals, which are widely used in electronic circuitry. Solution equilibrium characteristics of two electroless copper baths containing EDTA and tartrate as the complexing agents were studied as functions of pH, chelating agent concentration, and metal ion concentration. Equilibrium diagrams were constructed for both Cu-tartrate and Cu-EDTA systems. It was determined that copper is chiefly complexed as Cu(OH)₂L₂⁴⁻ in the tartrate bath, and as CuA²⁻ in the EDTA bath, where L and A are the complexing tartrate and EDTA ligands, respectively. The operating ranges for electroless copper deposition were identified for both baths. Dependence of Cu(OH)₂ precipitation on the pH and species concentrations was also studied for these systems.
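
Written out with assumed ligand charges (2− for the tartrate ligand L and 4− for the EDTA ligand A), the dominant complexation equilibria named above take the form below; the stoichiometry follows the species identified in the abstract, not the authors' full equilibrium model:

    \mathrm{Cu^{2+}} + 2\,\mathrm{OH^-} + 2\,\mathrm{L^{2-}} \rightleftharpoons \mathrm{Cu(OH)_2L_2^{4-}}, \qquad \mathrm{Cu^{2+}} + \mathrm{A^{4-}} \rightleftharpoons \mathrm{CuA^{2-}}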


A plasma process monitor/control system

Stevenson, Joel O.

Sandia National Laboratories has developed a system to monitor plasma processes for control of industrial applications. The system is designed to act as a fully automated, stand-alone process monitor during printed wiring board and semiconductor production runs. The monitor routinely performs data collection, analysis, process identification, and error detection/correction without the need for human intervention. The monitor can also be used in research mode to allow process engineers to gather additional information about plasma processes. The plasma monitor can perform real-time control of support systems known to influence plasma behavior. The monitor can also signal personnel to modify plasma parameters when the system is operating outside of desired specifications and requires human assistance. A notification protocol can be selected for conditions detected in the plasma process. The Plasma Process Monitor/Control System consists of a computer running software developed by Sandia National Laboratories, a commercially available spectrophotometer equipped with a charge-coupled device camera, an input/output device, and a fiber optic cable.


Architectural design for reliability

Cranwell, Robert M.

Design-for-reliability concepts can be applied to the products of the construction industry, which includes buildings, bridges, transportation systems, dams, and other structures. The application of a systems approach to designing in reliability emphasizes the importance of incorporating uncertainty in the analyses, the benefits of optimization analyses, and the importance of integrating reliability, safety, and security. 4 refs., 3 figs.


On angularly perturbed Laplace equations in the unit ball of IR^(n+2) and their distributional boundary values

Massopust, P.R.

All solutions of a Laplace-Beltrami equation that is continuously perturbed in its angular coordinates in the open unit ball IB^(n+2) ⊂ IR^(n+2), n ≥ 1, are characterized. Moreover, it is shown that such perturbations yield distributional boundary values which are different from, but algebraically and topologically equivalent to, the hyperfunctions of Lions & Magenes. This is different from the case of radially perturbed Laplace-Beltrami operators (cf. [7]), where one has stability of distributional boundary values under such perturbations.
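
For context, the unperturbed Laplacian on the ball separates into radial and angular parts as shown below; an angular perturbation in the sense used here modifies only the spherical (Laplace-Beltrami) term. This decomposition is standard and is given for orientation only; the paper's specific perturbation is not reproduced:

    \Delta = \frac{\partial^2}{\partial r^2} + \frac{n+1}{r}\,\frac{\partial}{\partial r} + \frac{1}{r^2}\,\Delta_{S^{n+1}} \quad \text{on } \mathbb{B}^{n+2} \subset \mathbb{R}^{n+2}.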


Activity-based costing of security services for a Department of Energy nuclear site

Snell, Mark K.

Department of Energy (DOE) nuclear facilities are being encouraged to reduce costs but the accounting data typically in use by the financial organizations at these laboratories cannot easily be used to determine which security activities offer the best reduction in cost. For example, labor costs have historically been aggregated over various activities, making it difficult to determine the true costs of performing each activity. To illustrate how this problem can be solved, a study was performed applying activity-based costing (ABC) to a hypothetical DOE facility. ABC is a type of cost-accounting developed expressly to determine truer costs of company activities. The hypothetical facility was defined to have features similar to those found across the DOE nuclear complex. ABC traced costs for three major security functions - Protective Force Operations, Material Control and Accountability, and Technical Security - to various activities. Once these costs had been allocated, we compared the cost of three fictitious upgrades: (1) an improvement in training or weapons that allows the protective force to have better capabilities instead of adding more response forces; (2) a change in the frequency of inventories; and (3) a reduction in the annual frequencies of perimeter sensor tests.
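
A minimal sketch of the activity-based costing step described above, written in Python; the cost pools, activities, and consumption fractions are invented for illustration and are not the study's data:

    # Activity-based costing: trace pooled resource costs to activities via cost drivers,
    # then compare the activity-level cost of alternative options.
    cost_pools = {"protective_force_labor": 4_000_000, "equipment": 600_000}   # $/year, hypothetical
    consumption = {                      # fraction of each pool consumed by each activity
        "patrols":         {"protective_force_labor": 0.50, "equipment": 0.40},
        "alarm_response":  {"protective_force_labor": 0.30, "equipment": 0.35},
        "perimeter_tests": {"protective_force_labor": 0.20, "equipment": 0.25},
    }

    def activity_costs(pools, usage):
        """Allocate each cost pool to activities in proportion to its consumption."""
        return {
            activity: sum(pools[p] * frac for p, frac in shares.items())
            for activity, shares in usage.items()
        }

    for activity, dollars in activity_costs(cost_pools, consumption).items():
        print(f"{activity:>16s}: ${dollars:,.0f}")

A proposed upgrade (for example, a reduced perimeter-test frequency) is then evaluated by re-estimating the consumption fractions and re-running the allocation.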


Evaluation of a prototype infrasound system

Breding, D.

Under Department of Energy sponsorship, Sandia National Laboratories and Los Alamos National Laboratory cooperated to develop a prototype infrasonic array, with associated documentation, that could be used as part of the International Monitoring System. The United States Government or foreign countries could procure commercially available systems based on this prototype to fulfill their Comprehensive Test Ban Treaty (CTBT) obligations. The prototype is a four-element array in a triangular layout, as recommended in CD/NTB/WP.224, with an element at each corner and one in the center. The prototype test configuration utilizes an array spacing of 1 km. The prototype infrasound system has the following objectives: (1) Provide a prototype that reliably acquires and transmits near real-time infrasonic data to facilitate the rapid location and identification of atmospheric events. (2) Provide documentation that could be used by the United States and foreign countries to procure infrasound systems commercially to fulfill their CTBT responsibilities. Infrasonic monitoring is an effective, low-cost technology for detecting atmospheric explosions. The low-frequency components of explosion signals propagate to long ranges (a few thousand kilometers), where they can be detected with an array of sensors. Los Alamos National Laboratory's expertise in infrasound systems and phenomenology, combined with Sandia's expertise in providing verification-quality systems for treaty monitoring, makes an excellent team to provide the prototype infrasound sensor system. By September 1997, the prototype infrasound system will have been procured, integrated, evaluated, and documented. Final documentation will include a system requirements document, an evaluation report, and a hardware design document. The hardware design document will describe the various hardware components used in the infrasound prototype and their interrelationships.


Oxidation in HVOF-sprayed steel

Smith, Mark F.

It is widely held that most of the oxidation in thermally sprayed coatings occurs on the surface of the droplet after it has flattened. The evidence in this paper suggests that, for the conditions studied here, oxidation of the top surface of flattened droplets is not the dominant oxidation mechanism. In this study, a mild steel wire (AISI 1025) was sprayed using a high-velocity oxy-fuel (HVOF) torch onto copper and aluminum substrates. Ion milling and Auger spectroscopy were used to examine the distribution of oxides within individual splats. Conventional metallographic analysis was also used to study oxide distributions within coatings that were sprayed under the same conditions. An analytical model for oxidation of the exposed surface of a splat is presented. Based on literature data, the model assumes that diffusion of iron through a solid FeO layer is the rate-limiting factor in forming the oxide on the top surface of a splat. An FeO layer only a few thousandths of a micron thick is predicted to form on the splat surface as it cools. However, the experimental evidence shows that the oxide layers are typically 100x thicker than the predicted value. These thick oxide layers are not always observed on the top surface of a splat. Indeed, in some instances the oxide layer is on the bottom, and the metal is on the top. The observed oxide distributions are more consistently explained if most of the oxide formed before the droplets impact the substrate.
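
The diffusion-limited surface-oxidation model referred to above is of the familiar parabolic type; a generic statement of that rate law (given for orientation, not as the authors' calibrated model) is

    x^2(t) = 2\,k_p\,t, \qquad k_p \propto D_{\mathrm{Fe}}(T) = D_0\, e^{-Q/RT},

so the FeO thickness grown on a splat's top surface is obtained by integrating k_p over the splat's brief cooling history, which is what limits the predicted layer to a few nanometers.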


Travel-time correction surface generation for the DOE Knowledge Base

Hipp, James R.

The DOE Knowledge Base data storage and access model consists of three parts: raw data processing, intermediate surface generation, and final output surface interpolation. This paper concentrates on the second step, surface generation, specifically applied to travel-time correction data. The surface generation for the intermediate step is accomplished using a modified kriging solution that provides robust error estimates for each interpolated point and satisfies many important physical requirements, including differing-quality data points, a user-definable range of influence for each point, blending to background values for both interpolated values and error estimates beyond those ranges, and the ability to account for the effects of geologic region boundaries. These requirements are outlined and discussed and are linked to requirements specified for the final output model in the DOE Knowledge Base. Future work will focus on testing the entire Knowledge Base model using the regional calibration data sets which are being gathered by researchers at Los Alamos and Lawrence Livermore National Laboratories.
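
As a rough illustration of the kriging step, the sketch below performs ordinary kriging with an exponential covariance in Python; the modified kriging described in the paper adds data-quality weighting, per-point ranges of influence, and blending to background values, none of which are shown here, and the station data are invented:

    import numpy as np

    def ordinary_krige(xy, values, xy0, sill=1.0, rng=5.0):
        """Estimate the field at xy0 from scattered (xy, values) data.
        Returns (estimate, kriging variance); exponential covariance assumed."""
        n = len(values)
        cov = lambda d: sill * np.exp(-d / rng)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
        A = np.ones((n + 1, n + 1))          # covariances plus unbiasedness constraint
        A[:n, :n] = cov(d)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = cov(np.linalg.norm(xy - xy0, axis=-1))
        sol = np.linalg.solve(A, b)
        w, mu = sol[:n], sol[n]
        return w @ values, sill - w @ b[:n] - mu   # estimate, error variance

    pts  = np.array([[0.0, 0.0], [3.0, 1.0], [1.0, 4.0]])   # hypothetical station locations
    vals = np.array([0.8, 1.1, 0.5])                        # hypothetical travel-time corrections (s)
    print(ordinary_krige(pts, vals, np.array([1.5, 1.5])))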


Visualization tools for comprehensive test ban treaty research

Edwards, T.L.

This paper focuses on tools used in Data Visualization efforts at Sandia National Laboratories under the Department of Energy CTBT R&D program. These tools provide interactive techniques for the examination and interpretation of scientific data, and can be used for many types of CTBT research and development projects. We will discuss the benefits and drawbacks of using the tools to display and analyze CTBT scientific data. While the tools may be used for everyday applications, our discussion will focus on the use of these tools for visualization of data used in research and verification of new theories. Our examples focus on uses with seismic data, but the tools may also be used for other types of data sets. 5 refs., 6 figs., 1 tab.


The software engineering journey: From a naive past into a responsible future

Fitzpatrick-Fletcher, Sharon K.

All engineering fields experience growth, from early trial-and-error approaches to disciplined approaches based on fundamental understanding. The field of software engineering is making the long and arduous journey, accomplished by an evolution of thinking in many dimensions. This paper takes the reader along a trio of simultaneous evolutionary paths. First, the reader experiences evolution from a zero-risk mindset to a managed-risk mindset. Along this path, the reader observes three generations of security risk management and their implications for software system assurance. Next is a growth path from separate surety disciplines to an integrated systems surety approach. On the way, the reader visits the safety, security, and dependability disciplines and peers into a future vision which coalesces them. The third and final evolutionary path explored here transitions the software engineering field from best practices to fundamental understanding. Along this road, the reader observes a framework for developing a "science behind the engineering" and methodologies for software surety analysis.


A summary of the GPS system performance for STARS Mission 3

Creel, E.E.

This paper describes the performance of the GPS system on the most recent flight of the STARS missile, STARS Mission 3 (M3). This mission was conducted under the Ballistic Missile Defense Organization's (BMDO's) Consolidated Targets Program. The United States Army Space and Strategic Defense Command (USASSDC) is the executing agent for this mission, and the Department of Energy's (DOE's) Sandia National Laboratories (SNL) is the vehicle developer and integrator. The M3 flight, dually designated as the MSX Dedicated Targets II (MDT-II) mission, occurred on August 31, 1996. This mission was conducted for the specific purpose of providing targets for viewing by the MSX satellite. STARS M3 was the first STARS flight to use GPS-derived data for missile guidance and proved to be instrumental in the procurement of a wealth of experimental data which is still undergoing analysis by numerous scientific agencies within the BMDO complex. GPS accuracy was required for this mission because of the prescribed targeting requirements for the MDT-II payload deliveries with respect to the MSX satellite flight path. During the flight test, real-time GPS-derived state vector data were also used to generate pointing angles for various down-range sensors involved in the experiment. Background information describing the STARS missile, GPS subsystem architecture, and the GPS Kalman filter design is presented first, followed by a discussion of the telemetry data records obtained from this flight with interpretations and conclusions.
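
The paper does not reproduce the filter design, but the predict/update cycle that a GPS navigation Kalman filter is built around looks like the following sketch; the state, matrices, and noise values here are placeholders, not the STARS M3 filter:

    import numpy as np

    def kf_step(x, P, F, Q, H, R, z):
        """One predict/update cycle of a linear Kalman filter."""
        x = F @ x                            # predict state
        P = F @ P @ F.T + Q                  # predict covariance
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (z - H @ x)              # update with measurement z
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # Toy 1-D position/velocity state with a GPS-like position measurement
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q = 1e-3 * np.eye(2)
    R = np.array([[25.0]])                   # assumed 5 m (1-sigma) position noise
    x, P = np.zeros(2), 100.0 * np.eye(2)
    x, P = kf_step(x, P, F, Q, H, R, np.array([12.3]))
    print(x)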


The data dictionary: A view into the CTBT knowledge base

Shepherd, E.R.; Keyser, R.G.; Armstrong, H.M.

The data dictionary for the Comprehensive Test Ban Treaty (CTBT) knowledge base provides a comprehensive, current catalog of the projected contents of the knowledge base. It is written from a data definition view of the knowledge base and therefore organizes information in a fashion that allows logical storage within the computer. The data dictionary introduces two organizational categories of data: the datatype, which is a broad, high-level category of data, and the dataset, which is a specific instance of a datatype. The knowledge base, and thus the data dictionary, consists of a fixed, relatively small number of datatypes, but new datasets are expected to be added on a regular basis. The data dictionary is a tangible result of the design effort for the knowledge base and is intended to be used by anyone who accesses the knowledge base for any purpose, such as populating the knowledge base with data, accessing the data for use with automatic data processing (ADP) routines, or browsing through the data for verification purposes. For these reasons, it is important to discuss the development of the data dictionary as well as to describe its contents to better understand its usefulness; that is the purpose of this paper.


Testing the waveform correlation event detection system: Teleseismic, regional, and local distances

Young, Christopher J.

Waveform Correlation Event Detection System (WCEDS) prototypes have now been developed for both global and regional networks and the authors have extensively tested them to assess the potential usefulness of this technology for CTBT (Comprehensive Test Ban Treaty) monitoring. In this paper they present the results of tests on data sets from the IDC (International Data Center) Primary Network and the New Mexico Tech Seismic Network. The data sets span a variety of event types and noise conditions. The results are encouraging at both scales but show particular promise for regional networks. The global system was developed at Sandia Labs and has been tested on data from the IDC Primary Network. The authors have found that for this network the system does not perform at acceptable levels for either detection or location unless directional information (azimuth and slowness) is used. By incorporating directional information, however, both areas can be improved substantially suggesting that WCEDS may be able to offer a global detection capability which could complement that provided by the GA (Global Association) system in use at the IDC and USNDC (United States National Data Center). The local version of WCEDS (LWCEDS) has been developed and tested at New Mexico Tech using data from the New Mexico Tech Seismic Network (NMTSN). Results indicate that the WCEDS technology works well at this scale, despite the fact that the present implementation of LWCEDS does not use directional information. The NMTSN data set is a good test bed for the development of LWCEDS because of a typically large number of observed local phases and near network-wide recording of most local and regional events. Detection levels approach those of trained analysts, and locations are within 3 km of manually determined locations for local events.


Software design and operational model for the WCEDS prototype

Beiriger, Judy I.

To explore the potential of waveform correlation for CTBT, the Waveform Correlation Event Detection System (WCEDS) prototype was developed. The WCEDS software design followed the Object Modeling Technique process of analysis, system design, and detailed design and implementation. Several related executable programs are managed through a Graphical User Interface (GUI). The WCEDS prototype operates in an IDC/NDC-compatible environment. It employs a CSS 3.0 database as its primary input/output interface, reading in raw waveforms at the start, and storing origins, events, arrivals, and associations at the finish. Additional output includes correlation results and data for specified testcase origins, and correlation timelines for specified locations. During the software design process, the more general seismic monitoring functionality was extracted from WCEDS-specific requirements and developed into C++ object-oriented libraries. These include the master image, grid, basic seismic, and extended seismic libraries. Existing NDC and commercial libraries were incorporated into the prototype where appropriate, to focus development activities on new capability. The WCEDS-specific application code was built in a separate layer on top of the general seismic libraries. The general seismic libraries developed for the WCEDS prototype can provide a base for other algorithm development projects.


Magnetically-excited flexural plate wave resonator

Martin, Steve W.

A flexural plate wave (FPW) resonator was constructed by patterning current lines on a silicon nitride membrane suspended on a rectangular silicon frame. Eigenmodes of the rectangular membrane were excited using Lorentz forces generated between alternating surface currents and a static in-plane magnetic field. The magnetic field strength required for these devices can be achieved with small permanent magnets (≈1 cm³). Preferential coupling to a particular membrane mode was achieved by positioning current lines along longitudinal mode antinodes. An equivalent-circuit model was derived that characterizes the input impedance of a one-port device and the transmission response of a two-port device over a range of frequencies near a single membrane resonance. Experiments were performed to characterize the effects of varying magnetic field, ambient gas, gas pressure, and input power. To the authors' knowledge, this is the first experimental demonstration of a resonant FPW device.
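
The drive mechanism described above follows directly from the Lorentz force on a current-carrying trace in the static in-plane field; per unit length of a trace perpendicular to the field,

    d\mathbf{F} = I\, d\boldsymbol{\ell} \times \mathbf{B}, \qquad F/\ell = I\,B,

so an alternating current at a membrane eigenfrequency produces an alternating out-of-plane force that preferentially excites that mode (standard electromagnetics, stated here for context).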


Solidification modeling of Nb bearing superalloys

Robino, Charles V.

The solidification behavior of experimental Ni-base and Fe-base superalloys containing Nb, Si, and C was studied using differential thermal analysis (DTA) and microstructural characterization techniques. The solidification reaction sequences responsible for microstructural development were found to be similar to those expected in the Ni-Nb-C ternary system, where the solute-rich interdendritic liquid exhibited two eutectic-type reactions at the terminal stages of solidification: L → (γ + NbC) and L → (γ + Laves). A pseudo-ternary γ-Nb-C approach was developed to provide a quantitative description of solidification behavior for these experimental alloys. Solute redistribution calculations in the model are based on a previous approach developed by Mehrabian and Flemings, with modifications made to account for the high diffusion rate of C in the solid. Solidification parameters for Nb and C were determined through DTA and electron probe microanalysis techniques and used as inputs to the model. Reasonable agreement is found between calculated volume fractions of the γ/NbC and γ/Laves constituents and those measured experimentally. The modeling results permit detailed descriptions of the relation between alloy composition and microstructural evolution during solidification.
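
For orientation, the classical limiting form of such a solute-redistribution calculation for a slowly diffusing solute like Nb (full mixing in the liquid, no diffusion in the solid) is the Scheil relation below; the paper's model modifies this treatment to allow rapid back-diffusion of C in the solid:

    C_L = C_0\, f_L^{\,k-1}, \qquad f_L = 1 - f_S,

where C_0 is the nominal composition, k the equilibrium partition coefficient, and f_L the remaining liquid fraction.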


Data analysis for remote monitoring of safeguarded facilities

Deland, Sharon M.

The International Remote Monitoring Project (IRMP) sponsored by the US DOE allows DOE and its international partners to gain experience with the remote collection, transmission, and interpretation of safeguards-relevant data. This paper focuses on the interpretation of the data from these remote monitoring systems. Users of these systems need to be able to ascertain that the remote monitoring system is functioning as expected and that the events generated by the sensors are consistent with declared activity. The initial set of analytical tools being provided for IRMP installations this year include a suite of automatically generated views of user-selected data. The baseline set of tools, with illustrative examples, will be discussed. Plans for near-term enhancements will also be discussed. Finally, the applicability of more advanced analytical techniques such as expert systems will be discussed.


Low-level radioactive waste transportation safety history

Mcclure, J.D.

The Radioactive Materials Incident Report (RMIR) database was developed in 1981 at the Transportation Technology Center of Sandia National Laboratories to support its research and development activities for the US Department of Energy (DOE). This database contains information about radioactive material (RAM) transportation incidents that have occurred in the US since 1971. These data were drawn from the US Department of Transportation's (DOT) Hazardous Materials Incident Report system, from Nuclear Regulatory Commission (NRC) files, and from various agencies including state radiological control offices. Support for the RMIR database is funded by the US DOE National Transportation Program (NTP). Transportation events in RMIR are classified in one of the following ways: as a transportation accident, as a handling accident, or as a reported incident. This presentation will provide definitions for these classifications and give examples of each. The primary objective of this presentation is to provide information on nuclear materials transportation accident/incident events involving low-level waste (LLW) that have occurred in the US for the period 1971 through 1996. Among the areas to be examined are: transportation accidents by mode, package response during accidents, and an examination of accidents where release of contents has occurred. Where information is available, accident and incident history and package response for LLW packages in transportation accidents will be described.


Alumina strength degradation in the elastic regime

Furnish, Michael D.

Measurements of Kanel et al. [1991] have suggested that deviatoric stresses in glasses shocked to nearly the Hugoniot elastic limit (HEL) relax over a time span of microseconds after initial loading. Failure (damage) waves have been inferred on the basis of these measurements using time-resolved manganin normal and transverse stress gauges. Additional experiments on glass by other researchers, using time-resolved gauges, high-speed photography, and spall strength determinations, have also led to the same conclusions. In the present study the authors have conducted transmitted-wave experiments on high-quality Coors AD995 alumina shocked to roughly 5 and 7 GPa (just below or at the HEL). The material is subsequently reshocked to just above its elastic limit. Results of these experiments do show some evidence of strength degradation in the elastic regime.


Use of z-pinch sources for high-pressure shock wave experiments

Konrad, C.H.; Trott, W.M.; Hall, C.A.

Recent developments have demonstrated the use of pulsed power for producing intense radiation sources (z-pinches) that can drive planar shock waves in samples with spatial dimensions significantly larger than possible with other radiation sources. In this paper, the authors will discuss the use of z-pinch sources for shock wave studies at multi-Mbar pressures. Experimental plans to use the technique for absolute shock Hugoniot measurements, with accuracies comparable to those obtained with gun launchers, are also discussed.


D-dot and B-dot monitors for Z-vacuum-section power-flow measurements

Stygar, William A.

The 36-module Z accelerator--designed to drive z-pinch loads at currents up to 20 MA--is contained in a 33-m-diameter tank with oil, water, and vacuum sections. The peak total forward-going power in the 36 water-section bi-plate transmission lines is approximately 63 TW. Nine transmission lines deliver power to each of the four vacuum-section levels (referred to as levels A (the uppermost), B, C, and D). New differential D-dot and B-dot monitors were developed for the Z vacuum section. The D-dots measure voltage at the insulator stack. The B-dots measure current at the stack and in the outer magnetically-insulated transmission lines. Each monitor has two outputs that allow common-mode noise to be canceled to first order. The differential D-dot has one signal and one noise channel; the differential B-dot has two signal channels with opposite polarities. Each of the two B-dot sensors in the differential B-dot monitor has four 3-mm-diameter loops and is encased in copper to reduce flux penetration. For both types of probes, two 2.2-mm-diameter coaxial cables connect the outputs to a Prodyn balun for common-mode-noise rejection. The cables provide reasonable bandwidth and generate acceptable levels of Compton drive in Z's bremsstrahlung field. A new cavity B-dot is being developed to measure the total Z current 4.3 cm from the axis of the z-pinch load. All of the sensors are calibrated with 2--4% accuracy. The monitor signals are reduced with Barth or Weinschel attenuators, recorded on Tektronix 0.5-ns/sample digitizing oscilloscopes, compensated for cable response in software, and integrated.
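
Because D-dot and B-dot probes output the time derivative of the monitored quantity, the recorded traces must be numerically integrated (after attenuator and cable compensation) to recover voltage and current. A minimal sketch of that step follows, with entirely synthetic data and a made-up probe sensitivity:

    import numpy as np

    def integrate_dot_signal(t, vdot, sensitivity):
        """Recover a waveform from a derivative ("dot") probe trace.
        t: sample times (s); vdot: probe output (V); sensitivity: probe volts per V/s (assumed known)."""
        dvdt = vdot / sensitivity
        return np.concatenate(([0.0], np.cumsum(0.5 * (dvdt[1:] + dvdt[:-1]) * np.diff(t))))

    t = np.arange(0.0, 300e-9, 0.5e-9)                                       # 0.5-ns sampling, as in the text
    true_v = 1e6 * 0.5 * (1 - np.cos(np.pi * np.clip(t / 100e-9, 0, 1)))     # synthetic 100-ns-rise pulse
    vdot = np.gradient(true_v, t) * 1e-8                                     # fake probe output, sensitivity 1e-8
    print(integrate_dot_signal(t, vdot, sensitivity=1e-8).max())             # recovers ~1e6 V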


Design and performance of the Z magnetically-insulated transmission lines

Stygar, William A.

The 36-module Z accelerator was designed to drive z-pinch loads for weapon-physics and inertial-confinement-fusion experiments, and to serve as a testing facility for pulsed-power research required to develop higher-current drivers. The authors have designed and tested a 10-nH, 1.5-m-radius vacuum section for the Z accelerator. The vacuum section consists of four vacuum flares, four conical 1.3-m-radius magnetically-insulated transmission lines (MITLs), a 7.6-cm-radius 12-post double-post-hole convolute which connects the four outer MITLs in parallel, and a 5-cm-long inner MITL which connects the output of the convolute to a z-pinch load. IVORY and ELECTRO calculations were performed to minimize the inductance of the vacuum flares with the constraint that there be no significant electron emission from the insulator-stack grading rings. Iterative TLCODE calculations were performed to minimize the inductance of the outer MITLs with the constraint that the MITL electron-flow-current fraction be ≤7% at peak current. The TLCODE simulations assume a 2.5 cm/µs MITL-cathode-plasma expansion velocity. The design limits the electron dose to the outer-MITL anodes to 50 J/g to prevent the formation of an anode plasma. The TLCODE results were confirmed by SCREAMER, TRIFL, TWOQUICK, IVORY, and LASNEX simulations. For the TLCODE, SCREAMER, and TRIFL calculations, the authors assume that after magnetic insulation is established, the electron-flow current launched in the outer MITLs is lost at the convolute. This assumption has been validated by 3-D QUICKSILVER simulations for load impedances ≤0.36 ohms. LASNEX calculations suggest that ohmic resistance of the pinch and conduction-current-induced energy loss to the MITL electrodes can be neglected in Z power-flow modeling that is accurate to first order. To date, the Z vacuum section has been tested on 100 shots. These tests have demonstrated that the vacuum section can deliver a 100-ns rise-time, 20-MA current pulse to the baseline z-pinch load.


Application of reactors for testing neutron-induced upsets in commercial SRAMs

Griffin, Patrick J.

Reactor neutron environments can be used to test/screen the sensitivity of unhardened commercial SRAMs to low-LET neutron-induced upset. Tests indicate both thermal/epithermal (< 1 keV) and fast neutrons can cause upsets in unhardened parts. Measured upset rates in reactor environments can be used to model the upset rate for arbitrary neutron spectra.


Areal array jetting device for ball grid arrays

Frear, D.R.

Package designs for microelectronics devices have moved from through-hole to surface mount technology in order to increase the printed wiring board real estate available by utilizing both sides of the board. The traditional geometry for surface mount devices is peripheral arrays where the leads are on the edges of the device. As the technology drives towards high input/output (I/O) count (increasing number of leads) and smaller packages with finer pitch (less distance between peripheral leads), limitations on peripheral surface mount devices arise. A solution to the peripheral surface mount issue is to shift the leads to the area under the device. This scheme is called areal array packaging and is exemplified by the ball grid array (BGA) package. In a BGA package, the leads are on the bottom surface of the package in the form of an array of solder balls. The current practice of joining BGA packages to printed wiring boards involves a hierarchy of solder alloy compositions. A high melting temperature ball is typically used for standoff. A promising alternative to current methods is the use of jetting technology to perform monolithic solder ball attachment. This paper describes an areal array jetter that was designed and built to simultaneously jet arrays of solder balls directly onto BGA substrates.


Application & testing of high temperature materials for solenoid coils

Zich, John L.

Sandia National Laboratories has designed and proved in two new solenoid coils for a highly reliable electromechanical switch. Mil-Spec Magnetics Inc. of Walnut, CA, manufactured the coils. The new design utilizes two new materials: liquid crystal polymer (Vectra C130) for the bobbin and thermal barrier silicone (VI-SIL V-658) for the encapsulant. The use of these two new materials solved most of the manufacturing problems inherent in the old Sandia design. The coils are easier to precision wind and more robust for handling, testing, and storage. The coils have some unique weapon-related safety requirements. The most severe of these requirements is the 400 °C, 1600 V test. The coils must not, and did not, produce any outgassing products that would affect the voltage breakdown between contacts in the switch at these temperatures and voltages. Actual coils in switches were tested under these conditions. This paper covers the prove-in of this new coil design.


Active chatter control in a milling machine

Dohner, Jeffrey L.

The use of active feedback compensation to mitigate cutting instabilities in an advanced milling machine is discussed in this paper. A linear structural model delineating dynamics significant to the onset of cutting instabilities was combined with a nonlinear cutting model to form a dynamic depiction of an existing milling machine. The model was validated with experimental data. Modifications made to an existing machine model were used to predict alterations in dynamics due to the integration of active feedback compensation. From simulations, subcomponent requirements were evaluated and cutting enhancements were predicted. Active compensation was shown to enable a metal removal rate more than double that of conventional milling machines. 25 refs., 10 figs., 1 tab.
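
The instability being suppressed is regenerative chatter; a textbook single-degree-of-freedom version of the coupled structure/cutting dynamics (not the multi-degree-of-freedom model used in the paper) is

    m\ddot{x}(t) + c\dot{x}(t) + k\,x(t) = -K_c\, b\,[\,x(t) - x(t-\tau)\,],

where K_c is a cutting-force coefficient, b the chip width, and \tau the tooth-passing period; active compensation adds an actuator force to the left-hand side to raise the stable limit on b.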


Copper in silicon: Quantitative analysis of internal and proximity gettering

Mchugo, S.A.; Flink, C.; Weber, E.R.

The behavior of copper in the presence of a proximity gettering mechanism and a standard internal gettering mechanism in silicon was studied. He-implantation-induced cavities in the near-surface region were used as a proximity gettering mechanism, and oxygen precipitates in the bulk of the material provided internal gettering sites. Moderate levels of copper contamination were introduced by ion implantation such that the copper was not supersaturated during the anneals, thus providing realistic copper contamination/gettering conditions. Copper concentrations at cavities and internal gettering sites were quantitatively measured after annealing. In this manner, the gettering effectiveness of cavities was measured when in direct competition with internal gettering sites. The cavities were found to be the dominant gettering mechanism, with only a small amount of copper gettered at the internal gettering sites. These results reveal the benefits of a segregation-type gettering mechanism for typical contamination conditions.


A key management concept for the CTBT International Monitoring System

Craft, R.

Cryptographic authentication (commonly referred to as "technical authentication" in Working Group B) is an enabling technology which ensures the integrity of sensor data and the security of digital networks under various data security compromise scenarios. The use of cryptographic authentication, however, implies the development of a key management infrastructure for establishing trust in the generation and distribution of cryptographic keys. This paper proposes security and operational requirements for a CTBT (Comprehensive Test Ban Treaty) key management system and, furthermore, presents a public-key-based solution satisfying the requirements. The key management system is instantiated with trust distribution technologies similar to those currently implemented in industrial public key infrastructures. A complete system solution is developed.
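
To make the role of cryptographic authentication concrete, the sketch below signs a sensor data record and verifies it with the corresponding public key, using Ed25519 primitives from the Python `cryptography` package; this is an illustration of the concept only, and neither the algorithm nor the library is prescribed by the proposal:

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Key generation and distribution are the job of the key management infrastructure;
    # only the (trusted) public key is needed by data consumers.
    station_key = Ed25519PrivateKey.generate()
    public_key = station_key.public_key()

    record = b"station=ABC; channel=BHZ; t0=1997-09-01T00:00:00Z; samples=..."   # hypothetical record
    signature = station_key.sign(record)           # transmitted along with the record

    try:
        public_key.verify(signature, record)       # raises if record or signature was altered
        print("record authentic")
    except InvalidSignature:
        print("record rejected")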


Optical measurement of micromachine engine performance

Holswade, Scott C.

Understanding the mechanisms that impact the performance of Microelectromechanical Systems (MEMS) is essential to the development of optimized designs and drive signals, as well as the qualification of devices for commercial applications. Silicon micromachines include engines that consist of orthogonally oriented linear comb drive actuators mechanically connected to a rotating gear. These gears are as small as 50 µm in diameter and can be driven at rotation rates exceeding 300,000 rpm. Optical techniques offer the potential for measuring long-term statistical performance data and transient responses needed to optimize designs and manufacturing techniques. The authors describe the development of Micromachine Optical Probe (MOP) technology for the evaluation of micromachine performance. The MOP approach is based on the detection of optical signals scattered by the gear teeth or other physical structures. They present experimental results for a prototype system designed to measure engine parameters as well as long-term performance data.
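
One way to turn the scattered-light signal into an engine rotation rate is to locate the dominant tooth-passing frequency and divide by the number of teeth passing the probe spot per revolution; the sketch below does this with a synthetic signal, and the tooth count and sampling values are placeholders rather than actual MOP parameters:

    import numpy as np

    def rotation_rate_rpm(signal, fs, n_teeth):
        """Estimate rotation rate (rpm) from a periodic scattered-light signal.
        fs: sample rate (Hz); n_teeth: gear teeth passing the probe per revolution (assumed known)."""
        spec = np.abs(np.fft.rfft(signal - signal.mean()))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        tooth_freq = freqs[np.argmax(spec)]        # dominant tooth-passing frequency
        return tooth_freq / n_teeth * 60.0

    fs, n_teeth, rpm_true = 1.0e6, 12, 300_000.0   # 300,000 rpm, 12 teeth -> 60 kHz
    t = np.arange(0.0, 5e-3, 1.0 / fs)
    signal = 1.0 + 0.3 * np.sin(2 * np.pi * (rpm_true / 60.0) * n_teeth * t)
    print(rotation_rate_rpm(signal, fs, n_teeth))  # ~300000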


Minimizing sulfur contamination and rinse water volume required following a sulfuric acid/hydrogen peroxide clean by performing a chemically basic rinse

Clews, Peggy J.

Sulfuric acid hydrogen peroxide mixtures (SPM) are commonly used in the semiconductor industry to remove organic contaminants from wafer surfaces. This viscous solution is very difficult to rinse off wafer surfaces. Various rinsing conditions were tested and the resulting residual contamination on the wafer surface was measured. The addition of small amounts of a chemical base such as ammonium hydroxide to the rinse water has been found to be effective in reducing the surface concentration of sulfur and also mitigates the particle growth that occurs on SPM cleaned wafers. The volume of room temperature water required to rinse these wafers is also significantly reduced.


Intelligent tools and process development for robotic edge finishing: LDRD project final report

Lewis, Christopher L.

This report describes a project undertaken to develop an agile, automated, high-precision edge-finishing system for fabricating precision parts. The project involved re-designing and adding additional capabilities to an existing finishing work-cell. The resulting work-cell may serve as a prototype for production systems to be integrated into highly flexible automated production lines. The system removes burrs formed in the machining process and produces precision chamfers. The system uses an expert system to predict the burr size from the machining history. Within the CAD system, tool paths are generated for burr removal and chamfer formation. Then, the optimal grinding process is automatically selected from a database of processes. The tool trajectory and the selected process definition are then downloaded to a robotic control system to execute the operation. The robotic control system implements a hybrid fuzzy logic-classical control scheme to achieve the desired performance goals regardless of tolerance and fixturing errors. This report describes the system architecture and the system's performance.


Electro-thermal modeling of a microbridge gas sensor

Manginell, R.P.; Smith, J.H.; Ricco, A.J.; Hughes, R.C.; Moreno, D.J.; Huber, R.J.

Fully CMOS-compatible, surface-micromachined polysilicon microbridges have been designed, fabricated, and tested for use in catalytic, calorimetric gas sensing. To improve sensor behavior, extensive electro-thermal modeling efforts were undertaken using SPICE. The validity of the SPICE model was verified by comparing its simulated behavior with experiment. The temperature distribution of an electrically heated microbridge was measured using an infrared microscope. Comparisons among the measured distribution, the SPICE simulation, and distributions obtained by analytical methods show that heating at the ends of a microbridge has important implications for device response. Additional comparisons between measured and simulated current-voltage characteristics, as well as transient response, further support the accuracy of the model. A major benefit of electro-thermal modeling with SPICE is the ability to simultaneously simulate the behavior of a device and its control/sensing electronics. Results for the combination of a unique constant-resistance control circuit and microbridge gas sensor are given. Models of in situ techniques for monitoring catalyst deposition are shown to be in agreement with experiment. Finally, the simulated chemical response of the detector is compared with the data, and methods of improving response through modifications in bridge geometry are predicted.
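
The electro-thermal SPICE approach rests on the analogy between heat flow and circuits (thermal conductance with electrical conductance, heat capacity with capacitance, dissipated power with a current source). A lumped, two-node version of such a microbridge model, with made-up element values rather than the fitted ones from the paper, can be stepped in Python as follows:

    import numpy as np

    # Two lumped nodes along the bridge (center and end segment), each with a heat
    # capacity C (J/K), a conductance to the substrate G_sub (W/K), and Joule heating P (W).
    C      = np.array([2e-9, 4e-9])       # assumed values
    G_sub  = np.array([1e-6, 5e-6])
    G_link = 2e-6                         # conductance between the two nodes
    P      = np.array([1.0e-3, 0.2e-3])

    T, dt = np.zeros(2), 1e-6             # temperature rise above the substrate (K)
    for _ in range(20000):                # explicit stepping to ~steady state (20 ms)
        q01 = G_link * (T[0] - T[1])
        dT0 = (P[0] - G_sub[0] * T[0] - q01) / C[0]
        dT1 = (P[1] - G_sub[1] * T[1] + q01) / C[1]
        T = T + dt * np.array([dT0, dT1])
    print(T)                              # the center node runs hotter than the end segment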


Friction and wear in surface micromachined tribological test devices

Senft, D.C.; Dugger, M.T.

We report on the design, construction, and initial testing of surface-micromachined devices for measuring friction and wear. The devices measure friction coefficients on both horizontal deposited polysilicon surfaces and vertical etched polysilicon surfaces. The contact geometry of the rubbing surfaces is well-defined, and a method is presented for the determination of the normal and frictional forces. Initial observations on test devices which have been dried with supercritical CO₂ and devices coated with octadecyltrichlorosilane suggest that the coatings increase the lifetime of the devices and the repeatability of the results.


A summary of the sources of input parameter values for the Waste Isolation Pilot Plant final porosity surface calculations

Butcher, B.M.

A summary of the input parameter values used in final predictions of closure and waste densification in the Waste Isolation Pilot Plant disposal room is presented, along with supporting references. These predictions are referred to as the final porosity surface data and will be used for WIPP performance calculations supporting the Compliance Certification Application to be submitted to the U.S. Environmental Protection Agency. The report includes tables that list all of the input parameter values, references citing their sources, and in some cases references to more complete descriptions of the considerations leading to the selection of values.


Potts-model grain growth simulations: Parallel algorithms and applications

Wright, Steven A.

Microstructural morphology and grain boundary properties often control the service properties of engineered materials. This report uses the Potts model to simulate the development of microstructures in realistic materials. Three areas of microstructural morphology simulation were studied: the development of massively parallel algorithms for Potts-model grain growth simulations, the modeling of mass transport via diffusion in these simulated microstructures, and the development of a gradient-dependent Hamiltonian to simulate columnar grain growth. Potts grain growth models for massively parallel supercomputers were developed for the conventional Potts model in both two and three dimensions. Simulations using these parallel codes showed self-similar grain growth and no finite size effects for previously unapproachable large-scale problems. In addition, new enhancements to the conventional Metropolis algorithm used in the Potts model were developed to accelerate the calculations. These techniques enable both the sequential and parallel algorithms to run faster and to use essentially an infinite number of grain orientation values to avoid non-physical grain coalescence events. Mass transport phenomena in polycrystalline materials were studied in two dimensions using numerical diffusion techniques on microstructures generated using the Potts model. The results of the mass transport modeling showed excellent quantitative agreement with one-dimensional diffusion problems; however, the results also suggest that transient multi-dimensional diffusion effects cannot be parameterized as the product of the grain boundary diffusion coefficient and the grain boundary width. Instead, both properties are required. Gradient-dependent grain growth mechanisms were included in the Potts model by adding an extra term to the Hamiltonian. Under normal grain growth, the primary driving term is the curvature of the grain boundary, which is included in the standard Potts-model Hamiltonian.
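
A minimal serial sketch of the conventional Potts-model grain growth kernel described above (2-D square lattice, zero-temperature Metropolis acceptance); the report's parallel and accelerated algorithms build on this same update, and all values below are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    L, Q, attempts = 64, 50, 200_000            # lattice size, spin states, update attempts
    spins = rng.integers(0, Q, size=(L, L))

    def unlike_neighbors(s, i, j, q):
        # Potts Hamiltonian contribution: +1 per unlike nearest-neighbor pair
        nbrs = [((i - 1) % L, j), ((i + 1) % L, j), (i, (j - 1) % L), (i, (j + 1) % L)]
        return sum(1 for a, b in nbrs if s[a, b] != q)

    for _ in range(attempts):
        i, j = rng.integers(0, L, size=2)
        new_q = rng.integers(0, Q)
        dE = unlike_neighbors(spins, i, j, new_q) - unlike_neighbors(spins, i, j, spins[i, j])
        if dE <= 0:                             # kT = 0: accept only non-increasing energy
            spins[i, j] = new_q
    print(len(np.unique(spins)), "grain orientations remaining")   # coarsening reduces this count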


Electric utility capacity expansion and energy production models for energy policy analysis

Edenburn, Michael W.

This report describes electric utility capacity expansion and energy production models developed for energy policy analysis. The models use the same principles (life cycle cost minimization, least operating cost dispatching, and incorporation of outages and reserve margin) as comprehensive utility capacity planning tools, but are faster and simpler. The models were not designed for detailed utility capacity planning, but they can be used to accurately project trends on a regional level. Because they use the same principles as comprehensive utility capacity expansion planning tools, the models are more realistic than utility modules used in present policy analysis tools. They can be used to help forecast the effects energy policy options will have on future utility power generation capacity expansion trends and to help formulate a sound national energy strategy. The models make renewable energy source competition realistic by giving proper value to intermittent renewable and energy storage technologies, and by competing renewables against each other as well as against conventional technologies.
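
A compressed illustration of the least-operating-cost ("merit order") dispatch step these models use, with invented generator data; capacity expansion, outages, and reserve margins are layered on top of this basic calculation:

    def dispatch(load_mw, units):
        """Dispatch units in order of increasing operating cost until the load is met.
        units: list of (name, capacity_mw, cost_per_mwh); returns (schedule, total cost per hour)."""
        schedule, total_cost, remaining = {}, 0.0, load_mw
        for name, cap, cost in sorted(units, key=lambda u: u[2]):
            out = min(cap, remaining)
            schedule[name] = out
            total_cost += out * cost
            remaining -= out
            if remaining <= 0:
                break
        if remaining > 0:
            raise ValueError("insufficient capacity for this load")
        return schedule, total_cost

    units = [("wind", 200, 0.0), ("coal", 600, 22.0), ("gas_ct", 300, 45.0)]   # hypothetical
    print(dispatch(850, units))   # wind and coal run fully; the gas turbine covers the rest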


The role of technology in reducing health care costs. Final project report

Warren, S.

Sandia National Laboratories applied a systems approach to identifying innovative biomedical technologies with the potential to reduce U.S. health care delivery costs while maintaining care quality. This study was conducted by implementing both top-down and bottom-up strategies. The top-down approach used prosperity gaming methodology to identify future health care delivery needs. This effort provided roadmaps for the development and integration of technology to meet perceived care delivery requirements. The bottom-up approach identified and ranked interventional therapies employed in existing care delivery systems for a host of health-related conditions. Economic analysis formed the basis for development of care pathway interaction models for two of the most pervasive, chronic disease/disability conditions: coronary artery disease (CAD) and benign prostatic hypertrophy (BPH). Societal cost-benefit relationships based on these analyses were used to evaluate the effect of emerging technology in these treatment areas. 17 figs., 48 tabs.


Design, demonstration and evaluation of a thermal enhanced vapor extraction system

Phelan, James M.

The Thermal Enhanced Vapor Extraction System (TEVES), which combines powerline frequency (PLF) heating and radio frequency (RF) heating with vacuum soil vapor extraction, was used to effectively remove volatile organic compounds (VOCs) and semi-volatile organic compounds (SVOCs) from a pit in the chemical waste landfill (CWL) at Sandia National Laboratories (SNL) within a two-month heating period. Volume-average temperatures of 83 °C and 112 °C were reached for the PLF and RF heating periods, respectively, within the 15 ft x 45 ft x 18.5 ft deep treated volume. This resulted in the removal of 243 lb of measured toxic organic compounds (VOCs and SVOCs), 55 gallons of oil, and 11,000 gallons of water from the site. Reductions of up to 99% in total chromatographic organics (TCO) were achieved in the heated zone. Energy balance calculations for the PLF heating period showed that 36.4% of the heat added went to heating the soil, 38.5% went to evaporating water and organics, 4.2% went to sensible heat in the water, 7.1% went to heating the extracted air, and 6.6% was lost. For the RF heating period, … went to heating the soil, 23.5% went to evaporating water and organics, 2.4% went to sensible heat in the water, 7.5% went to heating extracted air, and 9.7% went to losses. Energy balance closure was 92.8% for the PLF heating and 98% for the RF heating. The energy input requirement per unit soil volume heated per unit temperature increase was 1.63 kWh/(yd³·°C) for PLF heating and 0.73 kWh/(yd³·°C) for RF heating.


Conceptual model for transport processes in the Culebra Dolomite Member, Rustler Formation

Holt, R.M.

The Culebra Dolomite Member of the Rustler Formation represents a possible pathway for contaminants from the Waste Isolation Pilot Plant underground repository to the accessible environment. The geologic character of the Culebra is consistent with a double-porosity, multiple-rate model for transport in which the medium is conceptualized as consisting of advective porosity, where solutes are carried by the groundwater flow, and fracture-bounded zones of diffusive porosity, where solutes move through slow advection or diffusion. As the advective travel length or travel time increases, the nature of transport within a double-porosity medium changes. This behavior is important for chemical sorption, because the specific surface area per unit mass of the diffusive porosity is much greater than in the advective porosity. Culebra transport experiments conducted at two different length scales show behavior consistent with a multiple-rate, double-porosity conceptual model for Culebra transport. Tracer tests conducted on intact core samples from the Culebra show no evidence of significant diffusion, suggesting that at the core scale the Culebra can be modeled as a single-porosity medium where only the advective porosity participates in transport. Field tracer tests conducted in the Culebra show strong double-porosity behavior that is best explained using a multiple-rate model.
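
The double-porosity picture sketched above is often written as an advective equation in the fracture (advective) porosity coupled to one-dimensional diffusion into the matrix; a single-rate version (the paper generalizes this to a distribution of rates and adds sorption through retardation factors) reads

    \frac{\partial C_a}{\partial t} + v\,\frac{\partial C_a}{\partial x} = -\,\frac{\phi_d}{\phi_a}\,\frac{\partial \bar{C}_d}{\partial t}, \qquad \frac{\partial C_d}{\partial t} = D_d\,\frac{\partial^2 C_d}{\partial z^2},

where C_a is the concentration in the advective porosity \phi_a, C_d(z) the concentration in the diffusive porosity \phi_d (with mean value \bar{C}_d over the matrix block), v the pore velocity, and D_d the matrix diffusion coefficient.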


Final report of the environmental measurement-while-drilling gamma ray spectrometer system technology demonstration at the Savannah River Site F-Area Retention Basin

Williams, Cecelia V.

The environmental measurement-while-drilling gamma ray spectrometer (EMWD-GRS) system represents an innovative blend of new and existing technology that provides real-time environmental and drill bit data during drilling operations. The EMWD-GRS technology was demonstrated at the Savannah River Site F-Area Retention Basin. The EMWD-GRS technology demonstration consisted of continuously monitoring for gamma-radiation-producing contamination while drilling two horizontal boreholes below the backfilled retention basin. These boreholes passed near previously sampled vertical borehole locations where concentrations of contaminant levels of cesium had been measured. Contaminant levels continuously recorded by the EMWD-GRS system during drilling are compared to contaminant levels previously determined through quantitative laboratory analysis of soil samples.
