Publications

Results 90551–90575 of 99,299

Active Research Topics in Human Machine Interfaces

McDonald, Michael J.

This paper identifies active research topics concerning human machine interfaces for intelligent machine systems. It was compiled through a series of literature searches, with the information organized to help the author better direct his own Human Machine Interface (HMI) research. Introductory literature from outside the HMI communities is also referenced to provide context.

SETEC/Semiconductor Manufacturing Technologies Program: 1999 Annual and Final Report

McBrayer, John D.

This report summarizes the results of work conducted by the Semiconductor Manufacturing Technologies Program at Sandia National Laboratories (Sandia) during 1999. This work was performed by one working group: the Semiconductor Equipment Technology Center (SETEC). The group's projects included Numerical/Experimental Characterization of the Growth of Single-Crystal Calcium Fluoride (CaF₂); The Use of High-Resolution Transmission Electron Microscopy (HRTEM) Imaging for Certifying Critical-Dimension Reference Materials Fabricated with Silicon Micromachining; Assembly Test Chip for Flip Chip on Board; Plasma Mechanism Validation: Modeling and Experimentation; and Model-Based Reduction of Contamination in Gate-Quality Nitride Reactor. During 1999, all projects focused on meeting customer needs in a timely manner and ensuring that projects were aligned with the goals of the National Technology Roadmap for Semiconductors sponsored by the Semiconductor Industry Association and with Sandia's defense mission. This report also provides a short history of the Sandia/SEMATECH relationship and a brief summary of all projects completed during the seven years of the program.

Aspen-EE: An Agent-Based Model of Infrastructure Interdependency

Barton, Dianne C.; Eidson, Eric D.; Schoenwald, David A.; Stamber, Kevin L.; Reinert, Rhonda K.

This report describes the features of Aspen-EE (Electricity Enhancement), a new model for simulating the interdependent effects of market decisions and disruptions in the electric power system on other critical infrastructures in the US economy. Aspen-EE extends and modifies the capabilities of Aspen, an agent-based model previously developed by Sandia National Laboratories. Aspen-EE was tested on a series of scenarios in which the rules governing electric power trades were changed. Analysis of the scenario results indicates that the power generation company agents adjust the quantity of power bid into each market as a function of the market rules. When two power markets are faced with identical economic circumstances, the traditionally higher-priced market sees its market clearing price decline, while the traditionally lower-priced market sees a relative increase. These results indicate that Aspen-EE predicts power market trends consistent with expected economic behavior.
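
To illustrate the kind of bid-adjustment dynamic described above, here is a minimal agent-based market sketch. The bidding rule, costs, and demand are illustrative assumptions, not Aspen-EE's actual agent logic.

```python
# Toy agent-based uniform-price power market: generators adjust the quantity
# they bid as a function of the last clearing price (illustrative rules only).
import random

class GeneratorAgent:
    """Generator that adapts its bid quantity to the previous clearing price."""
    def __init__(self, marginal_cost, capacity):
        self.marginal_cost = marginal_cost
        self.capacity = capacity
        self.bid_quantity = capacity

    def bid(self, last_price):
        # Offer more capacity when the market cleared above cost, less otherwise.
        if last_price is not None:
            scale = 1.1 if last_price > self.marginal_cost else 0.9
            self.bid_quantity = min(self.capacity, max(0.0, self.bid_quantity * scale))
        return (self.marginal_cost, self.bid_quantity)

def clear_market(agents, demand, last_price):
    """Stack bids by price and return the uniform clearing price."""
    bids = sorted(a.bid(last_price) for a in agents)
    served = 0.0
    for price, quantity in bids:
        served += quantity
        if served >= demand:
            return price          # the marginal unit sets the clearing price
    return bids[-1][0]            # demand exceeds supply: last offer sets price

agents = [GeneratorAgent(random.uniform(20, 60), 100.0) for _ in range(10)]
price = None
for day in range(30):
    price = clear_market(agents, demand=600.0, last_price=price)
print("clearing price after 30 rounds:", round(price, 2))
```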

A SAR ATR algorithm based on coherent change detection

Harmony, David W.

This report discusses an automatic target recognition (ATR) algorithm for synthetic aperture radar (SAR) imagery that is based on coherent change detection techniques. The algorithm relies on templates created from training data to identify targets. Objects are identified or rejected as targets by comparing their SAR signatures with templates, using the same complex correlation scheme developed for coherent change detection. Preliminary results are presented, along with recommendations for future work.
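
The complex correlation test at the heart of the algorithm can be sketched as a normalized coherent cross-correlation between a complex SAR chip and a template. The chip size, phase offset, and noise level below are illustrative assumptions, not values from the report.

```python
import numpy as np

def complex_correlation(chip, template):
    """Magnitude of the normalized complex cross-correlation (sample coherence)."""
    num = np.abs(np.sum(chip * np.conj(template)))
    den = np.sqrt(np.sum(np.abs(chip) ** 2) * np.sum(np.abs(template) ** 2))
    return num / den if den > 0 else 0.0

rng = np.random.default_rng(0)
template = rng.standard_normal((16, 16)) + 1j * rng.standard_normal((16, 16))
noise = 0.1 * (rng.standard_normal((16, 16)) + 1j * rng.standard_normal((16, 16)))
target = template * np.exp(1j * 0.3) + noise   # same signature, phase-rotated + noise
clutter = rng.standard_normal((16, 16)) + 1j * rng.standard_normal((16, 16))

print("target coherence: ", round(complex_correlation(target, template), 3))   # near 1
print("clutter coherence:", round(complex_correlation(clutter, template), 3))  # near 0
```

A detection threshold on this coherence value would then accept or reject each candidate object.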

Final Report for the Virtual Reliability Realization System LDRD

Dellin, Theodore A.; Henderson, Christopher L.; Toole, Edward J.

Current approaches to reliability are not adequate to keep pace with the need for faster, better, and cheaper products and systems. This is especially true in high-consequence-of-failure applications. The original proposal for the LDRD was to look at this challenge and see if there was a new paradigm that could make reliability predictions, along with a quantitative estimate of the risk in that prediction, in a way that was faster, better, and cheaper. Such an approach would be based on the underlying science models that are the backbone of reliability predictions. The new paradigm would be implemented in two software tools: the Virtual Reliability Realization System (VRRS) and the Reliability Expert System (REX). The three-year LDRD was funded at a reduced level for the first year ($120K vs. $250K) and not renewed. Because of the reduced funding, we concentrated on the initial development of the expert system. We developed an interactive semiconductor calculation tool needed for reliability analyses. We were also able to generate a basic functional system using Microsoft Site Server Commerce Edition and Microsoft SQL Server. The base system can store Office documents from multiple authors and can track and charge for usage. The full outline of the knowledge model has been incorporated, as well as examples of various types of content.

Advanced Production Planning Models

Jones, Dean A.; Lawton, Craig; Kjeldgaard, Edwin A.; Wright, Stephen T.

This report describes the innovative modeling approach developed as a result of a 3-year Laboratory Directed Research and Development project. The overall goal of this project was to provide an effective suite of solvers for advanced production planning at facilities in the nuclear weapons complex (NWC). We focused our development activities on problems related to operations at the DOE's Pantex Plant. These types of scheduling problems appear in many contexts other than Pantex, both within the NWC (e.g., Neutron Generators) and in commercial manufacturing settings. We successfully developed an innovative and effective solution strategy for these types of problems and tested it on actual data from Pantex and from Org. 14000 (Neutron Generator production). This report focuses on the mathematical representation of the modeling approach and presents three representative studies using Pantex data; results associated with the Neutron Generator facility will be published in a subsequent SAND report. The approach to task-based scheduling described here represents a significant addition to the literature for large-scale, realistic scheduling problems in a variety of production settings.
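
As a concrete, if greatly simplified, picture of the task-based setting, the sketch below list-schedules a handful of tasks with durations, precedence constraints, and a shared resource pool. It is a generic greedy heuristic for illustration only, not the report's mathematical model or solver.

```python
import heapq

# task name -> (duration, resource need, predecessor tasks); all values illustrative
tasks = {
    "prep":     (2, 1, []),
    "assembly": (4, 2, ["prep"]),
    "test":     (3, 1, ["assembly"]),
    "pack":     (1, 1, ["assembly"]),
}
CAPACITY = 2   # shared resource pool size (assumed)

time, running, done, start_times = 0, [], set(), {}
while len(done) < len(tasks):
    in_use = sum(tasks[n][1] for _, n in running)
    # Start every ready task that fits in the remaining resource capacity.
    for name, (dur, need, preds) in tasks.items():
        if name not in done and name not in start_times and set(preds) <= done:
            if in_use + need <= CAPACITY:
                heapq.heappush(running, (time + dur, name))
                start_times[name] = time
                in_use += need
    # Advance the clock to the next task completion.
    finish, name = heapq.heappop(running)
    time = finish
    done.add(name)

print("start times:", start_times, "| makespan:", time)
```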

PICO: An Object-Oriented Framework for Branch and Bound

Hart, William E.; Phillips, Cynthia A.

This report describes the design of PICO, a C++ framework for implementing general parallel branch-and-bound algorithms. The PICO framework provides a mechanism for the efficient implementation of a wide range of branch-and-bound methods on an equally wide range of parallel computing platforms. We first discuss the basic architecture of PICO, including the application class hierarchy and the package's serial and parallel layers. We next describe the design of the serial layer, and its central notion of manipulating subproblem states. Then, we discuss the design of the parallel layer, which includes flexible processor clustering and communication rates, various load balancing mechanisms, and a non-preemptive task scheduler running on each processor. We describe the application of the package to a branch-and-bound method for mixed integer programming, along with computational results on the ASCI Red massively parallel computer. Finally, we describe the application of the branch-and-bound mixed-integer programming code to a resource-constrained project scheduling problem for Pantex.
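
A toy serial branch-and-bound loop, organized around subproblem states in the spirit described above, can be written in a few lines; the sketch below solves a small 0/1 knapsack with an LP-relaxation bound. It is an illustrative Python toy, not PICO's C++ API.

```python
import heapq

values  = [10, 13, 7, 8]   # illustrative instance
weights = [4, 6, 3, 5]
capacity = 10

def upper_bound(level, value, room):
    """LP-relaxation bound: greedily add fractional items by value density."""
    items = sorted(range(level, len(values)), key=lambda i: -values[i] / weights[i])
    bound = value
    for i in items:
        take = min(weights[i], room)
        bound += values[i] * take / weights[i]
        room -= take
    return bound

best = 0
# Each heap entry is a subproblem state: (-bound, level, incumbent value, room left).
heap = [(-upper_bound(0, 0, capacity), 0, 0, capacity)]
while heap:
    neg_bound, level, value, room = heapq.heappop(heap)
    if -neg_bound <= best:      # fathom: bound no better than the incumbent
        continue
    if level == len(values):
        best = max(best, value)
        continue
    # Branch: include item `level` if it fits, then exclude it.
    if weights[level] <= room:
        child = (value + values[level], room - weights[level])
        best = max(best, child[0])
        heapq.heappush(heap, (-upper_bound(level + 1, *child), level + 1, *child))
    heapq.heappush(heap, (-upper_bound(level + 1, value, room), level + 1, value, room))

print("optimal value:", best)
```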

ATR2000 Mercury/MPI Real-Time ATR System User's Guide

Meyer, R.H.; Doerfler, Douglas W.

The Air Force's Electronic Systems Center has funded Sandia National Laboratories to develop an Automatic Target Recognition (ATR) system for the Air Force's Joint STARS platform using Mercury Computer Systems hardware. This report provides general theory on the internal operations of the Real-Time ATR system and describes some basic techniques for reconfiguring the system and monitoring its runtime operation. In addition, general information is provided on how to interface an image formation processor and a human machine interface to the ATR. This report is not meant to be a tutorial on the ATR algorithms.

Pointing Control System for a High Precision Flight Telescope

Bentley, Anthony E.; Wilcoxen, Jeffrey L.

A pointing control system is developed and tested for a flying gimbaled telescope. The two-axis pointing system is capable of sub-microradian pointing stability and high accuracy in the presence of large host-vehicle jitter. The telescope is also highly agile: it is capable of a 50-degree retarget (in both axes simultaneously) in less than 2 seconds. To achieve the design specifications, high-accuracy, high-resolution, two-speed resolvers were used, resulting in gimbal-angle measurements stable to 1.5 microradians. In addition, on-axis inertial angle displacement sensors were mounted on the telescope to provide host-vehicle jitter cancellation. The inertial angle sensors are accurate to about 100 nanoradians but do not measure low-frequency displacements below 2 Hz. The gimbal command signal includes host-vehicle attitude information, which is band-limited. This provides jitter data below 20 Hz, but with a variable latency of 15 to 25 milliseconds. One of the most challenging aspects of this design was to combine the inertial-angle-sensor data with the less perfect information in the command signal to achieve maximum jitter reduction. The optimum blending of these two signals, along with the feedback compensation, was designed using Quantitative Feedback Theory.
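
The blending problem described in the last two sentences can be illustrated with a first-order complementary filter: trust the inertial sensors above the 2 Hz limit quoted above and the delayed command-signal attitude below it. The report's actual design used Quantitative Feedback Theory; the sample rate, signal content, and fixed 20 ms latency here are simplifying assumptions.

```python
import numpy as np

fs = 1000.0; dt = 1.0 / fs                  # sample rate, Hz (assumed)
f_c = 2.0                                   # crossover matched to the sensors' 2 Hz limit
a = 1.0 / (1.0 + 2.0 * np.pi * f_c * dt)    # one-pole filter coefficient

t = np.arange(0, 4.0, dt)
# Synthetic jitter: large slow motion plus small fast vibration, radians.
true_jitter = 1e-3 * np.sin(2 * np.pi * 0.5 * t) + 1e-5 * np.sin(2 * np.pi * 40 * t)

# Inertial angle sensors: accurate above ~2 Hz (modeled as a one-pole high-pass).
inertial = np.zeros_like(t)
for k in range(1, len(t)):
    inertial[k] = a * (inertial[k - 1] + true_jitter[k] - true_jitter[k - 1])

# Command-signal attitude: full low-frequency content, but ~20 ms late.
delay = int(0.020 * fs)
command = np.concatenate([np.zeros(delay), true_jitter[:-delay]])

# Complementary blend: low-pass the command signal, add the high-passed inertial angle.
blend = np.zeros_like(t)
for k in range(1, len(t)):
    blend[k] = a * blend[k - 1] + (1 - a) * command[k]
blend += inertial

resid = blend - true_jitter
print(f"raw jitter rms:       {np.sqrt(np.mean(true_jitter**2)):.2e} rad")
print(f"residual after blend: {np.sqrt(np.mean(resid[len(t)//2:]**2)):.2e} rad")
```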

Lightning Induced Arcing: An LDRD Report

Jorgenson, Roy E.; Warne, Larry K.

The purpose of this research was to develop a science-based understanding of the early-time behavior of electric surface arcing in air at atmospheric pressure. As a first step, we used a kinetic approach to model an electron swarm as it evolved in a neutral gas under the influence of an applied electric field. A computer code was written in which pseudo-particles, each representing some number of electrons, were accelerated by an electric field. The electric field due to the charged particles was calculated efficiently using a tree algorithm. Collisions of the electrons with the background gas led to the creation of new particles through the processes of ionization and photoionization. These processes were accounted for using measured cross-section data and Monte Carlo methods. A dielectric half-space was modeled by imaging the charges in its surface. Secondary electron emission from the surface, resulting in surface charging, was also calculated. Simulation results show the characteristics of a streamer in three dimensions. A numerical instability was encountered before the streamer matured enough to form branches.
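
A one-dimensional toy version of the kinetic step described above, with pseudo-particles accelerated by the field and ionization sampled by Monte Carlo, looks like this. The flat cross section, threshold, field strength, and energy-sharing rule are all illustrative assumptions, not the report's measured data, and the space-charge tree algorithm, photoionization, and surface imaging are omitted.

```python
import numpy as np

QE, ME = 1.602e-19, 9.109e-31   # electron charge (C) and mass (kg)
N_GAS = 2.5e25                  # neutral number density at ~1 atm, m^-3
SIGMA = 1e-20                   # flat ionization cross section above threshold, m^2 (assumed)
THRESHOLD_EV = 15.0             # ionization threshold, eV (illustrative)
E_FIELD = 5e6                   # applied field, V/m (illustrative)
dt = 1e-13                      # time step, s

rng = np.random.default_rng(1)
vz = np.zeros(10)               # axial speeds of 10 initial pseudo-electrons

for _ in range(200):
    vz += (QE * E_FIELD / ME) * dt                        # free-flight acceleration
    energy_eV = 0.5 * ME * vz**2 / QE
    # Per-step ionization probability: n * sigma * v * dt, above threshold only.
    p = np.where(energy_eV > THRESHOLD_EV, N_GAS * SIGMA * vz * dt, 0.0)
    ionized = rng.random(vz.size) < p
    if ionized.any():
        vz[ionized] *= 0.5                                # crude energy sharing on ionization
        vz = np.concatenate([vz, np.zeros(ionized.sum())])  # secondaries start at rest

print("swarm grew from 10 to", vz.size, "pseudo-electrons in 20 ps")
```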

Massively Parallel Methods for Simulating the Phase-Field Model

Tikare, Veena

Prediction of the evolution of microstructures in weapons systems is critical to meeting the objectives of stockpile stewardship in accordance with the Nuclear Weapons Test Ban Treaty. For example, accurate simulation of microstructural evolution in solder joints, cermets, PZT power generators, etc., is necessary for predicting the performance, aging, and reliability both of individual components and of entire weapons systems. A recently developed and promising approach called the "Phase-Field Model" (PFM) has the potential to allow accurate quantitative prediction of microstructural evolution, with all the spatial and thermodynamic complexity of a real microstructure. Simulating with the PFM requires solving a set of coupled nonlinear differential equations, one for each material variable (e.g., grain orientation, phase, composition, stresses, anisotropy, etc.). While the PFM is versatile and able to incorporate the necessary complexity for modeling real material systems, it is very computationally intensive, and formulating an efficient algorithmic implementation of the approach has been a difficult and major challenge. We found that a second-order-in-space algorithm is more stable and leads to more accurate results. However, the computational requirements remain high, so we developed a single-field algorithm to reduce the computations by two orders of magnitude. We created a 3-D parallel version of the basic phase-field (PF) model and benchmarked its performance. Preliminary results indicate that we will be able to run very large problems effectively with the new parallel code. Microstructural evolution in a diffusion couple was simulated using the PFM to simultaneously model grain growth, diffusion, and phase transformation. Solute drag in a variable-composition material, a process no other model can simulate, was successfully simulated with the phase-field model. The phase-field model was also used to study the evolution of fractal high-curvature structures, showing that these structures have very different morphological and kinetic behaviors than equiaxed structures.
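
For concreteness, here is a minimal explicit phase-field update for a single non-conserved order parameter (Allen-Cahn form), using a second-order-in-space Laplacian stencil as mentioned above. Grid size, mobility, time step, and the double-well potential are illustrative choices, not the report's parameters.

```python
import numpy as np

n, dx, dt, M, kappa = 128, 1.0, 0.1, 1.0, 1.0   # illustrative parameters
rng = np.random.default_rng(2)
phi = 0.01 * rng.standard_normal((n, n))        # small random initial fluctuations

def laplacian(f):
    """Second-order five-point stencil with periodic boundaries."""
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f) / dx**2

for step in range(2000):
    dF_dphi = phi**3 - phi            # derivative of the double well phi^4/4 - phi^2/2
    phi -= dt * M * (dF_dphi - kappa * laplacian(phi))

# Domains coarsen toward phi = +1 / -1, separated by diffuse interfaces.
print("mean |phi|:", round(float(np.abs(phi).mean()), 2),
      "| phase fractions:", round(float((phi > 0).mean()), 2),
      "/", round(float((phi < 0).mean()), 2))
```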

Application of Roll-Isolated Inertial Measurement Units to the Instrumentation of Spinning Vehicles

Beader, Mark E.

Roll-isolated inertial measurement units are developed at Sandia for use in the instrumentation, guidance, and control of rapidly spinning vehicles. Roll-isolation is accomplished by supporting the inertial instrument cluster (gyros and accelerometers) on a single gimbal, the axis of which is parallel to the vehicle's spin axis. A rotary motor on the gimbal is driven by a servo loop to null the roll gyro output, thus inertially stabilizing the gimbal and instrument cluster while the vehicle spins around it. Roll-isolation prevents saturation of the roll gyro by the high vehicle spin rate, and vastly reduces measurement errors arising from gyro scale factor and alignment uncertainties. Nine versions of Sandia-developed roll-isolated inertial measurement units have been flown on a total of 27 flight tests since 1972.
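
The null-loop idea can be illustrated with a toy simulation: a PI servo drives the gimbal motor until the roll gyro on the instrument cluster reads zero inertial rate while the vehicle spins. The gains, motor time constant, and spin rate are illustrative assumptions, not Sandia hardware parameters.

```python
import numpy as np

dt, tau = 1e-3, 0.01          # time step and motor time constant, s (assumed)
spin_rate = 20 * 2 * np.pi    # vehicle spin rate: 20 rev/s (illustrative), rad/s
kp, ki = 5.0, 50.0            # PI gains (illustrative)

gimbal_rate = 0.0             # gimbal rate relative to the vehicle, rad/s
integral = 0.0
for _ in range(5000):
    sensed = spin_rate + gimbal_rate           # roll gyro sees the cluster's inertial rate
    integral += sensed * dt
    command = -(kp * sensed + ki * integral)   # motor rate command that nulls the gyro
    gimbal_rate += (command - gimbal_rate) * dt / tau   # first-order motor response

print(f"residual inertial roll rate: {spin_rate + gimbal_rate:.2e} rad/s")
```

At steady state the gimbal counter-rotates at exactly the spin rate, so the instrument cluster is inertially stabilized and the roll gyro never saturates.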

Dispersive Velocity Measurements in Heterogeneous Materials

Trott, Wayne M.; Castañeda, Jaime N.; Baer, M.R.; Chhabildas, L.C.; Knudson, Marcus D.; Davis, Jean-Paul; Asay, J.R.

To provide real-time data for validating three-dimensional numerical simulations of heterogeneous materials subjected to impact loading, an optically recording velocity interferometer system (ORVIS) has been adapted into a line-imaging instrument capable of precise mesoscopic-scale measurements of spatially resolved velocity variations during dynamic deformation. Combining independently variable target magnification and interferometer fringe spacing, the instrument can probe a velocity field along line segments up to 15 mm in length. In high-magnification operation, spatial resolution better than 10 µm can be achieved. For events appropriate to short recording times, streak camera recording can provide temporal resolution better than 0.2 ns. A robust method for extracting spatially resolved velocity-time profiles from streak camera image data has been developed and incorporated into a computer program that uses a standard VISAR analysis platform. The use of line-imaging ORVIS to measure the mesoscopic-scale dynamic response of shocked samples has been demonstrated on several different classes of heterogeneous materials. Studies have focused on pressed, granular sugar as a simulant for the widely used explosive HMX. For low-density (65% of theoretical maximum density) pressings of sugar, material response has been investigated as a function of both impact velocity and changes in particle size distribution. The experimental results provide a consistent picture of the dispersive nature of the wave transmitted through these samples and reveal both transverse and longitudinal wave structures on mesoscopic scales. This observed behavior is consistent with the highly structured mesoscopic response predicted by 3-D simulations. Preliminary line-imaging ORVIS measurements on HMX and on other heterogeneous materials such as foam and glass-reinforced polyester are also discussed.
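
The fringe-to-velocity step can be sketched for a single spatial position along the line: track the unwrapped fringe phase over time and scale by the velocity-per-fringe constant. The VPF value, carrier frequency, and synthetic 300 m/s jump below are illustrative assumptions, not the report's extraction method in detail.

```python
import numpy as np

VPF = 500.0                 # velocity-per-fringe constant, m/s (assumed)
fs = 5e9                    # effective sample rate along the streak, Hz (assumed)
t = np.arange(4096) / fs
true_v = 300.0 / (1 + np.exp(-(t - 300e-9) / 5e-9))   # smooth ~300 m/s jump-off

carrier = 0.1 * fs          # static fringe frequency on the streak record
record = np.cos(2 * np.pi * (carrier * t + true_v / VPF))

def analytic(x):
    """FFT-based analytic signal (numpy-only Hilbert transform)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x)); h[0] = h[len(x) // 2] = 1.0; h[1:len(x) // 2] = 2.0
    return np.fft.ifft(X * h)

phase = np.unwrap(np.angle(analytic(record)))
fringe_shift = phase / (2 * np.pi) - carrier * t
v = VPF * (fringe_shift - fringe_shift[200:800].mean())   # reference to the quiet baseline

print(f"recovered velocity late in the record: {v[3000:3800].mean():.0f} m/s")
```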

Visualization of Information Spaces with VxInsight

Wylie, Brian N.; Boyack, Kevin W.; Davidson, George S.

VxInsight provides a visual mechanism for browsing, exploring, and retrieving information from a database. The graphical display conveys information about the relationships between objects in several ways and at multiple scales, so individual objects are always observed within a larger context. For example, consider a database consisting of a set of scientific papers. Imagine that the papers have been organized in a two-dimensional geometry so that related papers are located close to each other. Now construct a landscape where the altitude reflects the local density of papers. Papers on physics will form a mountain range, and a different range will stand over the biological papers. In between will be research reports from biophysics and other bridging disciplines. Now, imagine exploring these mountains. If we zoom in closer, the physics mountains resolve into a set of sub-disciplines. Eventually, by zooming in far enough, the individual papers become visible. By pointing and clicking you can learn more about papers of interest or retrieve their full text. Although physical proximity conveys a great deal of information about the relationships between documents, you can also see which papers reference which others by drawing lines between the citing and cited papers. For even more information, you can highlight papers by a particular researcher or institution, or show the accumulation of papers through time, watching some disciplines explode and others stagnate. VxInsight is a general-purpose tool that enables this kind of interaction with a wide variety of relational data: documents, patents, web pages, and financial transactions are just a few examples. The tool allows users to interactively browse, explore, and retrieve information from the database in an intuitive way.
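
The landscape metaphor is straightforward to sketch: lay papers out in 2-D and let altitude be a kernel-smoothed local density, so clusters of related papers become mountains. The coordinates below are random stand-ins for a real similarity layout.

```python
import numpy as np

rng = np.random.default_rng(3)
# Two "disciplines" as point clusters, plus a sparse bridge between them.
physics = rng.normal([2, 2], 0.4, size=(200, 2))
biology = rng.normal([6, 6], 0.4, size=(200, 2))
bridge  = rng.normal([4, 4], 0.8, size=(30, 2))
papers = np.vstack([physics, biology, bridge])

def altitude(grid_pts, points, bandwidth=0.5):
    """Gaussian-kernel density: tall where many related papers sit together."""
    d2 = ((grid_pts[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth**2)).sum(1)

xs = np.linspace(0, 8, 40)
grid = np.array([(x, y) for y in xs for x in xs])
z = altitude(grid, papers).reshape(40, 40)
peak_row, peak_col = np.unravel_index(z.argmax(), z.shape)
print("highest peak near:", round(xs[peak_col], 1), round(xs[peak_row], 1))
```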

Final Report: Weighted Neighbor Data Mining

Carlson, Jeffrey; Muguira, Maritza R.

Data mining involves the discovery and fusion of features from large databases to establish minimal-probability-of-error (MPE) decision and estimation models. Our approach combines a weighted nearest neighbor (WNN) decision model for classification and estimation with genetic algorithms (GA) for feature discovery and model optimization. The WNN model provides a mathematical framework for adaptively discovering and fusing features into near-MPE decision algorithms; the GA discovers weighted features and selects decision points for the WNN model to achieve near-MPE decisions. The performance of the WNN fusion model is demonstrated on the first of two very different problems, chosen to show its robust and practical application to a wide variety of data-mining problems. The first problem involves isolating risk factors for hepatitis C virus (HCV) infection and requires the evaluation of large databases to establish the critical features that can detect, with minimal error, whether a person is at risk of having HCV. This requires discovering and extracting relevant information (features) from a questionnaire database and combining (fusing) them to achieve a minimal-error decision rule. The primary objective of the research is to develop a practical basis for fusing information from questionnaires administered at hospitals to identify and verify features important for isolating HCV risk factors. The basic problem involves creating a feature database from the questionnaire information and discovering features that reliably identify when a person is at risk, despite uncertainties caused by recording errors and evasive answers from people completing the questionnaire. The results of this study demonstrate the WNN fusion algorithm's ability to perform in supervised learning environments. The second phase of the research project is directed at the unsupervised learning environment, in which the feature data is presented without any classification. Clustering algorithms are developed to partition the feature data into clusters based on similarity measure models. After the feature data is clustered and classified, the supervised WNN fusion algorithms are used to classify the data based on the minimal-probability-of-error decision rule.
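
A minimal weighted-nearest-neighbor classifier of the kind described above is sketched below, with hand-picked feature weights standing in for values a GA would discover. The data and weights are illustrative, not from the HCV questionnaire study.

```python
import numpy as np

def wnn_predict(x, train_X, train_y, weights, k=3):
    """Classify x by majority vote among the k nearest weighted-distance neighbors."""
    d = np.sqrt(((weights * (train_X - x)) ** 2).sum(axis=1))
    nearest = np.argsort(d)[:k]
    return np.bincount(train_y[nearest]).argmax()

rng = np.random.default_rng(4)
# Two classes separated only in feature 0; feature 1 is noise a GA should down-weight.
X0 = np.column_stack([rng.normal(0, 1, 100), rng.normal(0, 5, 100)])
X1 = np.column_stack([rng.normal(3, 1, 100), rng.normal(0, 5, 100)])
X = np.vstack([X0, X1]); y = np.array([0] * 100 + [1] * 100)

for w in (np.array([1.0, 1.0]), np.array([1.0, 0.0])):   # unweighted vs. GA-style weights
    preds = [wnn_predict(X[i], np.delete(X, i, 0), np.delete(y, i), w) for i in range(len(X))]
    print("weights", w, "leave-one-out accuracy:", np.mean(np.array(preds) == y))
```

Down-weighting the noisy feature raises the leave-one-out accuracy, which is exactly the kind of signal a GA fitness function would exploit.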

Final Report of LDRD Project: An Electromagnetic Imaging System for Environmental Site Reconnaissance

Denison, Gary J.; Loubriel, Guillermo M.; Buttram, Malcolm T.; Rinehart, Larry F.; O'Malley, Martin W.; Zutavern, Fred J.

This report provides a summary of the LDRD project titled "An Electromagnetic Imaging System for Environmental Site Reconnaissance." The major initial challenge of this LDRD was to develop a ground penetrating radar (GPR) whose peak and average radiated power surpassed that of any other in existence. Goals were set to use such a system to detect the following: (1) disrupted soil layers where there is potential for buried waste, (2) buried objects such as 55-gallon drums at depths up to 3 m, and (3) contaminated soil. Initial modeling of the problem suggested that for soil conditions similar to Puerto Rican clay loam with 10 percent moisture content (conductivity = 0.01 mhos/m at 350 MHz), a buried 55-gallon drum could be detected in a straightforward manner by a UWB GPR system at a depth of 3 meters. From the simulations, the highest attenuation (-50 dB) was the result of scattering from a 3-m-deep, vertically oriented drum. A system loss of -100 dB is a typical limit for all kinds of radar systems (either direct time-domain or swept-frequency). The modeling work also determined that the waveshape of the pulse scattered off the buried drum would be relatively insensitive to drum orientation, and thus easier to detect with the GPR system.
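
A back-of-envelope check of the attenuation reasoning above: for a low-loss soil, the attenuation constant is approximately alpha = (sigma/2)·sqrt(mu/eps), so the two-way soil loss at 3 m can be estimated directly. Only the conductivity comes from the report's scenario; the soil permittivity below is an assumed value.

```python
import math

sigma = 0.01       # soil conductivity from the scenario above, S/m
eps_r = 10.0       # assumed relative permittivity for moist clay loam
eta0 = 376.73      # impedance of free space, ohms
depth = 3.0        # burial depth, m

# Low-loss approximation: alpha ~ (sigma/2) * sqrt(mu/eps), in nepers per meter.
alpha = (sigma / 2.0) * eta0 / math.sqrt(eps_r)
two_way_db = 2.0 * 8.686 * alpha * depth   # 8.686 dB per neper
print(f"two-way soil attenuation at 3 m: {two_way_db:.0f} dB")
# The remaining budget below a -100 dB system limit must cover spreading and
# scattering losses (the report quotes -50 dB total for the vertical drum case).
```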
