Publications

Results 88176–88200 of 99,299

On the Held-Karp relaxation for the asymmetric and symmetric traveling salesman problems

Mathematical Programming

Carr, Robert D.

A long-standing conjecture in combinatorial optimization says that the integrality gap of the famous Held-Karp relaxation of the metric STSP (Symmetric Traveling Salesman Problem) is precisely 4/3. In this paper, we show that a slight strengthening of this conjecture implies a tight 4/3 integrality gap for a linear programming relaxation of the metric ATSP (Asymmetric Traveling Salesman Problem). Our main tools are a new characterization of the integrality gap for linear objective functions over polyhedra, and the isolation of "hard-to-round" solutions of the relaxations. © Springer-Verlag 2004.
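For context, the Held-Karp relaxation of the STSP referred to above is the standard subtour-elimination linear program, sketched here in common notation (with $x_e$ the LP weight on edge $e$ and $\delta(S)$ the set of edges with exactly one endpoint in $S$):

```latex
\begin{aligned}
\min \quad & \textstyle\sum_{e \in E} c_e x_e \\
\text{s.t.} \quad & x(\delta(v)) = 2 && \forall\, v \in V, \\
& x(\delta(S)) \ge 2 && \forall\, \emptyset \neq S \subsetneq V, \\
& 0 \le x_e \le 1 && \forall\, e \in E.
\end{aligned}
```

The 4/3 conjecture asserts that, over metric instances, the worst-case ratio of the optimal tour length to the optimum of this LP is exactly 4/3.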

An explanation for the minimal effect of body curvature on hypervelocity penetration hole formation

International Journal of Solids and Structures

Dechant, Lawrence

Though not discussed extensively in the literature, it is known among workers in impact and penetration dynamics, e.g. the CTH analysis and development team at Sandia National Laboratories, that curvature of thin plates has a minimal effect on the penetration hole diameter due to a hypervelocity impact. To understand why curvature has such a small effect on penetration hole size, we extend a flat-plate penetration hole diameter relationship (De Chant (2004a) Unpublished manuscript; De Chant (2004b) Mechanics of Materials, in press) to include the effect of body curvature. The effect of body curvature on the hole diameter is shown to scale with the dimensionless ratio of plate thickness to body radius of curvature, i.e. h/R, which is typically small. Indeed, for most problems where a single-layer shell (plate) can be meaningfully defined, the effect of curvature upon hole diameter is on the order of other uncertainties in the problem, e.g. doubts concerning the appropriate equation of state and strength model, and is therefore often negligible. © 2004 Published by Elsevier Ltd.

A generalized approximation for the thermophoretic force on a free-molecular particle

Aerosol Science and Technology

Gallis, Michael A.; Rader, Daniel J.; Torczynski, John R.

A general, approximate expression is described that can be used to predict the thermophoretic force on a free-molecular, motionless, spherical particle suspended in a quiescent gas with a temperature gradient. The thermophoretic force is equal to the product of an order-unity coefficient, the gas-phase translational heat flux, the particle cross-sectional area, and the inverse of the mean molecular speed. Numerical simulations are used to test the accuracy of this expression for monatomic gases, polyatomic gases, and mixtures thereof. Both continuum and noncontinuum conditions are examined; in particular, the effects of low pressure, wall proximity, and high heat flux are investigated. The direct simulation Monte Carlo (DSMC) method is used to calculate the local molecular velocity distribution, and the force-Green's-function method is used to calculate the thermophoretic force. The approximate expression is found to predict the calculated thermophoretic force to within 10% for all cases examined.
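The scaling described in the abstract can be illustrated with a short sketch. The order-unity coefficient is set to 1.0 here purely as an assumption for illustration; it is not the value derived in the paper:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_molecular_speed(temperature, molecular_mass):
    """Mean thermal speed of gas molecules, sqrt(8 k T / (pi m))."""
    return math.sqrt(8.0 * K_B * temperature / (math.pi * molecular_mass))

def thermophoretic_force(heat_flux, particle_radius, temperature,
                         molecular_mass, coefficient=1.0):
    """Approximate free-molecular thermophoretic force magnitude:
    F = coefficient * q * A / c_bar, where A is the particle
    cross-sectional area and c_bar the mean molecular speed.
    The coefficient is order unity (1.0 here is an assumption)."""
    cross_section = math.pi * particle_radius ** 2
    c_bar = mean_molecular_speed(temperature, molecular_mass)
    return coefficient * heat_flux * cross_section / c_bar

# Example: a 1-micron-radius particle in argon (m ~ 6.63e-26 kg) at 300 K
# with a translational heat flux of 100 W/m^2.
force = thermophoretic_force(100.0, 1e-6, 300.0, 6.63e-26)
```

The sketch makes the stated proportionalities explicit: the force grows linearly with heat flux and cross-sectional area, and falls with the mean molecular speed.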

A response-modeling approach to characterization and propagation of uncertainty specified over intervals

Reliability Engineering and System Safety

Rutherford, Brian

Computational simulation methods have advanced to a point where simulation can contribute substantially in many areas of systems analysis. One research challenge that has accompanied this transition involves the characterization of uncertainty in both computer model inputs and the resulting system response. This article addresses a subset of the 'challenge problems' posed in [Challenge problems: uncertainty in system response given uncertain parameters, 2001], where uncertainty or information is specified over intervals of the input parameters and inferences based on the response are required. The emphasis of the article is to describe and illustrate a method for performing tasks associated with this type of modeling 'economically', requiring relatively few evaluations of the system to obtain a precise estimate of the response. This 'response-modeling approach' is used to approximate a probability distribution for the system response. The distribution is then used: (1) to make inferences concerning probabilities associated with response intervals and (2) to guide the selection of further, informative system evaluations to perform. © 2004 Elsevier Ltd. All rights reserved.
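The general idea of a surrogate-based approach like this can be sketched as follows. This is a generic illustration, not the paper's algorithm: the test function, the uniform treatment of the intervals, and the quadratic response model are all assumptions made for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_system(x):
    """Stand-in for a costly simulation (hypothetical test function)."""
    return x[..., 0] ** 2 + np.sin(3.0 * x[..., 1])

# Inputs specified only over intervals (sampled uniformly for illustration).
intervals = np.array([[0.0, 1.0], [-1.0, 1.0]])

# Step 1: a small design of (expensive) system evaluations.
n_train = 30
design = rng.uniform(intervals[:, 0], intervals[:, 1], size=(n_train, 2))
y = expensive_system(design)

# Step 2: fit a cheap quadratic response model by least squares.
def features(x):
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2], axis=-1)

coef, *_ = np.linalg.lstsq(features(design), y, rcond=None)

# Step 3: propagate many cheap surrogate evaluations to estimate the
# probability that the response falls in a given interval.
samples = rng.uniform(intervals[:, 0], intervals[:, 1], size=(100_000, 2))
y_hat = features(samples) @ coef
prob = np.mean((y_hat > 0.0) & (y_hat < 1.0))
```

The expensive model is evaluated only 30 times; all 100,000 propagation samples run against the fitted surrogate, which is the "economical" aspect the abstract emphasizes.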

Simulations of the pipe overpack to compute constitutive model parameters for use in WIPP room closure calculations

Park, Byoung; Hansen, Francis D.

The regulatory compliance determination for the Waste Isolation Pilot Plant includes the consideration of room closure. Elements of the geomechanical processes include salt creep, gas generation and mechanical deformation of the waste residing in the rooms. The WIPP was certified as complying with regulatory requirements based in part on the implementation of room closure and material models for the waste. Since the WIPP began receiving waste in 1999, waste packages have been identified that are appreciably more robust than the 55-gallon drums characterized for the initial calculations. The pipe overpack comprises one such waste package. This report develops material model parameters for the pipe overpack containers by using axisymmetrical finite element models. Known material properties and structural dimensions allow well constrained models to be completed for uniaxial, triaxial, and hydrostatic compression of the pipe overpack waste package. These analyses show that the pipe overpack waste package is far more rigid than the originally certified drum. The model parameters developed in this report are used subsequently to evaluate the implications to performance assessment calculations.

Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool

Hart, Darren

MatSeis's infrasound analysis tool, Infra Tool, uses frequency-slowness processing to deconstruct array data into three outputs per processing step: correlation, azimuth, and slowness. Until now, infrasound signal detection was accomplished manually by an experienced analyst trained to recognize patterns in these signal-processing outputs. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value. Together, these two criteria describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file lists the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency-slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg), and trace velocity (km/s). As an example, a ground-truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array locations, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and the detection window corresponded closely to the theoretical stratospheric arrival time. Further testing will be required to tune detection threshold parameters for different types of infrasound events.
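The flat-azimuth/high-correlation criterion can be sketched with a least-squares slope fit over a sliding window, in the spirit of the Inverse Slope method (the Hough Transform variant is not sketched). The window length and thresholds below are illustrative assumptions, not the tuned values:

```python
import numpy as np

def detect_infrasound(azimuths, correlations, window=5,
                      slope_tol=0.5, corr_threshold=0.5):
    """Flag processing steps where the azimuth is nearly constant
    (small fitted slope over a sliding window) while the windowed
    correlation stays above a background threshold."""
    azimuths = np.asarray(azimuths, dtype=float)
    correlations = np.asarray(correlations, dtype=float)
    steps = np.arange(window, dtype=float)
    detections = []
    for i in range(len(azimuths) - window + 1):
        # Least-squares slope of azimuth vs. processing-step index.
        slope = np.polyfit(steps, azimuths[i:i + window], 1)[0]
        if (abs(slope) < slope_tol
                and correlations[i:i + window].min() > corr_threshold):
            detections.append(i)
    return detections

# Synthetic example: noise for 10 steps, then a flat-azimuth,
# high-correlation arrival for 10 steps.
az = np.concatenate([np.random.default_rng(1).uniform(0, 360, 10),
                     np.full(10, 282.0)])
corr = np.concatenate([np.full(10, 0.1), np.full(10, 0.9)])
hits = detect_infrasound(az, corr)
```

Only windows lying entirely within the flat-azimuth, high-correlation segment are flagged; any window that overlaps the noise segment fails the correlation test.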

Inspection strategy for LIGA microstructures using a programmable optical microscope

Ceremuga, Joseph T.; Aigeldinger, Georg

The LIGA process has the ability to fabricate very precise, high-aspect-ratio mesoscale structures with microscale features [1]. The process consists of multiple steps before a final part is produced. Materials native to the LIGA process include metals and photoresists. These structures are routinely measured for quality control and process improvement. However, metrology of LIGA structures is challenging because of their high aspect ratio and edge topography. For the scale of LIGA structures, a programmable optical microscope is well suited for lateral (XY) critical dimension measurements. Using grayscale gradient image processing with sub-pixel interpolation, edges are detected and measurements are performed. As with any measurement, understanding measurement uncertainty is necessary so that appropriate conclusions are drawn from the data. Therefore, the abilities of the inspection tool and the obstacles presented by the structures under inspection should be well understood so that precision may be quantified. This report presents an inspection method for LIGA microstructures, including a comprehensive assessment of the uncertainty for each inspection scenario.
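Gradient-based edge detection with sub-pixel interpolation can be sketched in one dimension: find the gradient-magnitude peak, then refine its location with a three-point parabolic fit. This is a generic illustration of the technique, not the microscope software's implementation:

```python
import numpy as np

def subpixel_edge(profile):
    """Locate an edge in a 1-D grayscale intensity profile to
    sub-pixel precision: find the gradient-magnitude peak, then
    refine it by parabolic interpolation of the three gradient
    samples around the peak."""
    grad = np.abs(np.gradient(np.asarray(profile, dtype=float)))
    i = int(np.argmax(grad))
    if 0 < i < len(grad) - 1:
        denom = grad[i - 1] - 2.0 * grad[i] + grad[i + 1]
        if denom != 0.0:
            # Vertex of the parabola through the three gradient samples.
            return i + 0.5 * (grad[i - 1] - grad[i + 1]) / denom
    return float(i)

# Synthetic smooth edge centered near pixel 20.3.
x = np.arange(40)
profile = 1.0 / (1.0 + np.exp(-(x - 20.3)))
edge = subpixel_edge(profile)
```

Without the parabolic refinement the answer would be quantized to whole pixels; the refinement recovers the fractional offset from the asymmetry of the gradient around its peak.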

A user's guide to Sandia's Latin hypercube sampling software: LHS UNIX library/standalone version

Swiler, Laura P.; Wyss, Gregory D.

This document is a reference guide for the UNIX Library/Standalone version of the Latin Hypercube Sampling Software. This software has been developed to generate Latin hypercube multivariate samples. This version runs on Linux or UNIX platforms. This manual covers the use of the LHS code in a UNIX environment, run either as a standalone program or as a callable library. The underlying code in the UNIX Library/Standalone version of LHS is almost identical to the updated Windows version of LHS released in 1998 (SAND98-0210). However, some modifications were made to customize it for a UNIX environment and as a library that is called from the DAKOTA environment. This manual covers the use of the LHS code as a library and in the standalone mode under UNIX.
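The core of Latin hypercube sampling can be sketched in a few lines. This is a minimal illustration on the unit hypercube, not the LHS code itself, which additionally supports many distribution types and correlation control:

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng=None):
    """Generate a Latin hypercube sample on the unit hypercube:
    each variable's range is split into n_samples equal strata,
    exactly one point is drawn from each stratum, and the strata
    are paired across variables by independent random shuffles."""
    rng = np.random.default_rng(rng)
    # One point drawn uniformly inside each stratum, per variable.
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_vars))) / n_samples
    # Shuffle the strata independently for each variable.
    for j in range(n_vars):
        rng.shuffle(u[:, j])
    return u

sample = latin_hypercube(10, 3, rng=0)
```

The stratification guarantees that every variable's marginal range is covered evenly even with few samples, which is the property that makes LHS attractive for expensive simulations.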

Wave-form optimization for a 60 MA Z-Pinch driver

McDaniel, Dillon H.

A new Z-pinch driver is being planned by Sandia National Laboratories (SNL) that will provide up to 16 MJ of X-ray radiation. Two load designs are being considered. One is a double Z-pinch configuration, with each load providing 7 MJ radiation. The other is a single Z-pinch configuration that produces 16 MJ. Both configurations require 100 to 120 ns implosion times, and radiation pulse widths of less than 10 ns. These requirements translate into two 40 MA drivers for the double-sided load, and a 60 MA driver for the single-load configuration. The design philosophy for this machine is to work from the load out. Radiation requirements determine the current, pulsewidth, and load-inductance requirements. These parameters set the drive wave-form and insulator voltage, which in turn determine the insulator-stack design. The goal is to choose a drive wave-form that meets the load requirements while optimizing efficiency and minimizing breakdown risk.

Measures of effectiveness: an annotated bibliography

Campbell, Philip L.

The purpose of this report is to provide guidance, from the open literature, on developing a set of "measures of effectiveness" (MoEs) and using them to evaluate a system. Approximately twenty papers and books are reviewed. The papers that provide the clearest understanding of MoEs are identified (Sproles [46], [48], [50]). The seminal work on value-focused thinking (VFT), an approach that bridges the gap between MoEs and a system, is also identified (Keeney [25]). Finally, three examples of the use of VFT in evaluating a system based on MoEs are identified (Jackson et al. [21], Kerchner & Deckro [27], and Doyle et al. [14]). Notes are provided on the papers and books to pursue in order to take this study to the next level of detail.

ITS strategic test plan : revision 1.0

Lorence, Leonard; Franke, Brian C.; Kensek, Ronald P.; Laub, Thomas W.; Barteau, Lisa A.

This test plan describes the testing strategy for the ITS (Integrated-TIGER-Series) suite of codes. The processes and procedures for performing both verification and validation tests are described. ITS Version 5.0 was developed under the NNSA's ASC program and supports Sandia's stockpile stewardship mission.

Testing thermocline filler materials and molten-salt heat transfer fluids for thermal energy storage systems used in parabolic trough solar power plants

Brosseau, Douglas A.; Hlava, Paul F.; Kelly, Michael J.

Parabolic trough power systems that utilize concentrated solar energy to generate electricity are a proven technology. Industry and laboratory research efforts are now focusing on integration of thermal energy storage as a viable means to enhance dispatchability of concentrated solar energy. One option to significantly reduce costs is to use thermocline storage systems, low-cost filler materials as the primary thermal storage medium, and molten nitrate salts as the direct heat transfer fluid. Prior thermocline evaluations and thermal cycling tests at the Sandia National Laboratories' National Solar Thermal Test Facility identified quartzite rock and silica sand as potential filler materials. An expanded series of isothermal and thermal cycling experiments was planned and implemented to extend those studies and demonstrate the durability of these filler materials in molten nitrate salts over a range of operating temperatures for extended timeframes. Upon test completion, careful analyses of filler material samples, as well as the molten salt, were conducted to assess long-term durability and degradation mechanisms under these test conditions. Analysis results demonstrate that the quartzite rock and silica sand withstand the molten salt environment quite well. No significant deterioration that would impact the performance or operability of a thermocline thermal energy storage system was evident. Therefore, additional studies of the thermocline concept can continue armed with confidence that appropriate filler materials have been identified for the intended application.

Investigation of reliability method formulations in Dakota/UQ

Proposed for publication in Structure and Infrastructure Engineering: Maintenance, Management, Life-Cycle Design & Performance.

Wojtkiewicz, Steven F.

Reliability methods are probabilistic algorithms for quantifying the effect of simulation input uncertainties on response metrics of interest. In particular, they compute approximate response function distribution statistics (probability, reliability and response levels) based on specified input random variable probability distributions. In this paper, a number of algorithmic variations are explored for both the forward reliability analysis of computing probabilities for specified response levels (the reliability index approach (RIA)) and the inverse reliability analysis of computing response levels for specified probabilities (the performance measure approach (PMA)). These variations include limit state linearizations, probability integrations, warm starting and optimization algorithm selections. The resulting RIA/PMA reliability algorithms for uncertainty quantification are then employed within bi-level and sequential reliability-based design optimization approaches. Relative performance of these uncertainty quantification and reliability-based design optimization algorithms are presented for a number of computational experiments performed using the DAKOTA/UQ software.
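The forward (RIA) calculation can be illustrated with the classic Hasofer-Lind/Rackwitz-Fiessler iteration, one of the limit-state linearization variants in this family of methods. This is a generic sketch in standard normal space, not the DAKOTA/UQ implementation:

```python
import numpy as np

def hlrf_reliability_index(g, grad_g, u0, tol=1e-8, max_iter=100):
    """Compute the reliability index beta = ||u*||, where u* is the
    most probable point on the limit state g(u) = 0 in standard
    normal space, via the Hasofer-Lind / Rackwitz-Fiessler iteration."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        grad = grad_g(u)
        # Linearize g about u and step to the closest point (to the
        # origin) on the linearized limit state.
        u_new = (grad @ u - g(u)) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return np.linalg.norm(u)

# Linear limit state g(u) = a.u + b: the exact answer is |b| / ||a||.
a = np.array([1.0, 2.0])
b = -3.0
beta = hlrf_reliability_index(lambda u: a @ u + b, lambda u: a, [0.0, 0.0])
```

The failure probability then follows from the first-order approximation p_f ≈ Φ(−β); inverse (PMA) analysis runs the same machinery in reverse, searching for the response level at a fixed β.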

Validating DOE's Office of Science "capability" computing needs

Leland, Robert W.; Camp, William J.

A study was undertaken to validate the 'capability' computing needs of DOE's Office of Science. More than seventy members of the community provided information about algorithmic scaling laws, so that the impact of having access to Petascale capability computers could be assessed. We have concluded that the Office of Science community has described credible needs for Petascale capability computing.
