Publications

Gemma V&V/UQ/Credibility Activities: FY2020 Progress

Jelsema, Casey M.; Red-Horse, John; Rutherford, Brian; Huerta, Jose G.; Eckert, Aubrey

This report describes the credibility activities undertaken in support of Gemma code development in FY20, which include Verification & Validation (V&V), Uncertainty Quantification (UQ), and Credibility process application. The main goal of these activities is to establish capabilities and process frameworks that can be more broadly applied to new and more advanced problems as the Gemma code development effort matures. This will provide Gemma developers and analysts with the tools needed to generate credibility evidence in support of Gemma predictions for future use cases. The FY20 Gemma V&V/UQ/Credibility activities described in this report include experimental uncertainty analysis, the development and use of methods for optimal design of computer experiments, and the development of a framework for validation. These initial activities supported the development of broader credibility planning for Gemma that continued into FY21.

1 Mil Gold Bond Wire Study

Huff, Johnathon; Mclean, Michael B.; Jenkins, Mark W.; Rutherford, Brian

In microcircuit fabrication, the diameter and length of a bond wire have been shown to affect both the current-versus-fusing-time ratio of a bond wire and the gap length of the fused wire. This study investigated the impact of current level on the time-to-open and gap length of 1 mil by 60 mil gold bond wires. During the experiments, constant current was applied to a control set of bond wires for 250 ms, for 410 ms, and until the wire fused; to non-destructively pull-tested wires for 250 ms; and to notched wires. The key findings were that gap length increases with current, that 73% of the bond wires fuse at 1.8 A, and that 100% of the wires fuse within 60 ms at 1.9 A. Due to the limited scope of the experiments and the limited data analyzed, further investigation is encouraged to confirm these observations.

Conductor fusing and gapping for bond wires

Progress in Electromagnetics Research M

Chen, Kenneth C.; Warne, Larry K.; Kinzel, Robert L.; Huff, Johnathon; Mclean, Michael B.; Jenkins, Mark W.; Rutherford, Brian

In this paper, fusing of a metallic conductor is studied by judiciously using the solution of the one-dimensional heat equation, resulting in an approximate method for determining the threshold fusing current. The action is defined as the integral of the square of the wire current over time. The burst action (the action required to completely vaporize the material) for an exploding wire is then used to estimate the typical wire gapping action (involving wire fusing), from which the gapping time can be estimated for gapping currents more than twice the fusing current. The test data are used to determine the gapped length as a function of gapping current and to show, over a limited range, that the gapped length is inversely proportional to gapping time. The gapped length can be used as a signature of the fault current level in microelectronic circuits.
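
The key quantity here is the action, the integral of i²(t) over time. As a minimal numerical sketch of how such a quantity is used, the snippet below checks whether a constant-current pulse deposits an assumed fusing action and estimates the time-to-fuse; the action constant is hypothetical, not a value from the paper.

```python
import numpy as np

# Hypothetical fusing action for a small-gauge bond wire, in A^2*s;
# the paper derives such thresholds from the 1-D heat equation.
FUSING_ACTION = 0.09  # illustrative value only

def action(current, t):
    """Trapezoidal-rule integral of i(t)^2 dt for a sampled waveform."""
    i2 = current**2
    return float(np.sum(0.5 * (i2[:-1] + i2[1:]) * np.diff(t)))

def time_to_fuse(i_const):
    """Time for a constant current to deposit the fusing action."""
    return FUSING_ACTION / i_const**2

t = np.linspace(0.0, 0.060, 601)      # 60 ms window
i = np.full_like(t, 1.9)              # constant 1.9 A drive
print(action(i, t) >= FUSING_ACTION)  # True: this pulse would fuse the wire
print(time_to_fuse(1.9))              # ~25 ms under the assumed action
```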

Some statistical procedures to refine estimates of uncertainty when sparse data are available for model validation and calibration

Collection of Technical Papers - AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference

Romero, Vicente J.; Rutherford, Brian; Newcomer, Justin T.

This paper presents some statistical concepts and techniques for refining the expression of uncertainty arising from: (a) random variability (aleatory uncertainty) of a random quantity; and (b) contributed epistemic uncertainty due to limited sampling of the random quantity. The treatment is tailored to handling experimental uncertainty in the context of model validation and calibration. Two particular problems are considered. One involves deconvolving random measurement error from the measured random response. The other involves exploiting a relationship between two random variates of a system and an independently characterized probability density of one of the variates.
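
A minimal sketch of the first problem, under the simplifying assumption that measurement error is independent of the true response: the observed variance is then the sum of the response and error variances, so a characterized error variance can be subtracted out. This variance-subtraction illustration is not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: true aleatory response plus independent measurement error.
true_response = rng.normal(10.0, 2.0, size=30)  # aleatory spread (sd = 2.0)
meas_error = rng.normal(0.0, 0.5, size=30)      # gauge error (sd = 0.5)
observed = true_response + meas_error

# Independence implies var(observed) = var(response) + var(error).
sigma_err = 0.5                                 # known from gauge studies
var_resp = max(observed.var(ddof=1) - sigma_err**2, 0.0)
print(np.sqrt(var_resp))                        # estimated aleatory sd, near 2.0
```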

Case study for model validation: assessing a model for thermal decomposition of polyurethane foam

Dowding, Kevin J.; Pilch, Martin; Rutherford, Brian; Hobbs, Michael L.

A case study is reported to document the details of a validation process used to assess the accuracy of a mathematical model representing experiments involving thermal decomposition of polyurethane foam. The focus of the report is to work through a validation process, which addresses the following activities. The intended application of the mathematical model is discussed to better understand the pertinent parameter space. The parameter space of the validation experiments is mapped to the application parameter space. The mathematical models, the computer code used to solve them, and the code's verification are presented. Experimental data from two activities are used to validate the mathematical models: the first experiment assesses the chemistry model alone, and the second assesses the model of coupled chemistry, conduction, and enclosure radiation. The model results of both experimental activities are summarized, and the uncertainty of the model in representing each experimental activity is estimated. The comparison between the experimental data and model results is quantified with various metrics. After addressing these activities, an assessment of the process for the case study is given. Weaknesses in the process are discussed and lessons learned are summarized.
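
The report leaves the comparison metrics unnamed here. As a generic stand-in (not necessarily a metric used in the report), the sketch below computes a root-mean-square model-experiment discrepancy normalized by experimental uncertainty; all numbers are illustrative.

```python
import numpy as np

def normalized_rms_discrepancy(model, experiment, exp_sigma):
    """RMS model-experiment difference in units of experimental sigma."""
    z = (np.asarray(model) - np.asarray(experiment)) / np.asarray(exp_sigma)
    return float(np.sqrt(np.mean(z**2)))

# Hypothetical comparison points for a decomposition response.
model = [0.42, 0.55, 0.71]        # predicted values
experiment = [0.40, 0.59, 0.68]   # measured values
exp_sigma = [0.03, 0.03, 0.04]    # measurement standard deviations
print(normalized_rms_discrepancy(model, experiment, exp_sigma))
```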

A response-modeling approach to characterization and propagation of uncertainty specified over intervals

Reliability Engineering and System Safety

Rutherford, Brian

Computational simulation methods have advanced to a point where simulation can contribute substantially in many areas of systems analysis. One research challenge that has accompanied this transition involves the characterization of uncertainty in both computer model inputs and the resulting system response. This article addresses a subset of the 'challenge problems' posed in [Challenge problems: uncertainty in system response given uncertain parameters, 2001], where uncertainty or information is specified over intervals of the input parameters and inferences based on the response are required. The emphasis of the article is to describe and illustrate a method for performing the tasks associated with this type of modeling 'economically', requiring relatively few evaluations of the system to obtain a precise estimate of the response. This 'response-modeling approach' is used to approximate a probability distribution for the system response. The distribution is then used (1) to make inferences concerning probabilities associated with response intervals and (2) to guide the determination of further, informative system evaluations to perform.
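
A minimal sketch of the general idea, with a stand-in simulator and a least-squares quadratic playing the role of the response model: fit the cheap model to a few true evaluations, then sample the input intervals heavily through it to approximate the response distribution. The surrogate form and threshold are assumptions, not the article's.

```python
import numpy as np

def expensive_system(x1, x2):
    """Stand-in for the real simulator (assume each call is costly)."""
    return np.sin(x1) + 0.5 * x2**2

def quad_basis(X):
    """Quadratic regression basis in two inputs."""
    return np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                            X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])

# Few true evaluations at points inside the specified input intervals.
rng = np.random.default_rng(1)
lo, hi = np.array([0.0, -1.0]), np.array([2.0, 1.0])
X = rng.uniform(lo, hi, size=(10, 2))
coef, *_ = np.linalg.lstsq(quad_basis(X), expensive_system(X[:, 0], X[:, 1]),
                           rcond=None)

# Sample the intervals through the surrogate (essentially free).
S = rng.uniform(lo, hi, size=(100_000, 2))
y_surr = quad_basis(S) @ coef
print(np.mean(y_surr > 1.0))  # approx. P(response > 1) over the intervals
```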

An approach to model validation and model-based prediction – polyurethane foam case study

Rutherford, Brian; Dowding, Kevin J.

Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas, including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics, and statistics literatures. In this report we provide a fairly detailed account of one approach to model validation and prediction, applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high-temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction development efforts on this specific thermal application. We discuss several elements of the 'philosophy' behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical analyses and hypothesis tests as part of the validation step to provide feedback to analysts and modelers. Decisions on how to proceed in making model-based predictions are made based on these analyses together with the application requirements. Updating, modifying, and understanding the boundaries associated with the model are also assisted by this feedback. (4) We include a 'model supplement term' when model problems are indicated. This term provides a (bias) correction to the model so that it will better match the experimental results and more accurately account for uncertainty. Presumably, as the models continue to develop and are used for future applications, the causes for these apparent biases will be identified and the need for this supplementary modeling will diminish. (5) We use a response-modeling approach for our predictions that allows for general types of prediction and for assessment of prediction uncertainty. This approach is demonstrated through a case study supporting the assessment of a weapon's response when it is subjected to a hydrocarbon fuel fire. The foam decomposition model provides an important element of the response of a weapon system in this abnormal thermal environment. Rigid foam is used to encapsulate critical components in the weapon system, providing the needed mechanical support as well as thermal isolation. Because the foam begins to decompose at temperatures above 250 °C, modeling the decomposition is critical to assessing the weapon's response.
The validation analysis indicates that the model tends to 'exaggerate' the effect of temperature changes when compared with the experimental results. The data, however, are too few and too restricted in terms of experimental design to support confident statements regarding modeling problems. For illustration, we assume these indications are correct and compensate for the apparent bias by constructing a model supplement term for use in the model-based predictions. Several hypothetical prediction problems are created and addressed. Hypothetical problems are used because no guidance was provided concerning what was needed for this aspect of the analysis. The resulting predictions and corresponding uncertainty assessment demonstrate the flexibility of the approach.
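
A minimal sketch of item (4), the model supplement term: a discrepancy fitted to validation residuals is added to the raw model output so predictions better track experiment. The linear form and all numbers are hypothetical.

```python
import numpy as np

# Validation conditions (temperature, C), measurements, and raw model output;
# the model "exaggerates" the temperature effect (all values illustrative).
temp = np.array([275.0, 300.0, 325.0, 350.0])
measured = np.array([0.80, 1.00, 1.15, 1.28])
model_out = np.array([0.75, 1.02, 1.25, 1.45])

# Fit a linear supplement (bias-correction) term to the residuals.
slope, intercept = np.polyfit(temp, measured - model_out, 1)

def corrected_prediction(t, raw_model_value):
    """Raw model output plus the fitted supplement term at temperature t."""
    return raw_model_value + slope * t + intercept

print(corrected_prediction(340.0, 1.38))  # bias-corrected prediction
```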

A response modeling approach to experimental design for optimal product design

4th International Symposium on Uncertainty Modeling and Analysis, ISUMA 2003

Rutherford, Brian

The general problem considered is an optimization problem involving product design where some initial data are available and computer simulation is to be used to obtain more information. Resources and system complexity together restrict the number of simulations that can be performed in search of optimal settings for the product parameters. Consequently, the levels of these parameters used in the simulations (the experimental design) must be selected in an efficient way. We describe an algorithmic 'response-modeling' approach for performing this selection. The algorithm is illustrated using a rolamite design application. We provide, as examples, optimal one-, two-, and three-point experimental designs for the rolamite computational analyses.
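
As a minimal sketch of what a one-point design selection can look like, the snippet below fits a cheap response model to the initial data and scores candidate settings by predicted performance plus a simple distance-based exploration bonus. The objective, weights, and one-dimensional setup are stand-ins, not the paper's algorithm or the rolamite model.

```python
import numpy as np

def simulate(x):
    """Stand-in for the costly product simulation (peak near x = 0.6)."""
    return -(x - 0.6)**2

# Initial data on a normalized design parameter.
x_obs = np.array([0.1, 0.5, 0.9])
y_obs = simulate(x_obs)

# Quadratic response model fitted to the observations.
coef = np.polyfit(x_obs, y_obs, 2)

# One-point design: favor high predicted performance, but reward
# candidates far from points already run (ad hoc exploration weight).
candidates = np.linspace(0.0, 1.0, 101)
pred = np.polyval(coef, candidates)
dist = np.min(np.abs(candidates[:, None] - x_obs[None, :]), axis=1)
x_next = candidates[np.argmax(pred + 0.1 * dist)]
print(x_next)  # the single simulation to run next
```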

Description of the Sandia Validation Metrics Project

Trucano, Timothy G.; Easterling, Robert G.; Dowding, Kevin J.; Paez, Thomas L.; Urbina, Angel U.; Romero, Vicente J.; Rutherford, Brian; Hills, Richard G.

This report describes the underlying principles and goals of the Sandia ASCI Verification and Validation Program Validation Metrics Project. It also gives a technical description of two case studies, one in structural dynamics and the other in thermomechanics, that serve to focus the technical work of the project in Fiscal Year 2001.

Methodology for characterizing modeling and discretization uncertainties in computational simulation

Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian; Diegert, Kathleen V.

This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed toward developing methodologies that treat model-form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for the sources of uncertainty and error in the modeling and simulation process that impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
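
For the discretization-error component, a standard textbook estimate is Richardson extrapolation from two grid resolutions; the sketch below is offered as a generic illustration (the report's estimator may differ), with hypothetical solution values.

```python
# Richardson extrapolation: if the discretization error scales as C * h**p,
# two solutions at spacings h and h/r bound the remaining fine-grid error.
def fine_grid_error(f_coarse, f_fine, refinement_ratio, order):
    """Estimated error remaining in the fine-grid solution."""
    return (f_fine - f_coarse) / (refinement_ratio**order - 1.0)

# Hypothetical solutions on grids h and h/2 from a 2nd-order scheme.
f_h, f_h2 = 1.250, 1.214
err = fine_grid_error(f_h, f_h2, refinement_ratio=2.0, order=2)
print(f_h2 + err)  # extrapolated, grid-converged estimate (1.202)
print(err)         # estimated fine-grid discretization error (-0.012)
```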

A methodology for selecting an optimal experimental design for the computer analysis of a complex system

Technometrics

Rutherford, Brian

Investigation and evaluation of a complex system is often accomplished through the use of performance measures based on system response models. The response models are constructed using computer-generated responses, supported where possible by physical test results. The general problem considered is one where resources and system complexity together restrict the number of simulations that can be performed. The levels of input variables used to define environmental scenarios, initial and boundary conditions, and system parameter settings must be selected in an efficient way. This paper describes an algorithmic approach for performing this selection.
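
As a generic illustration of the selection problem (the paper's own algorithm differs), the sketch below builds a maximin Latin hypercube: candidate designs stratify each input's range, and the design whose closest pair of points is farthest apart is kept.

```python
import numpy as np
from itertools import combinations

def latin_hypercube(n, d, rng):
    """n points in [0, 1]^d with stratified (Latin) margins."""
    ranks = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (ranks + rng.uniform(size=(n, d))) / n

def maximin_lhs(n, d, tries=200, seed=0):
    """Keep the candidate design whose closest point pair is farthest apart."""
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(tries):
        X = latin_hypercube(n, d, rng)
        score = min(np.linalg.norm(a - b) for a, b in combinations(X, 2))
        if score > best_score:
            best, best_score = X, score
    return best

print(maximin_lhs(8, 3))  # 8 runs over 3 normalized input variables
```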

A Resampling Based Approach to Optimal Experimental Design for Computer Analysis of a Complex System

Rutherford, Brian

The investigation of a complex system is often performed using computer-generated response data supplemented by system and component test results where possible. Analysts rely on an efficient use of limited experimental resources to test the physical system, evaluate the models, and assure (to the extent possible) that the models accurately simulate the system under investigation. The general problem considered here is one where only a restricted number of system simulations (or physical tests) can be performed to provide the additional data necessary to accomplish the project objectives. The levels of variables used for defining input scenarios, for setting system parameters, and for initializing other experimental options must be selected in an efficient way. The use of computer algorithms to support experimental design in complex problems has been a topic of recent research in the areas of statistics and engineering. This paper describes a resampling-based approach to formulating this design. An example is provided that illustrates in two dimensions how the algorithm works and indicates its potential on larger problems. The results show that the proposed approach has the characteristics desirable of an algorithmic approach on these simple examples. Further experimentation is needed to evaluate its performance on larger problems.
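
A minimal sketch of the resampling idea, under the assumption (mine, for illustration) that a simple quadratic serves as the response model: refit it on bootstrap resamples of the existing runs and place the next run where the refits disagree most.

```python
import numpy as np

rng = np.random.default_rng(2)

# Existing computer-experiment data on one normalized input (illustrative).
x = np.array([0.05, 0.2, 0.45, 0.7, 0.95])
y = np.sin(3.0 * x) + rng.normal(0.0, 0.02, size=x.size)

# Bootstrap: refit a quadratic response model on resampled data, then
# score candidate inputs by the spread of the refitted predictions.
candidates = np.linspace(0.0, 1.0, 201)
preds = []
for _ in range(500):
    idx = rng.integers(0, x.size, size=x.size)
    if np.unique(x[idx]).size < 3:   # skip degenerate resamples
        continue
    coef = np.polyfit(x[idx], y[idx], 2)
    preds.append(np.polyval(coef, candidates))
spread = np.std(preds, axis=0)
print(candidates[np.argmax(spread)])  # next design point: max disagreement
```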

Important considerations in experimental design for large scale simulation analyses

Rutherford, Brian

Economic and other factors, accompanying developments in physics, mathematics, and particularly computer technology, are shifting a substantial portion of the experimental resources associated with large-scale engineering projects from physical testing to modeling and simulation. In the process, the priorities for selecting meaningful and informative tests and simulations are also changing. This paper describes issues related to experimental design for these problems and how the goals and priorities of experimental design are changing to accommodate this shift in experimentation. Issues, priorities, and new methods of approach are discussed.

A general method for the efficient selection of sampling locations for problems in environmental restoration

Rutherford, Brian

Problems in environmental restoration that involve detecting or monitoring contamination or site characterization often benefit from procedures that help select sampling or drilling locations for obtaining meaningful data that support the analysis. One example of this type of procedure is a spatial sampling program that will 'automatically' (based on the implementation of a computer algorithm) guide an iterative investigation through the process of site characterization at a minimal cost to determine appropriate remediation activities. In order to be effective, such a procedure should translate site and modeling uncertainties into terms that facilitate comparison with regulations and should also provide a methodology that will lead to an efficient sampling plan over the course of the analysis. In this paper, a general framework is given that can accomplish these objectives and can be applied to a wide range of environmental restoration applications. The methodology is illustrated using an example where soil samples support the characterization of a chemical waste landfill area.
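
A minimal sketch of the iterative flavor of such a program: estimate concentrations over the site from the samples taken so far and drill next where the estimate is both poorly supported by nearby data and ambiguous relative to the regulatory limit. The inverse-distance estimator, scoring rule, and numbers are stand-ins for the paper's framework.

```python
import numpy as np

THRESHOLD = 5.0  # hypothetical regulatory limit (e.g., mg/kg)

# Soil samples taken so far: (x, y) locations and measured concentrations.
locs = np.array([[0.2, 0.3], [0.8, 0.2], [0.5, 0.9]])
conc = np.array([3.1, 6.4, 4.8])

# Candidate drilling locations on a grid over the normalized site.
gx, gy = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21))
cand = np.column_stack([gx.ravel(), gy.ravel()])

# Inverse-distance-weighted concentration estimate at each candidate.
d = np.linalg.norm(cand[:, None, :] - locs[None, :, :], axis=2) + 1e-6
w = 1.0 / d**2
est = (w * conc).sum(axis=1) / w.sum(axis=1)

# Drill next where local data are sparse (large nearest-sample distance)
# and the estimate sits near the threshold (exceedance is ambiguous).
score = d.min(axis=1) / (np.abs(est - THRESHOLD) + 0.1)
print(cand[np.argmax(score)])  # next sampling location
```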
