Publications

Results 101–125 of 144

Comparison of several model validation conceptions against a "real space" end-to-end approach

SAE Technical Papers

Romero, Vicente J.

This paper explores some of the important considerations in devising a practical and consistent framework and methodology for working with experiments and experimental data in connection with modeling and prediction. The paper outlines a pragmatic and versatile "real-space" approach within which experimental and modeling uncertainties (correlated and uncorrelated, systematic and random, aleatory and epistemic) are treated to mitigate risk in modeling and prediction. The elements of data conditioning, model conditioning, model validation, hierarchical modeling, and extrapolative prediction under uncertainty are examined. An appreciation can be gained for the constraints and difficulties at play in devising a viable end-to-end methodology. The considerations and options are many, and a large variety of viewpoints and precedents exist in the literature, as surveyed here. Rationale is given for the various choices taken in assembling the novel real-space end-to-end framework. The framework adopts some elements and constructs from the literature (sometimes adding needed refinement), rejects others (even some currently popular ones), and adds pivotal new elements and constructs. Crucially, the approach reflects a pragmatism and versatility derived from working many industrial-scale problems involving complex physics and constitutive models, steady-state and time-varying nonlinear behavior and boundary conditions, and various categories of uncertainty in experiments and models. The framework benefits from a broad exposure to integrated experimental and modeling activities in the areas of heat transfer, structural mechanics, irradiated electronics, and combustion in fluids and solids. © 2011 SAE International.

More Details

Data & model conditioning for multivariate systematic uncertainty in model calibration, validation, and extrapolation

Romero, Vicente J.

This paper discusses implications and appropriate treatment of systematic uncertainty in experiments and modeling. Systematic uncertainty exists when experimental conditions, and/or measurement bias errors, and/or bias contributed by post-processing the data, are constant over the set of experiments but the particular values of the conditions and/or biases are unknown to within some specified uncertainty. Systematic uncertainties in experiments do not automatically show up in the output data, unlike random uncertainty which is revealed when multiple experiments are performed. Therefore, the output data must be properly 'conditioned' to reflect important sources of systematic uncertainty in the experiments. In industrial-scale experiments the systematic uncertainty in experimental conditions (especially boundary conditions) is often large enough that the inference error on how the experimental system maps inputs to outputs is quite substantial. Any such inference error and uncertainty thereof also has implications in model validation and calibration/conditioning; ignoring systematic uncertainty in experiments can lead to 'Type X' error in these procedures. Apart from any considerations of modeling and simulation, reporting of uncertainty associated with experimental results should include the effects of any significant systematic uncertainties in the experiments. This paper describes and illustrates the treatment of multivariate systematic uncertainties of interval and/or probabilistic natures, and combined cases. The paper also outlines a practical and versatile 'real-space' framework and methodology within which experimental and modeling uncertainties (correlated and uncorrelated, systematic and random, aleatory and epistemic) are treated to mitigate risk in model validation, calibration/conditioning, hierarchical modeling, and extrapolative prediction.
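
As a rough illustration of the data-conditioning idea discussed above (a minimal sketch, not the paper's procedure), the fragment below attaches a systematic-uncertainty interval to repeat-test outputs by propagating an assumed interval on a constant experimental input through an assumed output sensitivity; all quantities and values are hypothetical.

```python
import numpy as np

# Hypothetical repeat-test outputs: random uncertainty shows up as scatter.
measured = np.array([489.0, 495.0, 492.0, 498.0])            # peak temperatures, K
random_mean = measured.mean()
random_halfwidth = 2.0 * measured.std(ddof=1)                 # ~2-sigma scatter band

# Systematic (interval) uncertainty in an input held constant over all repeats,
# e.g., an imposed heat flux known only to +/- 5% (illustrative numbers).
input_nominal = 100.0e3                                       # W/m^2
input_halfwidth = 0.05 * input_nominal

# Assumed local sensitivity of output to that input (e.g., from the model).
sensitivity = 1.2e-3                                          # K per (W/m^2)

# Condition the data: attach an interval reflecting the hidden systematic source,
# which never appears in the scatter of the repeats.
systematic_halfwidth = abs(sensitivity) * input_halfwidth
total_halfwidth = random_halfwidth + systematic_halfwidth     # conservative sum

print(f"conditioned output: {random_mean:.1f} K +/- {total_halfwidth:.1f} K "
      f"(random {random_halfwidth:.1f}, systematic {systematic_halfwidth:.1f})")
```

The point of the sketch is only that the systematic contribution must be added deliberately; it cannot be recovered from the repeat-test scatter alone.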

More Details

Coupled thermal-mechanical experiments for validation of pressurized, high temperature systems

Dempsey, James F.; Wellman, Gerald W.; Scherzinger, William M.; Connelly, Kevin C.; Romero, Vicente J.

Instrumented, fully coupled thermal-mechanical experiments were conducted to provide validation data for finite element simulations of failure in pressurized, high temperature systems. The design and implementation of the experimental methodology is described in another paper of this conference. Experimental coupling was accomplished on tubular 304L stainless steel specimens by mechanical loading imparted by internal pressurization and thermal loading by side radiant heating. Experimental parameters, including temperature and pressurization ramp rates, maximum temperature and pressure, phasing of the thermal and mechanical loading, and specimen geometry details were studied. Experiments were conducted to increasing degrees of deformation, up to and including failure. Mechanical characterization experiments of the 304L stainless steel tube material were also completed for development of a thermal elastic-plastic material constitutive model used in the finite element simulations of the validation experiments. The material was characterized in tension at a strain rate of 0.001/s from room temperature to 800 °C. The tensile behavior of the tube material was found to differ substantially from 304L bar stock material, with the plasticity characteristics and strain to failure differing at every test temperature.

More Details

Application of a pragmatic interval-based "real space" approach to fire-model validation involving aleatory and epistemic uncertainty

Collection of Technical Papers - AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference

Romero, Vicente J.; Luketa, Anay L.; Sherman, Martin

This paper applies a pragmatic interval-based approach to validation of a fire dynamics model involving computational fluid dynamics, combustion, participating-media radiation, and heat transfer. Significant aleatory and epistemic sources of uncertainty exist in the experiments and simulations. The validation comparison of experimental and simulation results, and corresponding criteria and procedures for model affirmation or refutation, take place in "real space" as opposed to "difference space" where subtractive differences between experiments and simulations are assessed. The versatile model validation framework handles difficulties associated with representing and aggregating aleatory and epistemic uncertainties from multiple correlated and uncorrelated source types, including:

• experimental variability from multiple repeat experiments
• uncertainty of experimental inputs
• experimental output measurement uncertainties
• uncertainties that arise in data processing and inference from raw simulation and experiment outputs
• parameter and model-form uncertainties intrinsic to the model
• numerical solution uncertainty from model discretization effects

The framework and procedures of the model validation methodology are here applied to a difficult validation problem involving experimental and predicted calorimeter temperatures in a wind-driven hydrocarbon pool fire.
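
A minimal sketch of the real-space comparison, with purely illustrative numbers rather than data from this study, is the following: experimental and simulation uncertainty bands are compared directly on the physical quantity, and the check concerns overlap and envelopment rather than a subtractive difference metric.

```python
# Hypothetical real-space check at one calorimeter location (illustrative values).
# Experimental band: aggregated repeat-test variability, input uncertainty,
# and measurement uncertainty.
exp_lo, exp_hi = 830.0, 910.0      # K

# Simulation band: parameter and model-form uncertainty plus an allowance
# for numerical (discretization) uncertainty.
sim_lo, sim_hi = 815.0, 935.0      # K

overlap = max(0.0, min(exp_hi, sim_hi) - max(exp_lo, sim_lo))
envelopes = sim_lo <= exp_lo and sim_hi >= exp_hi

print(f"interval overlap: {overlap:.1f} K")
print("simulation band envelopes experimental band" if envelopes
      else "simulation band does not envelope experimental band")
```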

More Details

Validation and uncertainty quantification of Fuego simulations of calorimeter heating in a wind-driven hydrocarbon pool fire

Luketa, Anay L.; Romero, Vicente J.; Domino, Stefan P.; Glaze, D.J.; Figueroa Faria, Victor G.

The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diversely representative thermocouple locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.

More Details

Efficiencies from spatially-correlated uncertainty and sampling in continuous-variable ordinal optimization

SAE International Journal of Materials and Manufacturing

Romero, Vicente J.

A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effect. One simply asks "Is that alternative better or worse than this one?", not "HOW MUCH better or worse is that alternative than this one?" The answer to the latter question requires precise characterization of the uncertainty, with the corresponding sampling/integration expense for precise resolution. By looking at things from an ordinal ranking perspective instead, the trade-off between computational expense and vagueness in the uncertainty characterization can be managed to make cost-effective stepping decisions in the design space. This paper demonstrates correct advancement in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. It is explained and shown how spatial correlation of uncertainty in such design problems can be exploited to dramatically increase the efficiency of ordinal approaches to optimization under uncertainty.
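
A minimal sketch of the ordinal idea is given below; the objective function, noise model, and stepping logic are invented for illustration and are not the paper's algorithm. Spatially correlated uncertainty is mimicked by reusing the same random samples (common random numbers) for neighboring design alternatives, so only a crude characterization of each alternative is needed to rank them.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x, noise):
    """Illustrative noisy objective: deterministic trend plus uncertainty that
    varies smoothly (is spatially correlated) over the design space."""
    return (x - 2.0) ** 2 + (1.0 + 0.1 * x) * noise

def ordinal_better(x_current, x_candidate, n_samples=8):
    """Rank two design alternatives using common random numbers so that most
    of the sampling noise cancels in the comparison."""
    noise = rng.normal(0.0, 1.0, n_samples)          # shared samples
    return objective(x_candidate, noise).mean() < objective(x_current, noise).mean()

# Crude ordinal descent: accept a step only if the candidate ranks better.
x, step = 5.0, 0.5
for _ in range(30):
    for cand in (x - step, x + step):
        if ordinal_better(x, cand):                  # "better or worse?", not "how much?"
            x = cand
            break
print(f"design settled near x = {x:.2f} (true optimum at x = 2.0)")
```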

More Details

Type X and Y errors and data & model conditioning for systematic uncertainty in model calibration, validation, and extrapolation

SAE Technical Papers

Romero, Vicente J.

This paper introduces and develops the concept of "Type X" and "Type Y" errors in model validation and calibration, and their implications on extrapolative prediction. Type X error is non-detection of model bias because it is effectively hidden by the uncertainty in the experiments. Possible deleterious effects of Type X error can be avoided by mapping uncertainty into the model until it envelopes the potential model bias, but this likely assigns a larger uncertainty than is needed to account for the actual bias (Type Y error). A philosophy of Best Estimate + Uncertainty modeling and prediction is probably best supported by taking the conservative choice of guarding against Type X error while accepting the downside of incurring Type Y error. An associated methodology involving data- and model-conditioning is presented and tested on a simple but rich test problem. The methodology is shown to appropriately contend with model bias under conditions of systematic experimental input uncertainty in the test problem. The methodology effectively bounds the uncertain model bias and brings a correction into the model that extrapolates very well under a large variety of extrapolation conditions. The methodology has been straightforwardly applied to considerably more complex real problems where system response is likewise jointly monotonic in the input uncertainties. The methodology also allows for other types of systematic and random uncertainty in the experiments and model as discussed herein. Copyright © 2008 SAE International.
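
The enveloping idea can be sketched in a few lines. The numbers and the simple multiplicative inflation rule below are hypothetical stand-ins, not the paper's conditioning procedure.

```python
# Hypothetical experimental result with its uncertainty interval (arbitrary units).
exp_lo, exp_hi = 96.0, 104.0

# Nominal model prediction and its initial uncertainty half-width.
model_nominal, model_halfwidth = 100.5, 1.0

# Guard against Type X error (model bias hidden by experimental uncertainty):
# grow the model's uncertainty until its band envelopes the experimental interval.
while not (model_nominal - model_halfwidth <= exp_lo
           and model_nominal + model_halfwidth >= exp_hi):
    model_halfwidth *= 1.1

# The enveloping band is conservative: it may overstate the true bias
# (Type Y error), but no potential bias is left hidden.
print(f"conditioned model band: {model_nominal:.1f} +/- {model_halfwidth:.2f}")
```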

More Details

A paradigm of model validation and validated models for best-estimate-plus-uncertainty predictions in systems engineering

SAE Technical Papers

Romero, Vicente J.

What constitutes a validated model? What are the criteria that allow one to defensibly make the claim that one is using a validated model in an analysis? These questions get to the heart of what model validation really implies (conceptually, operationally, interpretationally, etc.), and these details are currently the subject of substantial debate in the V&V community. This is perhaps because many contemporary paradigms of model validation have a limited modeling scope in mind, so the validation paradigms do not span different modeling regimes and purposes that are important in engineering. This paper discusses the different modeling regimes and purposes that it is important for a validation theory to span, and then proposes a validation paradigm that appears to span them. The author's criterion for validated models proceeds from a desire to meet an end objective of "best estimate plus uncertainty" (BEPU) in model predictions. Starting from this end, the author works back to the implications on the model validation process (conceptually, operationally, interpretationally, etc.). Ultimately a shift is required in the conceptualization and articulation of model validation, away from contemporary paradigms. Thus, this paper points out weaknesses in contemporary model validation perspectives and proposes a conception of model validation and validated models that seems to reconcile many of the issues. Copyright © 2007 SAE International.

More Details

Development of a new adaptive ordinal approach to continuous-variable probabilistic optimization

Romero, Vicente J.

A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effects. One simply asks "Is that alternative better or worse than this one?", not "HOW MUCH better or worse is that alternative than this one?" The answer to the latter question requires precise characterization of the uncertainty, with the corresponding sampling/integration expense for precise resolution. However, in this report we demonstrate correct decision-making in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. We present a new adaptive ordinal method for probabilistic optimization in which the trade-off between computational expense and vagueness in the uncertainty characterization can be conveniently managed in various phases of the optimization problem to make cost-effective stepping decisions in the design space. Spatial correlation of uncertainty in the continuous-variable design space is exploited to dramatically increase method efficiency. Under many circumstances the method appears to have favorable robustness and cost-scaling properties relative to other probabilistic optimization methods, and uniquely has mechanisms for quantifying and controlling error likelihood in design-space stepping decisions. The method is asymptotically convergent to the true probabilistic optimum, so could be useful as a reference standard against which the efficiency and robustness of other methods can be compared, analogous to the role that Monte Carlo simulation plays in uncertainty propagation.
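
The adaptive aspect can be illustrated roughly as follows: paired samples are added until the estimated chance of mis-ranking two design alternatives falls below a tolerance. The objective, the noise model, and the use of a paired t-test as the stopping rule are assumptions made for illustration, not the report's method.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def sample_pair(x_a, x_b, n):
    """Paired (common-random-number) samples of a hypothetical noisy objective."""
    noise = rng.normal(0.0, 1.0, n)
    f_a = (x_a - 2.0) ** 2 + (0.5 + 0.5 * x_a) * noise
    f_b = (x_b - 2.0) ** 2 + (0.5 + 0.5 * x_b) * noise
    return f_a, f_b

def adaptive_ordinal_compare(x_a, x_b, alpha=0.05, n0=4, n_max=512):
    """Double the paired sample size until the chance of mis-ranking the two
    designs (paired t-test p-value) drops below alpha, or the budget runs out."""
    f_a, f_b = sample_pair(x_a, x_b, n0)
    while len(f_a) < n_max:
        _, p_value = stats.ttest_rel(f_a, f_b)
        if p_value < alpha:
            winner = "b" if f_b.mean() < f_a.mean() else "a"
            return winner, len(f_a)
        extra_a, extra_b = sample_pair(x_a, x_b, len(f_a))
        f_a = np.concatenate([f_a, extra_a])
        f_b = np.concatenate([f_b, extra_b])
    return "indistinguishable", len(f_a)

winner, n_used = adaptive_ordinal_compare(2.2, 2.1)
print(f"preferred design: {winner}, after {n_used} paired samples")
```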

More Details