Model Calibration in Latent Response Space Using Principal Component Analysis
Abstract not provided.
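Since no abstract is available for this entry, the sketch below is a generic, minimal illustration of the technique the title names: project multivariate simulation and experimental responses onto a few principal components and calibrate model parameters by matching the latent (PC-score) coordinates. The response model, parameter names, and data are hypothetical stand-ins, not the paper's actual formulation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical "simulator": returns a 50-point response curve for a parameter vector.
def simulate(theta, t=np.linspace(0.0, 1.0, 50)):
    return theta[0] * np.exp(-theta[1] * t) + 0.05 * np.sin(8 * t)

# Ensemble of simulations used to define the latent response space.
thetas = rng.uniform([0.5, 0.5], [2.0, 3.0], size=(200, 2))
Y = np.array([simulate(th) for th in thetas])          # (200, 50) response matrix

# PCA of the centered response ensemble via SVD.
y_mean = Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Y - y_mean, full_matrices=False)
components = Vt[:3]                                    # keep a few dominant components

def to_latent(y):
    """Project a response curve onto the retained principal components."""
    return components @ (y - y_mean)

# Synthetic "experimental" observation from a hidden truth, plus noise.
theta_true = np.array([1.3, 1.7])
y_obs = simulate(theta_true) + rng.normal(0.0, 0.02, size=50)
z_obs = to_latent(y_obs)

# Calibrate by least squares in the latent (PC-score) space rather than the full response space.
def misfit(theta):
    return np.sum((to_latent(simulate(theta)) - z_obs) ** 2)

result = minimize(misfit, x0=[1.0, 1.0], method="Nelder-Mead")
print("true parameters:      ", theta_true)
print("calibrated parameters:", result.x)
```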
Reliability Engineering and System Safety
Causality in an engineered system pertains to how a system output changes due to a controlled change or intervention on the system or system environment. Engineered system designs reflect a causal theory regarding how a system will work, and predicting the reliability of such systems typically requires knowledge of this underlying causal structure. The aim of this work is to introduce causal modeling tools that inform reliability predictions based on biased data sources. We present a novel application of the popular structural causal modeling (SCM) framework to reliability estimation in an engineering application, illustrating how this framework can inform whether reliability is estimable and how to estimate reliability given a set of data and assumptions about the subject matter and data-generating mechanism. When data are insufficient for estimation, sensitivity studies based on problem-specific knowledge can inform how much reliability estimates can change due to biases in the data and what information should be collected next to provide the most additional information. We apply the approach to a pedagogical example related to a real, but proprietary, engineering application, considering how two types of biases in the data can influence a reliability calculation.
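As an illustration of the kind of adjustment the SCM framework enables (not the paper's proprietary application), the sketch below assumes a simple hypothetical causal structure: an environment variable drives failure probability, and the available test data over-samples the harsh environment. A naive failure rate pooled over the biased test data is misleading; re-weighting the environment-conditional failure rates by an assumed fielded environment distribution (a backdoor-style standardization) recovers the fielded reliability. All probabilities and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical causal assumptions:
#   environment E in {benign, harsh} -> failure F, with P(F=1 | E) below.
#   The test campaign samples harsh environments far more often than the field does.
p_fail_given_env = {"benign": 0.01, "harsh": 0.10}
p_env_test  = {"benign": 0.30, "harsh": 0.70}   # biased sampling in the test data
p_env_field = {"benign": 0.90, "harsh": 0.10}   # assumed fielded environment mix

# Simulate a biased test data set.
n = 5000
envs = rng.choice(["benign", "harsh"], size=n, p=[p_env_test["benign"], p_env_test["harsh"]])
fails = np.array([rng.random() < p_fail_given_env[e] for e in envs])

# Naive estimate: pooled failure rate of the biased test data.
naive_unreliability = fails.mean()

# Adjusted estimate: condition on the environment, then re-weight by the
# assumed field distribution (standardization over the back-door variable E).
adjusted_unreliability = sum(
    fails[envs == e].mean() * p_env_field[e] for e in ["benign", "harsh"]
)

true_field_unreliability = sum(p_fail_given_env[e] * p_env_field[e] for e in ["benign", "harsh"])

print(f"naive    P(failure) = {naive_unreliability:.4f}")
print(f"adjusted P(failure) = {adjusted_unreliability:.4f}")
print(f"true     P(failure) = {true_field_unreliability:.4f}")
print(f"adjusted reliability = {1 - adjusted_unreliability:.4f}")
```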
ASME 2020 Verification and Validation Symposium, VVS 2020
Empirically-based correlations are commonly used in modeling and simulation but rarely have rigorous uncertainty quantification that captures the nature of the underlying data. In many applications, a mathematical description for a parameter response to some input stimulus is often either unknown, unable to be measured, or both. Likewise, the data used to observe a parameter response is often noisy, and correlations are derived to approximate the bulk response. Practitioners frequently treat the chosen correlation, sometimes referred to as the "surrogate" or "reduced-order" model of the response, as a constant mathematical description of the relationship between input and output. This assumption, as with any model, is incorrect to some degree, and the uncertainty in the correlation can potentially have significant impacts on system responses. Thus, proper treatment of correlation uncertainty is necessary. In this paper, a method is proposed for high-level abstract sampling of uncertain data correlations. Whereas uncertainty characterization is often assigned to scalar values for direct sampling, functional uncertainty is not always straightforward. A systematic approach for sampling univariable uncertain correlations was developed to perform more rigorous uncertainty analyses and more reliably sample the correlation space. This procedure implements pseudo-random sampling of a correlation with a bounded input range to maintain the correlation form, to respect variable uncertainty across the range, and to ensure function continuity with respect to the input variable.
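The paper's specific sampling procedure is not reproduced here; the sketch below shows one way to meet the stated requirements (maintain the correlation form, respect pointwise uncertainty across a bounded input range, and preserve continuity in the input variable) by perturbing a nominal power-law correlation with a smooth, band-limited random function. The correlation form, uncertainty band, and mode construction are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Nominal correlation over a bounded input range, e.g. a Nusselt-type power law.
# The correlation form and uncertainty band below are hypothetical placeholders.
def nominal_correlation(x):
    return 0.023 * x ** 0.8

def relative_uncertainty(x):
    # Pointwise relative uncertainty that varies across the range (wider at high x).
    return 0.05 + 0.10 * (x - x.min()) / (x.max() - x.min())

def sample_correlation(x, n_samples=5, n_modes=4):
    """Draw continuous realizations of the uncertain correlation.

    Each realization multiplies the nominal curve by (1 + g(x)), where g(x) is a
    smooth random function built from a few cosine modes, rescaled so that it stays
    within the pointwise uncertainty band.  Smooth modes guarantee continuity in x;
    the pointwise rescaling respects the variable uncertainty across the range.
    """
    xs = (x - x.min()) / (x.max() - x.min())           # normalize input to [0, 1]
    samples = []
    for _ in range(n_samples):
        coeffs = rng.standard_normal(n_modes) / np.arange(1, n_modes + 1)
        g = sum(c * np.cos(np.pi * k * xs) for k, c in enumerate(coeffs))
        g = g / max(np.max(np.abs(g)), 1e-12)          # scale peak perturbation to 1
        g = g * relative_uncertainty(x)                # then to the local band width
        samples.append(nominal_correlation(x) * (1.0 + g))
    return np.array(samples)

x = np.linspace(1.0e4, 1.0e5, 200)                     # bounded input range (e.g. Reynolds number)
realizations = sample_correlation(x)
print(realizations.shape)                              # (5, 200) continuous correlation samples
```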
Journal of Verification, Validation and Uncertainty Quantification
When making computational simulation predictions of multiphysics engineering systems, sources of uncertainty in the prediction need to be acknowledged and included in the analysis within the current paradigm of striving for simulation credibility. A thermal analysis of an aerospace geometry was performed at Sandia National Laboratories. For this analysis, a verification, validation, and uncertainty quantification (VVUQ) workflow provided structure for the analysis, resulting in the quantification of significant uncertainty sources including spatial numerical error and material property parametric uncertainty. It was hypothesized that the parametric uncertainty and numerical errors were independent and separable for this application. This hypothesis was supported by performing uncertainty quantification (UQ) simulations at multiple mesh resolutions, while being limited by resources to minimize the number of medium- and high-resolution simulations. Based on this supported hypothesis, a prediction including parametric uncertainty and a systematic mesh bias is used to make a margin assessment that avoids unnecessary uncertainty obscuring the results and optimizes use of computing resources.
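A minimal sketch of the kind of separability check described above, using synthetic numbers rather than the thermal model: if the parametric spread of the response is similar at each mesh resolution while only the mean shifts, parametric uncertainty and mesh bias can reasonably be treated as separable and combined additively. The response values, sample sizes, and tolerance are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def run_uq(n_samples, mesh_bias):
    """Stand-in for UQ simulations at one mesh resolution: the 'true' parametric
    response plus a systematic, resolution-dependent offset (the mesh bias)."""
    parametric_response = rng.normal(loc=350.0, scale=8.0, size=n_samples)  # e.g. peak temperature, K
    return parametric_response + mesh_bias

# Many cheap coarse-mesh samples, progressively fewer medium/fine samples.
ensembles = {
    "coarse": run_uq(n_samples=200, mesh_bias=+6.0),
    "medium": run_uq(n_samples=40,  mesh_bias=+2.5),
    "fine":   run_uq(n_samples=10,  mesh_bias=+1.0),
}

stats = {name: (vals.mean(), vals.std(ddof=1)) for name, vals in ensembles.items()}
for name, (mean, std) in stats.items():
    print(f"{name:>6}: mean = {mean:7.2f}  std = {std:5.2f}  (n = {len(ensembles[name])})")

# Separability check: the parametric spread should be roughly resolution-independent.
stds = [std for _, std in stats.values()]
ratio = max(stds) / min(stds)
print(f"max/min parametric spread across resolutions: {ratio:.2f}")
# A ratio near 1 indicates the spread is insensitive to resolution, supporting the
# assumption that parametric uncertainty and mesh bias are separable.

# Systematic mesh bias estimated from the shift in means between coarse and fine meshes.
mesh_bias_estimate = stats["coarse"][0] - stats["fine"][0]
print(f"estimated coarse-mesh bias: {mesh_bias_estimate:+.2f}")
```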
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part B: Mechanical Engineering
This paper examines the variability of predicted responses when multiple stress-strain curves (reflecting variability from replicate material tests) are propagated through a finite element model of a ductile steel can being slowly crushed. Over 140 response quantities of interest (including displacements, stresses, strains, and calculated measures of material damage) are tracked in the simulations. Each response quantity’s behavior varies according to the particular stress-strain curves used for the materials in the model. We desire to estimate response variability when only a few stress-strain curve samples are available from material testing. Propagation of just a few samples will usually result in significantly underestimated response uncertainty relative to propagation of a much larger population that adequately samples the presiding random-function source. A simple classical statistical method, Tolerance Intervals, is tested for effectively treating sparse stress-strain curve data. The method is found to perform well on the highly nonlinear input-to-output response mappings and non-standard response distributions in the can-crush problem. The results and discussion in this paper support a proposition that the method will apply similarly well for other sparsely sampled random variable or function data, whether from experiments or models. Finally, the simple Tolerance Interval method is also demonstrated to be very economical.
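To make the Tolerance Interval idea concrete, the sketch below computes a standard two-sided normal tolerance interval from a small sample using Howe's approximate k-factor; the paper's exact variant and its coverage/confidence choices may differ, and the sample values here are synthetic stand-ins for sparse response outputs.

```python
import numpy as np
from scipy.stats import norm, chi2

def two_sided_tolerance_factor(n, coverage=0.95, confidence=0.95):
    """Approximate k such that mean +/- k*std covers at least `coverage` of a
    normal population with the stated confidence (Howe's approximation)."""
    z = norm.ppf(0.5 * (1.0 + coverage))
    df = n - 1
    return z * np.sqrt(df * (1.0 + 1.0 / n) / chi2.ppf(1.0 - confidence, df))

# A sparse set of propagated response values (synthetic stand-ins), e.g. a peak
# strain from runs using a handful of replicate stress-strain curves.
responses = np.array([0.082, 0.091, 0.087, 0.095, 0.079])

n = len(responses)
k = two_sided_tolerance_factor(n, coverage=0.95, confidence=0.90)
mean, std = responses.mean(), responses.std(ddof=1)

lower, upper = mean - k * std, mean + k * std
print(f"n = {n}, k = {k:.3f}")
print(f"95%-coverage / 90%-confidence tolerance interval: [{lower:.4f}, {upper:.4f}]")
```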
ASME 2018 Verification and Validation Symposium, VVS 2018
When making computational simulation predictions of multi-physics engineering systems, sources of uncertainty in the prediction need to be acknowledged and included in the analysis within the current paradigm of striving for simulation credibility. A thermal analysis of an aerospace geometry was performed at Sandia National Laboratories. For this analysis, a verification, validation, and uncertainty quantification workflow provided structure for the analysis, resulting in the quantification of significant uncertainty sources including spatial numerical error and material property parametric uncertainty. It was hypothesized that the parametric uncertainty and numerical errors were independent and separable for this application. This hypothesis was supported by performing uncertainty quantification simulations at multiple mesh resolutions, while being limited by resources to minimize the number of medium- and high-resolution simulations. Based on this supported hypothesis, a prediction including parametric uncertainty and a systematic mesh bias is used to make a margin assessment that avoids unnecessary uncertainty obscuring the results and optimizes the use of computing resources.
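As a complement to the separability check sketched earlier, the short example below shows one way the final margin assessment described here could be assembled: estimate a systematic mesh bias from nominal runs at three resolutions via Richardson extrapolation, shift the coarse-mesh parametric ensemble by that bias, and compare an upper percentile against a requirement. The temperatures, refinement ratio, and requirement are hypothetical, and Richardson extrapolation is only one possible way to estimate the bias.

```python
import numpy as np

rng = np.random.default_rng(4)

# Nominal (single-parameter-set) peak temperatures at three mesh resolutions [K].
# Values and refinement ratio are hypothetical.
f_fine, f_medium, f_coarse = 351.0, 352.5, 356.0
r = 2.0                                              # uniform grid refinement ratio

# Richardson extrapolation: observed order of convergence and mesh-converged estimate.
p = np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)
f_extrapolated = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
coarse_bias = f_coarse - f_extrapolated              # systematic bias of the coarse mesh

# Parametric uncertainty propagated on the (cheap) coarse mesh, then corrected by the bias.
coarse_uq = rng.normal(loc=f_coarse, scale=8.0, size=200)   # stand-in UQ ensemble
corrected = coarse_uq - coarse_bias

requirement = 400.0                                  # allowable peak temperature [K]
upper_95 = np.percentile(corrected, 95)
margin = requirement - upper_95                      # margin at an upper percentile

print(f"observed order p            = {p:.2f}")
print(f"coarse-mesh bias            = {coarse_bias:+.2f} K")
print(f"95th percentile (corrected) = {upper_95:.1f} K")
print(f"margin to requirement       = {margin:.1f} K")
```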