On the Aggregation and Extrapolation of Uncertainty from Component to System Level Models
53rd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference 2012
High fidelity modeling of complex systems can require large finite element models to capture the physics of interest. These high-order models typically take an excessively long time to run. For important studies such as model validation and uncertainty quantification, where probabilistic measures of the response are required, a large number of simulations of the high-fidelity model with different parameters is necessary. In addition, some environments, such as an extended random vibration excitation, require a long simulation time to capture the entire event. A process that produces a highly efficient model from the original high-order model is necessary to enable these analyses. These highly efficient models are referred to as surrogate models, because their purpose is to represent the physics of primary importance while reducing the computational burden. A critical aspect of any surrogate model is how faithfully the efficient model represents the original high-order model. This paper describes the process for verifying a surrogate model using response quantities of interest and for quantifying the uncertainties introduced by use of the surrogate model. A sequel paper to be submitted continues this work by validating the surrogate model and quantifying margins of uncertainty. © 2012 by the American Institute of Aeronautics and Astronautics, Inc.
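The verification step described above can be sketched in a few lines. The following example is illustrative only: a cheap analytic function stands in for the expensive high-fidelity finite element model, and a simple polynomial response surface plays the role of the surrogate. It checks the surrogate against held-out high-fidelity runs using pointwise error in a scalar response quantity of interest.

```python
import numpy as np

# Hypothetical stand-in for an expensive high-fidelity simulation:
# a cheap analytic function plays the role of the full FE model.
def high_fidelity_response(x):
    return np.sin(3.0 * x) + 0.5 * x**2

# Build a surrogate from a small design of experiments.
x_train = np.linspace(0.0, 2.0, 8)
y_train = high_fidelity_response(x_train)
coeffs = np.polyfit(x_train, y_train, deg=4)   # polynomial surrogate
surrogate = np.poly1d(coeffs)

# Verify the surrogate against held-out runs of the high-fidelity model
# using a response quantity of interest (here, pointwise error).
x_test = np.linspace(0.05, 1.95, 50)
err = np.abs(surrogate(x_test) - high_fidelity_response(x_test))
print(f"max surrogate error: {err.max():.3e}")
print(f"rms surrogate error: {np.sqrt(np.mean(err**2)):.3e}")
```

In practice, error metrics of this kind would be compared against a verification tolerance before the surrogate is accepted for uncertainty propagation.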
Collection of Technical Papers - AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference
This paper discusses the handling and treatment of uncertainties corresponding to relatively few data samples in experimental characterization of random quantities. The importance of this topic extends beyond experimental uncertainty to situations where the derived experimental information is used for model validation or calibration. With very sparse data it is not practical to have a goal of accurately estimating the underlying variability distribution (probability density function, PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a desired percentage of the actual PDF, say 95% included probability, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the random-variable range corresponding to the desired percentage of the actual PDF. The performance of a variety of uncertainty representation techniques is tested and characterized in this paper according to these two opposing objectives. An initial set of test problems and results is presented here from a larger study currently underway.
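As a hedged sketch of the kind of technique evaluated in such a study, the example below uses Howe's approximation for a two-sided normal tolerance interval and then checks by Monte Carlo how reliably the interval bounds 95% of the true population. The sample size, content, and confidence values are illustrative choices, not the paper's test matrix.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p_cover, gamma = 10, 0.95, 0.90     # sample size, content, confidence

# Howe's approximation for the two-sided normal tolerance factor.
nu = n - 1
z = stats.norm.ppf((1.0 + p_cover) / 2.0)
k = z * np.sqrt(nu * (1.0 + 1.0 / n) / stats.chi2.ppf(1.0 - gamma, nu))

# Monte Carlo check: how often does xbar +/- k*s actually bound 95%
# of the true (standard normal) population?
hits, trials = 0, 5000
for _ in range(trials):
    x = rng.standard_normal(n)
    lo, hi = x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)
    content = stats.norm.cdf(hi) - stats.norm.cdf(lo)
    hits += content >= p_cover
print(f"tolerance factor k = {k:.3f}")
print(f"observed reliability = {hits / trials:.3f} (target {gamma})")
```

The two printed quantities correspond directly to the paper's two opposing objectives: the observed reliability measures conservatism, while the size of k measures over-estimation of the random-variable range.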
Collection of Technical Papers - AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference
This paper explores various frameworks to quantify and propagate sources of epistemic and aleatoric uncertainty within the context of decision making for assessing system performance relative to design margins of a complex mechanical system. If sufficient data are available for characterizing aleatoric-type uncertainties, probabilistic methods are commonly used for computing response distribution statistics based on input probability distribution specifications. Conversely, for epistemic uncertainties, data are generally too sparse to support objective probabilistic input descriptions, leading to either subjective probabilistic descriptions (e.g., assumed priors in Bayesian analysis) or non-probabilistic methods based on interval specifications. Among the techniques examined in this work are (1) interval analysis; (2) Dempster-Shafer theory of evidence; (3) second-order probability (SOP) analysis, in which the aleatory and epistemic variables are treated separately and a nested iteration is performed, typically sampling the epistemic variables on an outer loop and the aleatory variables on an inner loop; and (4) a Bayesian approach, in which plausible prior distributions describing the epistemic variables are created and updated using available experimental data. This paper compares the results and the information provided by the different methods to enable decision making in the context of performance assessment when epistemic uncertainty is considered.
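Technique (3), second-order probability, is straightforward to sketch. The example below uses a toy model and toy distributions, not the system from the paper: it samples an epistemic parameter over an interval on the outer loop and an aleatory variable from its distribution on the inner loop, producing an interval-valued exceedance probability (the familiar "horsetail" of CDFs collapsed to one statistic).

```python
import numpy as np

rng = np.random.default_rng(1)

def response(x, theta):
    """Toy model: aleatory input x, epistemic parameter theta."""
    return theta * x**2 + x

# Outer loop: sample the epistemic variable over its interval.
# Inner loop: sample the aleatory variable from its distribution.
n_outer, n_inner = 20, 2000
theta_samples = rng.uniform(0.5, 1.5, n_outer)        # epistemic interval
prob_fail = np.empty(n_outer)
for i, theta in enumerate(theta_samples):
    x = rng.normal(1.0, 0.2, n_inner)                  # aleatory variability
    prob_fail[i] = np.mean(response(x, theta) > 2.5)   # exceedance probability

# The result is an interval-valued probability rather than a single number:
print(f"P(exceed) ranges over [{prob_fail.min():.3f}, {prob_fail.max():.3f}]")
```

Reporting the result as a range, rather than a single probability, is what distinguishes the nested SOP treatment from a purely probabilistic analysis that mixes both uncertainty types in one loop.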
There is currently sparse literature on how to implement systematic and comprehensive processes for modern V&V/UQ (VU) within large computational simulation projects. Important design requirements have been identified in order to construct a viable 'system' of processes. Significant processes that are needed include discovery, accumulation, and assessment. A preliminary design is presented for a VU Discovery process that accounts for an important subset of the requirements. The design uses a hierarchical approach to set context and a series of place-holders that identify the evidence and artifacts that need to be created in order to tell the VU story and to perform assessments. The hierarchy incorporates VU elements from a Predictive Capability Maturity Model and uses questionnaires to define critical issues in VU. The place-holders organize VU data within a central repository that serves as the official VU record of the project. A review process ensures that those who will contribute to the record have agreed to provide the evidence identified by the Discovery process. VU expertise is an essential part of this process and ensures that the roadmap provided by the Discovery process is adequate. Both the requirements and the design were developed to support the Nuclear Energy Advanced Modeling and Simulation Waste project, which is developing a set of advanced codes for simulating the performance of nuclear waste storage sites. The Waste project served as an example to keep the design of the VU Discovery process grounded in practicalities. However, the system is represented abstractly so that it can be applied to other M&S projects.
The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.
This paper compares three approaches for model selection: classical least squares methods, information theoretic criteria, and Bayesian approaches. Least squares methods are not model selection methods per se, although one can select the model that yields the smallest sum-of-squares error. Information theoretic approaches balance goodness of fit against overfitting by combining a log-likelihood term with terms that penalize additional parameters. Bayesian model selection involves calculating the posterior probability that each model is correct, given experimental data and prior probabilities that each model is correct. As part of this calculation, one often calibrates the parameters of each model, and this calibration is included in the Bayesian calculations. Our approach is demonstrated on a structural dynamics example with models for energy dissipation and peak force across a bolted joint. The three approaches are compared, and the influence of the log-likelihood term in all approaches is discussed.
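The information theoretic portion of such a comparison can be sketched as follows. The example uses synthetic data and candidate polynomial models rather than the bolted-joint models from the paper, and counts only the polynomial coefficients as parameters (adding the noise variance would shift every model's score equally). It shows how the log-likelihood term, reflecting goodness of fit, trades off against the parameter-count penalties in AIC and BIC.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data from a "true" quadratic with noise.
x = np.linspace(0.0, 1.0, 30)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.05, x.size)

n = x.size
for deg in (1, 2, 3, 5):                      # candidate polynomial models
    k = deg + 1                                # number of fitted coefficients
    resid = y - np.polyval(np.polyfit(x, y, deg), x)
    sse = float(resid @ resid)
    # Gaussian log-likelihood evaluated at the MLE of the noise variance.
    loglik = -0.5 * n * (np.log(2.0 * np.pi * sse / n) + 1.0)
    aic = 2.0 * k - 2.0 * loglik
    bic = k * np.log(n) - 2.0 * loglik
    print(f"deg={deg}: SSE={sse:.4f}  AIC={aic:.1f}  BIC={bic:.1f}")
```

SSE alone always favors the highest-order model, while AIC and BIC typically recover the quadratic, illustrating the role of the penalty terms discussed in the abstract.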
There is an increasing need to assess the performance of high-consequence systems using a modeling and simulation based approach. Central to this approach are the needs to quantify the uncertainties present in the system and to compare the system response to an expected performance measure. At Sandia National Laboratories, this process is referred to as quantification of margins and uncertainties, or QMU. Depending on the outcome of the assessment, there may be a need to increase the confidence in the predicted response of a system model, and thus a need to understand where resources should be allocated to increase this confidence. This paper examines the problem of resource allocation within the context of QMU. An optimization-based approach to solving the resource allocation problem is considered, and sources of aleatoric and epistemic uncertainty are included in the calculations.
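A hedged sketch of such an optimization-based allocation is shown below. The variance-reduction model (exponential diminishing returns per unit of resource, with made-up rates and a made-up budget) is purely illustrative and is not the formulation from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical setup: three uncertainty sources whose variance contributions
# shrink as resources are spent characterizing them (diminishing returns).
base_var = np.array([4.0, 2.0, 1.0])     # current variance contributions
rate = np.array([1.0, 0.5, 0.25])        # reduction rate per unit spent
budget = 5.0

def total_uncertainty(alloc):
    # Variance remaining after spending alloc[i] on source i.
    return float(np.sum(base_var * np.exp(-rate * alloc)))

res = minimize(
    total_uncertainty,
    x0=np.full(3, budget / 3.0),
    bounds=[(0.0, budget)] * 3,
    constraints={"type": "eq", "fun": lambda a: a.sum() - budget},
)
print("optimal allocation:", np.round(res.x, 2))
print(f"residual uncertainty: {res.fun:.3f}")
```

The optimizer spends more of the budget on sources with large variance and fast reduction rates, which is the intuition behind allocating resources to increase confidence in the predicted response.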
Conference Proceedings of the Society for Experimental Mechanics Series
Accurate material models are fundamental to predictive structural finite element models. Because potting foams are routinely used to mitigate shock and vibration of encapsulated components in electromechanical systems, accurate material models of foams are needed. A linear-viscoelastic foam constitutive model has been developed to represent the foam's stiffness and damping throughout an application space defined by temperature, strain rate or frequency, and strain level. Validation of this linear-viscoelastic model, which is integrated into the Salinas structural dynamics code, is being achieved by modeling and testing a series of structural geometries of increasing complexity that have been designed to ensure sensitivity to material parameters. Both experimental and analytical uncertainties are being quantified to ensure the fair assessment of model validity. Quantitative model validation metrics are being developed to provide a means of comparison for analytical model predictions to observations made in the experiments. This paper is one of several recent papers documenting the validation process for simple to complex structures with foam encapsulated components. This paper specifically focuses on model validation over a wide temperature range, using a simple dumbbell structure for modal testing and simulation. Material variations of density and modulus have been included. A double-blind validation process is described that brings together test data with model predictions.
Proceedings of the 2006 SEM Annual Conference and Exposition on Experimental and Applied Mechanics 2006
Accurate material models are fundamental to predictive structural finite element models. Because potting foams are routinely used to mitigate shock and vibration of encapsulated components in mechanical systems, accurate material models of foams are needed. A linear-viscoelastic foam constitutive model has been developed to represent the foam's stiffness and damping throughout an application space defined by temperature, strain rate or frequency, and strain level. Validation of this linear-viscoelastic model, which is integrated into the Salinas structural dynamics code, is achieved by modeling and testing a series of structural geometries of increasing complexity that have been designed to ensure sensitivity to material parameters. Both experimental and analytical uncertainties are being quantified to ensure the fair assessment of model validity. Quantitative model validation metrics are being developed to provide a means of comparison for analytical model predictions to observations made in the experiments. This paper is one of several parallel papers documenting the validation process for simple to complex structures with foam encapsulated components. This paper describes the development of a linear-viscoelastic constitutive model for EF-AR20 epoxy foam with density, modulus, and damping uncertainties, and applies the model to the simplest of the series of foam/component structural geometries for the calibration and validation of the constitutive model.
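Calibration of such a model can be sketched as a least-squares fit of Prony parameters to measured modulus data. In the example below the "measured" storage modulus is synthetic placeholder data, not EF-AR20 measurements, and a one-term Prony series is assumed for brevity.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "measured" storage modulus over frequency (placeholder data
# standing in for master-curve measurements of the foam).
w = np.logspace(0, 4, 40)                       # frequency, rad/s
E_meas = 2.0e6 + 6.0e6 * (w * 0.01) ** 2 / (1.0 + (w * 0.01) ** 2)

def storage_modulus(params, w):
    # One-term Prony storage modulus: E_inf + E1*(w*tau)^2 / (1 + (w*tau)^2)
    E_inf, E1, tau1 = params
    wt = w * tau1
    return E_inf + E1 * wt**2 / (1.0 + wt**2)

def residual(params):
    # Relative residuals keep the fit balanced across decades of frequency.
    return (storage_modulus(params, w) - E_meas) / E_meas

fit = least_squares(residual, x0=[1.0e6, 1.0e6, 1.0e-3],
                    bounds=([0.0, 0.0, 1e-6], [1e8, 1e8, 1e2]))
print("calibrated [E_inf, E1, tau1]:", fit.x)
```

In a real calibration the fit would be repeated over specimens with varying density and modulus, so that the scatter in the recovered parameters feeds the uncertainty quantification described above.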