Validation Assessment of Hypersonic Double-Cone Flow Simulations using UQ Sensitivity Analysis and Validation Metrics
The SPARC (Sandia Parallel Aerodynamics and Reentry Code) will provide nuclear weapon qualification evidence for the random vibration and thermal environments created by re-entry of a warhead into the Earth's atmosphere. SPARC incorporates the innovative approaches of ATDM projects on several fronts, including effective harnessing of heterogeneous compute nodes using Kokkos, exascale-ready parallel scalability through asynchronous multi-tasking, uncertainty quantification through Sacado integration, implementation of state-of-the-art reentry physics and multiscale models, use of advanced verification and validation methods, and improved workflows for users. SPARC is being developed primarily for the Department of Energy nuclear weapon program, with additional development and use of the code being supported by the Department of Defense for conventional weapons programs.
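The embedded UQ mentioned above relies on Sacado, the Trilinos package that propagates parameter sensitivities through a C++ solver via operator-overloaded forward-mode automatic differentiation. As a language-agnostic illustration of that idea only, and not of SPARC's implementation, the sketch below carries a derivative alongside a value with a minimal dual-number type; the response function and its parameters are hypothetical.

```python
# Minimal forward-mode AD sketch: a dual number carries a value and its
# derivative with respect to one seeded parameter, so a sensitivity comes
# out of a single model evaluation. (Illustrative only; SPARC uses the
# C++ Sacado package for embedded sensitivities.)
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value      # function value
        self.deriv = deriv      # d(value)/d(seeded parameter)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

    def __pow__(self, n):
        return Dual(self.value ** n, n * self.value ** (n - 1) * self.deriv)


def surface_heat_flux(catalytic_eff, wall_temp):
    """Hypothetical re-entry response; stands in for a solver output."""
    return 0.5 * catalytic_eff * wall_temp ** 2 + 3.0 * catalytic_eff


# Seed the first parameter's derivative with 1.0 to obtain
# d(flux)/d(catalytic_eff) at the nominal point.
q = surface_heat_flux(Dual(0.9, 1.0), Dual(1800.0, 0.0))
print(q.value, q.deriv)
```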
Sandia journal manuscript; not yet accepted for publication.
The overall conduct of verification, validation, and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling, including the turbulence problem in the coarse-grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration, and UQ, as well as the difference between UQ and sensitivity analysis. The discussion is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.
The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate the completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification associated with an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are (1) to communicate computational simulation capability accurately and transparently, and (2) to develop input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal of providing more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.
Computers and Fluids
Uncertainty quantification (UQ) deals with providing reasonable estimates of the uncertainties associated with an engineering model and propagating them to final engineering quantities of interest. We present a conceptual UQ framework for the case of shock hydrodynamics with Euler's equations, where the uncertainties are assumed to lie principally in the equation of state (EOS). In this paper we treat experimental data as providing both the data themselves and an estimate of the data uncertainty. We propose a specific Bayesian inference approach for characterizing EOS uncertainty in thermodynamic phase space. We show how this approach provides a natural and efficient methodology for transferring data uncertainty to engineering outputs through an EOS representation that understands and deals consistently with parameter correlations as sensed in the data. Historically, complex multiphase EOSs have been built using tables as the delivery mechanism in order to amortize the cost of creating the tables over many subsequent continuum-scale runs. Once UQ enters the picture, however, the proper operational paradigm for multiphase tables becomes much less clear. Using a simple single-phase Mie-Grüneisen model, we experiment with several approaches and demonstrate how uncertainty can be represented. We also show how the quality of the tabular representation is of key importance. As a first step, we demonstrate a particular tabular approach for the Mie-Grüneisen model which, when extended to multiphase tables, should have value for designing a UQ-enabled shock hydrodynamic modeling approach that is not only theoretically sound but also robust, useful, and acceptable to the modeling community. We also propose an approach to separate data uncertainty from modeling error in the EOS. © 2012 Elsevier Ltd.
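To make the single-phase case concrete, the sketch below evaluates a Mie-Grüneisen pressure with a linear Us-up Hugoniot reference curve and pushes Monte Carlo samples of correlated (c0, s, Γ0) through it. The closed form assumes Γρ = Γ0ρ0; the aluminum-like parameter values and the Gaussian correlation structure are illustrative stand-ins, not the calibrated Bayesian posterior or the tabular representation discussed in the paper.

```python
# Monte Carlo propagation of EOS-parameter uncertainty through a
# single-phase Mie-Gruneisen model (illustrative parameters only).
import numpy as np

rho0 = 2700.0          # reference density, kg/m^3 (aluminum-like)


def mie_gruneisen_pressure(rho, e, c0, s, gamma0):
    """Pressure from a Mie-Gruneisen EOS with a linear Us-up Hugoniot
    reference curve, assuming gamma*rho = gamma0*rho0."""
    chi = 1.0 - rho0 / rho                       # compression measure
    p_ref = rho0 * c0**2 * chi * (1.0 - 0.5 * gamma0 * chi) / (1.0 - s * chi)**2
    return p_ref + gamma0 * rho0 * e


# Illustrative correlated uncertainty in (c0, s, gamma0); in the paper this
# role is played by a posterior informed by Hugoniot data.
mean = np.array([5350.0, 1.34, 2.0])             # c0 [m/s], s [-], gamma0 [-]
cov = np.array([[50.0**2, -0.6 * 50.0 * 0.02, 0.0],
                [-0.6 * 50.0 * 0.02, 0.02**2, 0.0],
                [0.0, 0.0, 0.1**2]])

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean, cov, size=20000)

# Propagate to pressure at a fixed compressed state (rho, e).
rho, e = 3200.0, 2.0e5                           # kg/m^3, J/kg
p = mie_gruneisen_pressure(rho, e, samples[:, 0], samples[:, 1], samples[:, 2])
print(f"pressure: mean = {p.mean():.3e} Pa, std = {p.std():.3e} Pa")
```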
Reliability Engineering and System Safety
Sensitivity analysis comprises techniques for quantifying the effects of input variables on a set of outputs. In particular, sensitivity indices can be used to infer which input parameters most significantly affect the results of a computational model. With continually increasing computing power, sensitivity analysis has become an important technique for understanding the behavior of large-scale computer simulations. Many sensitivity analysis methods rely on sampling from distributions of the inputs. Such sampling-based methods can be computationally expensive, requiring many evaluations of the simulation; in this case, the Sobol method provides an easy and accurate way to compute variance-based measures, provided a sufficient number of model evaluations is available. As an alternative, meta-modeling approaches have been devised to approximate the response surface and estimate various measures of sensitivity. In this work, we consider a variety of sensitivity analysis methods, including different sampling strategies, different meta-models, and different ways of evaluating variance-based sensitivity indices. The problem we consider is the 1-D Riemann problem. By a careful choice of inputs, discontinuous solutions are obtained, leading to discontinuous response surfaces; such surfaces can be particularly problematic for meta-modeling approaches. The goal of this study is to compare the estimated sensitivity indices with exact values and to evaluate the convergence of these estimates with increasing sample sizes and increasing numbers of meta-model evaluations. © 2011 Elsevier Ltd. All rights reserved.
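As a small illustration of the sampling-based route, the sketch below estimates first-order and total-effect Sobol indices with the Saltelli pick-and-freeze scheme on a toy response with a jump in its first input, mimicking the kind of discontinuous surface the Riemann problem produces. The model, dimensions, and sample sizes are illustrative choices, not those of the study.

```python
# Saltelli-style Monte Carlo estimates of first-order (S_i) and total (S_Ti)
# Sobol indices for a toy discontinuous response surface.
import numpy as np


def model(x):
    """Toy response with a jump in x0, echoing a Riemann-type discontinuity."""
    return np.where(x[:, 0] > 0.5, 10.0, 0.0) + 2.0 * x[:, 1] + 0.1 * x[:, 2]


def sobol_indices(f, dim, n, rng):
    # Two independent input sample blocks on [0, 1]^dim.
    A = rng.random((n, dim))
    B = rng.random((n, dim))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]), ddof=1)

    S, ST = np.empty(dim), np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace column i of A with column i of B
        fABi = f(ABi)
        S[i] = np.mean(fB * (fABi - fA)) / var           # first-order index
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var    # total-effect index
    return S, ST


rng = np.random.default_rng(1)
S, ST = sobol_indices(model, dim=3, n=100_000, rng=rng)
print("first-order:", np.round(S, 3))
print("total      :", np.round(ST, 3))
```

The jump in the first input dominates the output variance, so its estimated first-order index should be close to one while the remaining inputs contribute little; comparing such estimates against known values is exactly the convergence exercise described above.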