
Conference Poster

Validation assessment of hypersonic double-cone flow simulations using uncertainty quantification, sensitivity analysis, and validation metrics

Kieweg, Sarah K.; Ray, Jaideep R.; Weirs, V.G.; Carnes, Brian C.; Dinzl, Derek J.; Freno, Brian A.; Howard, Micah A.; Phipps, Eric T.; Rider, William J.; Smith, Thomas M.

This is the second of three related conference papers focused on verifying and validating a CFD model for laminar hypersonic flows. The first paper covers the code verification and solution verification activities. In this paper, we investigate whether the model can accurately simulate laminar, hypersonic flows over double cones measured in experiments conducted in CUBRC’s LENS-I and LENS-XX wind tunnels. The approach is to use uncertainty quantification and sensitivity analysis, together with a careful examination of experimental uncertainties, to perform validation assessments. The validation assessments use metrics that probabilistically incorporate both parametric (i.e., freestream input) uncertainty and experimental uncertainty. Further assessments compare these uncertainties to the iterative and convergence uncertainties described in the first paper. As other researchers have found, the LENS-XX simulations under-predict the experimental heat-flux measurements in the laminar, attached region of the fore-cone. This under-prediction is observed both for a deterministic simulation and for a probabilistic ensemble of simulations derived from CUBRC-provided estimates of freestream-condition uncertainty. The paper concludes with possible reasons why the simulations cannot bracket the experimental observations, motivating the third paper in the series, which examines these explanations further. The results emphasize the importance of careful measurement of experimental conditions and of uncertainty quantification for validation experiments. This study, together with its sister papers, also demonstrates a process of verification, uncertainty quantification, and quantitative validation for building and assessing the credibility of computational simulations.
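
The abstract does not define the validation metric itself; as one illustration of the kind of probabilistic comparison it describes, the sketch below computes an area-type validation metric between an ensemble of simulated heat-flux values (representing sampled freestream-input uncertainty) and experimental measurements with their reported uncertainty. This is a generic example under stated assumptions, not the paper's method; all function names and data are hypothetical.

```python
# Hedged sketch: an area-type validation metric comparing a simulation
# ensemble (parametric/freestream uncertainty) against experimental
# measurements with their uncertainty. Data and names are illustrative,
# not taken from the paper.
import numpy as np


def empirical_cdf(samples):
    """Return sorted sample values and the corresponding empirical CDF steps."""
    x = np.sort(np.asarray(samples, dtype=float))
    p = np.arange(1, x.size + 1) / x.size
    return x, p


def area_validation_metric(sim_samples, exp_samples):
    """Area between the two empirical CDFs, in the units of the quantity of interest.

    Smaller values indicate closer agreement between the simulation ensemble
    and the experimentally observed distribution.
    """
    grid = np.union1d(sim_samples, exp_samples)
    xs, _ = empirical_cdf(sim_samples)
    xe, _ = empirical_cdf(exp_samples)
    # Step-function CDF values evaluated on the merged grid.
    cdf_sim = np.searchsorted(xs, grid, side="right") / xs.size
    cdf_exp = np.searchsorted(xe, grid, side="right") / xe.size
    # Both CDFs are piecewise constant between grid points, so summing
    # |F_sim - F_exp| times the interval widths integrates the gap exactly.
    widths = np.diff(grid)
    return float(np.sum(np.abs(cdf_sim - cdf_exp)[:-1] * widths))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical heat-flux ensembles in W/cm^2: simulations drawn from
    # sampled freestream conditions, experiment represented by its
    # measurement plus reported uncertainty.
    sim = rng.normal(loc=42.0, scale=2.5, size=200)
    exp = rng.normal(loc=47.0, scale=1.5, size=50)
    print(f"Area validation metric: {area_validation_metric(sim, exp):.2f} W/cm^2")
```

A metric of this form captures the systematic under-prediction described above: if the simulation ensemble sits entirely below the experimental distribution, the area between the two CDFs remains large no matter how the parametric uncertainty is widened within its estimated bounds.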