Publication Details
Quantifying model prediction sensitivity to model-form uncertainty
Portone, Teresa; White, Rebekah D.; Rosso, Haley; Bandy, Rileigh J.; Hart, Joseph L.
Computational and mathematical models are essential to understanding complex systems and phenomena. However, when developing such models, limited knowledge and/or resources necessitate the use of simplifying assumptions. It is therefore crucial to quantify the impact of such simplifying assumptions on the reliability and accuracy of resulting model predictions. This work develops a first-of-its-kind approach to quantify the impact of physics modeling assumptions on predictions. We leverage model-form uncertainty (MFU) representations, which are parameterized modifications to modeling assumptions, in combination with grouped Sobol’ indices to quantitatively measure an assumption’s importance. Specifically, we compute the grouped Sobol’ index for the MFU representation’s parameters as a single importance measure of the assumption whose uncertainty the MFU representation characterizes. To ensure this approach is robust to the subjective choice of how to parameterize an MFU representation, we bound the difference between sensitivity results for two different MFU representations in terms of differences in model prediction statistics. We demonstrate the approach on three exemplar problems: an upscaled subsurface contaminant transport problem, ablation modeling for hypersonic flight, and nuclear waste repository modeling. We find that the grouped approach assesses the impact of modeling assumptions on predictions and offers computational advantages over classical Sobol’ index computation while providing more interpretable results.
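To illustrate the core quantity the abstract describes, the sketch below estimates a grouped first-order Sobol’ index with a standard pick-freeze Monte Carlo scheme. This is not the authors’ code: the toy model `f`, the input dimension, and the choice of which inputs play the role of the MFU representation’s parameters are all illustrative assumptions; only the grouped pick-freeze estimator itself is standard.

```python
# Minimal sketch: grouped first-order Sobol' index via pick-freeze
# Monte Carlo. Everything model-specific here is hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Toy model; columns 2 and 3 stand in for MFU parameters."""
    return np.sin(x[:, 0]) + 0.7 * x[:, 1] ** 2 + 0.4 * x[:, 2] * x[:, 3]

n, dim = 100_000, 4
group = [2, 3]                       # indices of the grouped MFU parameters

A = rng.uniform(-1.0, 1.0, (n, dim))  # two independent sample matrices
B = rng.uniform(-1.0, 1.0, (n, dim))

C = B.copy()                         # pick-freeze matrix: group columns
C[:, group] = A[:, group]            # from A, all other columns from B

fA, fC = f(A), f(C)
# Grouped first-order index: Var(E[f | X_group]) / Var(f),
# estimated as Cov(f(A), f(C)) / Var(f(A)).
S_group = np.cov(fA, fC)[0, 1] / np.var(fA)
print(f"grouped first-order Sobol' index: {S_group:.3f}")
```

Treating the MFU parameters as a single group, as above, yields one importance measure per assumption and requires pick-freeze evaluations per group rather than per parameter, which is the computational advantage the abstract notes over classical per-parameter Sobol’ index computation.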