Publications

Results 1–25 of 210

Democratizing uncertainty quantification

Journal of Computational Physics

Seelinger, Linus; Reinarz, Anne; Lykkegaard, Mikkel B.; Alghamdi, Amal M.A.; Aristoff, David; Bangerth, Wolfgang; Benezech, Jean; Diez, Matteo; Frey, Kurt; Jakeman, John D.; Jorgensen, Jakob S.; Kim, Ki-Tae; Martinelli, Massimiliano; Parno, Matthew; Pellegrini, Riccardo; Petra, Noemi; Riis, Nicolai A.B.; Rosenfeld, Katherine; Serani, Andrea; Tamellini, Lorenzo; Villa, Umberto; Dodwell, Tim J.; Scheichl, Robert

Uncertainty Quantification (UQ) is vital to safety-critical model-based analyses, but the widespread adoption of sophisticated UQ methods is limited by technical complexity. In this paper, we introduce UM-Bridge (the UQ and Modeling Bridge), a high-level abstraction and software protocol that facilitates universal interoperability of UQ software with simulation codes. It breaks down the technical complexity of advanced UQ applications and enables separation of concerns between experts. UM-Bridge democratizes UQ by allowing effective interdisciplinary collaboration, accelerating the development of advanced UQ methods, and making it easy to perform UQ analyses from prototype to High Performance Computing (HPC) scale. In addition, we present a library of ready-to-run UQ benchmark problems, all easily accessible through UM-Bridge. These benchmarks support UQ methodology research, enabling reproducible performance comparisons. We demonstrate UM-Bridge with several scientific applications, harnessing HPC resources even using UQ codes not designed with HPC support.
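
The separation of concerns the abstract describes can be sketched with a plain callable interface: the UQ side sees a model only as "parameters in, outputs out". This is a hypothetical illustration of the idea only, not UM-Bridge's actual HTTP-based protocol, and the names (`diffusion_model`, `monte_carlo_mean`) are invented for the sketch.

```python
import random

def diffusion_model(parameters):
    """Stand-in 'simulation code': maps a parameter list to an output list."""
    k, s = parameters
    return [s / (1.0 + k)]  # toy quantity of interest

def monte_carlo_mean(model, sample_parameters, n_samples):
    """Generic UQ client: needs nothing from the model but the call interface,
    so any UQ algorithm can drive any simulation code behind it."""
    total = 0.0
    for _ in range(n_samples):
        total += model(sample_parameters())[0]
    return total / n_samples

rng = random.Random(0)
# Estimate E[s/(1+k)] for k ~ U(0,1), s = 1; the exact value is ln(2) ≈ 0.693.
mean = monte_carlo_mean(diffusion_model,
                        lambda: [rng.uniform(0.0, 1.0), 1.0],
                        n_samples=20000)
```

In the actual library, a model served over the network is wrapped in a client object that is invoked the same way, which is what lets UQ packages and simulation codes interoperate without sharing a process or a language.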

Assessing convergence in global sensitivity analysis: a review of methods for assessing and monitoring convergence

Socio-Environmental Systems Modelling

Sun, Xifu; Jakeman, Anthony J.; Croke, Barry F.W.; Roberts, Stephen G.; Jakeman, John D.

In global sensitivity analysis (GSA) of a model, a proper convergence analysis of metrics is essential for ensuring a level of confidence or trustworthiness in sensitivity results obtained, yet is somewhat deficient in practice. The level of confidence in sensitivity measures, particularly in relation to their influence and support for decisions from scientific, social and policy perspectives, is heavily reliant on the convergence of GSA. We review the literature and summarize the available methods for monitoring and assessing convergence of sensitivity measures based on application purposes. The aim is to expose the various choices for convergence assessment and encourage further testing of available methods to clarify their level of robustness. Furthermore, the review identifies a pressing need for comparative studies on convergence assessment methods to establish a clear hierarchy of effectiveness and encourages the adoption of systematic approaches for enhanced robustness in sensitivity analysis.
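
The review surveys many such convergence checks; as a generic, hypothetical illustration (not taken from the paper), one simple monitoring scheme tracks a sensitivity estimate as samples accumulate and stops once successive estimates agree to a tolerance. The "sensitivity measure" here, a squared correlation between one input and the output, is purely a stand-in.

```python
import random

def model(x1, x2):
    return 3.0 * x1 + 0.5 * x2  # x1 should dominate

def sensitivity_estimate(samples):
    """Squared Pearson correlation of x1 with the output (stand-in measure)."""
    n = len(samples)
    xs = [s[0] for s in samples]
    ys = [model(*s) for s in samples]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov * cov / (vx * vy)

rng = random.Random(1)
samples, history = [], []
for block in range(20):
    # grow the sample in blocks and re-estimate after each block
    samples.extend((rng.random(), rng.random()) for _ in range(200))
    history.append(sensitivity_estimate(samples))
    # relative-change stopping rule between successive estimates
    if len(history) > 1 and abs(history[-1] - history[-2]) < 1e-3 * abs(history[-1]):
        break

converged = history[-1]
```

A relative-change rule like this is cheap but can stop prematurely; the review's point is precisely that such choices deserve systematic comparison.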

Progressive reduced order modeling: from single-phase flow to coupled multiphysics processes

58th US Rock Mechanics / Geomechanics Symposium 2024, ARMA 2024

Kadeethum, Teeratorn; Chang, Kyung W.; Jakeman, John D.; Yoon, Hongkyu

This study introduces the Progressive Improved Neural Operator (p-INO) framework, aimed at advancing machine-learning-based reduced-order models within geomechanics for underground resource optimization and carbon sequestration applications. The p-INO method transcends traditional transfer-learning limitations through progressive learning, enhancing the capability to transfer knowledge from many sources. Through numerical experiments, the performance of p-INO is benchmarked against standard Improved Neural Operators (INO) in scenarios with varying data availability (different numbers of training samples). The research uses simulation data reflecting single-phase, two-phase, and two-phase flow with mechanics scenarios inspired by the Illinois Basin Decatur Project. Results reveal that p-INO significantly surpasses conventional INO models in accuracy, particularly in data-constrained environments. Moreover, adding more a priori information (more trained models used by p-INO) further enhances the process. These experiments demonstrate p-INO's robustness in leveraging sparse datasets for precise predictions across complex subsurface physics scenarios. The findings underscore the potential of p-INO to revolutionize predictive modeling in geomechanics, presenting a substantial improvement in computational efficiency and accuracy for large-scale subsurface simulations.

PyApprox: A software package for sensitivity analysis, Bayesian inference, optimal experimental design, and multi-fidelity uncertainty quantification and surrogate modeling

Environmental Modelling and Software

Jakeman, John D.

PyApprox is a Python-based one-stop shop for probabilistic analysis of numerical models such as those used in the earth, environmental, and engineering sciences. Easy-to-use and extendable tools are provided for constructing surrogates, sensitivity analysis, Bayesian inference, experimental design, and forward uncertainty quantification. The algorithms implemented represent a wide range of methods for model analysis developed over the past two decades, including recent advances in multi-fidelity approaches that use multiple model discretizations and/or simplified physics to significantly reduce the computational cost of various types of analyses. An extensive set of benchmarks from the literature is also provided to facilitate easy comparison of new or existing algorithms for a wide range of model analyses. This paper introduces PyApprox and its various features, and presents results demonstrating the utility of PyApprox on a benchmark problem modeling the advection of a tracer in groundwater.

Multifidelity uncertainty quantification with models based on dissimilar parameters

Computer Methods in Applied Mechanics and Engineering

Zeng, Xiaoshu; Geraci, Gianluca; Eldred, Michael; Jakeman, John D.; Gorodetsky, Alex A.; Ghanem, Roger

Multifidelity uncertainty quantification (MF UQ) sampling approaches have been shown to significantly reduce the variance of statistical estimators while preserving the bias of the highest-fidelity model, provided that the low-fidelity models are well correlated. However, maintaining a high level of correlation can be challenging, especially when models depend on different input uncertain parameters, which drastically reduces the correlation. Existing MF UQ approaches do not adequately address this issue. In this work, we propose a new sampling strategy that exploits a shared space to improve the correlation among models with dissimilar parameterization. We achieve this by transforming the original coordinates onto an auxiliary manifold using the adaptive basis (AB) method (Tipireddy and Ghanem, 2014). The AB method has two main benefits: (1) it provides an effective tool to identify the low-dimensional manifold on which each model can be represented, and (2) it enables easy transformation of polynomial chaos representations from high- to low-dimensional spaces. This latter feature is used to identify a shared manifold among models without requiring additional evaluations. We present two algorithmic flavors of the new estimator to cover different analysis scenarios, including those with legacy and non-legacy high-fidelity (HF) data. We provide numerical results for analytical examples, a direct field acoustic test, and a finite element model of a nuclear fuel assembly. For all examples, we compare the proposed strategy against both single-fidelity and MF estimators based on the original model parameterization.
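
The variance-reduction mechanism the abstract builds on can be illustrated with a minimal two-fidelity control-variate estimator. This is a generic sketch, not the paper's shared-manifold estimator: the model functions, sample sizes, and the choice of control-variate weight (fixed at 1 for brevity) are all invented for the example.

```python
import random

def f_hi(x):
    return x * x + 0.1 * x          # "expensive" high-fidelity model

def f_lo(x):
    return x * x                    # cheap, well-correlated low-fidelity model

rng = random.Random(2)

# Many cheap evaluations pin down the low-fidelity mean accurately.
mu_lo = sum(f_lo(rng.uniform(0.0, 1.0)) for _ in range(200000)) / 200000

# Few expensive evaluations, corrected by the low-fidelity control variate
# evaluated at the *same* inputs (shared sample space).
n_hi = 500
xs = [rng.uniform(0.0, 1.0) for _ in range(n_hi)]
plain = sum(f_hi(x) for x in xs) / n_hi
cv = plain - (sum(f_lo(x) for x in xs) / n_hi - mu_lo)   # weight alpha = 1
```

The correction only helps when the two models are well correlated on shared inputs, which is exactly the property the paper's adaptive-basis transformation is designed to restore when the models have dissimilar parameterizations.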

A Decision-Relevant Factor-Fixing Framework: Application to Uncertainty Analysis of a High-Dimensional Water Quality Model

Water Resources Research

Wang, Qian; Guillaume, Joseph H.A.; Jakeman, John D.; Bennett, Frederick R.; Croke, Barry F.W.; Fu, Baihua; Yang, Tao; Jakeman, Anthony J.

Factor Fixing (FF) is a common method for reducing the number of model parameters to lower computational cost. FF typically starts by distinguishing the insensitive parameters from the sensitive ones, fixing each insensitive parameter at a nominal value, and then pursuing uncertainty quantification (UQ) on the resulting reduced-order model. There is a need, however, to expand this common approach to consider the effects of decision choices in the FF-UQ procedure on metrics of interest. Therefore, to guide the use of FF and increase confidence in the resulting dimension-reduced model, we propose a new adaptive framework consisting of four principles: (a) re-parameterize the model first to reduce obvious non-identifiable parameter combinations, (b) focus on decision relevance, especially with respect to errors in quantities of interest (QoI), (c) conduct adaptive evaluation and robustness assessment of errors in the QoI across FF choices as sample size increases, and (d) reconsider whether fixing is warranted. The framework is demonstrated on a spatially distributed water quality model. The error in estimates of QoI caused by FF can be estimated using a Polynomial Chaos Expansion (PCE) surrogate model. Built with 70 model runs, the surrogate is computationally inexpensive to evaluate and provides global sensitivity indices for free. For the selected catchment, just two factors may provide an acceptably accurate estimate of model uncertainty in the average annual load of Total Suspended Solids (TSS), suggesting that reducing the uncertainty in these two parameters is a priority for future work before undertaking further formal uncertainty quantification.
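
The "sensitivity indices for free" property of a PCE surrogate follows from the orthonormality of its basis: first-order Sobol indices are ratios of squared coefficients. The sketch below is not the paper's model or code; the toy model, the basis truncation, and the use of NumPy least squares are all assumptions, with only the 70-run budget borrowed from the abstract.

```python
import numpy as np

def legendre(x, degree):
    # orthonormal Legendre polynomials on [-1, 1] w.r.t. the uniform measure
    polys = {0: np.ones_like(x),
             1: np.sqrt(3.0) * x,
             2: np.sqrt(5.0) * 0.5 * (3.0 * x * x - 1.0)}
    return polys[degree]

def model(x):                       # toy model in which x1 dominates
    return x[:, 0] + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(70, 2))        # 70 model runs
y = model(x)

# tensor-product basis terms (degree in x1, degree in x2), total degree <= 2
terms = [(0, 0), (1, 0), (2, 0), (0, 1), (0, 2), (1, 1)]
A = np.column_stack([legendre(x[:, 0], i) * legendre(x[:, 1], j)
                     for i, j in terms])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares PCE fit

variance = np.sum(coef[1:] ** 2)                # variance from non-constant terms
S1 = sum(coef[k] ** 2 for k, (i, j) in enumerate(terms)
         if i > 0 and j == 0) / variance        # first-order index of x1
S2 = sum(coef[k] ** 2 for k, (i, j) in enumerate(terms)
         if j > 0 and i == 0) / variance        # first-order index of x2
```

Because the toy model lies in the span of the basis, the fit is exact and the indices match their analytical values, which is what makes FF decisions (here, fixing x2) cheap to assess once the surrogate is built.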

Improving Bayesian networks multifidelity surrogate construction with basis adaptation

AIAA SciTech Forum and Exposition, 2023

Zeng, Xiaoshu; Geraci, Gianluca; Gorodetsky, Alex A.; Jakeman, John D.; Eldred, Michael; Ghanem, Roger

Surrogate construction is an essential component of all non-deterministic analyses in science and engineering. The efficient construction of easier and cheaper-to-run alternatives to a computationally expensive code paves the way for outer-loop workflows for forward and inverse uncertainty quantification and optimization. Unfortunately, the accurate construction of a surrogate often still requires a prohibitive number of computations, making the approach unattainable for large-scale and high-fidelity applications. Multifidelity approaches offer the possibility of lowering the computational expense required of the high-fidelity code by fusing data from additional sources. In this context, we have demonstrated that multifidelity Bayesian networks (MFNets) can efficiently fuse information derived from models with an underlying complex dependency structure. In this contribution, we expand on our previous work by adopting a basis adaptation procedure for the selection of the linear model representing each data source. Our numerical results demonstrate that this procedure is computationally advantageous because it can maximize the use of limited data to learn and exploit the important structures shared among models. Two examples demonstrate the benefits of the proposed approach: an analytical problem and a nuclear fuel finite element assembly. In these two applications, a reduced dependency of MFNets on the model graph was also observed.
