Nonlinear sparse Bayesian learning using an EnKF-based state estimator
Abstract not provided.
The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.1.1 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
Journal of Computational Physics
This paper addresses the issue of overfitting while calibrating unknown parameters of over-parameterized physics-based models with noisy and incomplete observations. A semi-analytical Bayesian framework of nonlinear sparse Bayesian learning (NSBL) is proposed to identify sparsity among model parameters during Bayesian inversion. NSBL offers significant advantages over the machine learning algorithm of sparse Bayesian learning (SBL) for physics-based models: 1) the likelihood function or the posterior parameter distribution is not required to be Gaussian, and 2) prior parameter knowledge is incorporated into sparse learning (i.e., not all parameters are treated as questionable). NSBL employs the concept of automatic relevance determination (ARD) to facilitate sparsity among questionable parameters through parameterized prior distributions. The analytical tractability of NSBL is enabled by employing Gaussian ARD priors and by building a Gaussian mixture-model approximation of the posterior parameter distribution that excludes the contribution of the ARD priors. Subsequently, type-II maximum likelihood is executed using Newton's method, whereby the evidence and its gradient and Hessian information are computed in a semi-analytical fashion. We show numerically and analytically that SBL is a special case of NSBL for linear regression models. Subsequently, a linear regression example involving multimodality in both the parameter posterior pdf and the model evidence is considered to demonstrate the performance of NSBL in cases where SBL is inapplicable. Next, NSBL is applied to identify sparsity among the damping coefficients of a mass-spring-damper model of a shear building frame. These numerical studies demonstrate the robustness and efficiency of NSBL in alleviating overfitting during Bayesian inversion of nonlinear physics-based models.
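The linear-regression special case noted above, where NSBL reduces to SBL, can be illustrated with a minimal sketch of ARD-based type-II maximum likelihood. This is a generic evidence-maximization loop using MacKay-style fixed-point updates, not the paper's NSBL implementation; the function name, iteration count, and pruning cap are illustrative assumptions.

```python
import numpy as np

def sbl_ard(Phi, y, sigma2, n_iter=200, alpha_max=1e6):
    """Sparse Bayesian learning for y = Phi @ w + noise (variance sigma2):
    type-II maximum likelihood over Gaussian ARD precisions alpha_i."""
    n, m = Phi.shape
    alpha = np.ones(m)                       # one ARD hyperparameter per weight
    for _ in range(n_iter):
        # Gaussian posterior over weights given the current hyperparameters
        Sigma = np.linalg.inv(np.diag(alpha) + Phi.T @ Phi / sigma2)
        mu = Sigma @ Phi.T @ y / sigma2
        # MacKay fixed-point update: alpha_i = gamma_i / mu_i^2, where
        # gamma_i = 1 - alpha_i * Sigma_ii measures how well-determined w_i is
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = np.minimum(gamma / np.maximum(mu**2, 1e-12), alpha_max)
    return mu, alpha                         # large alpha_i => w_i pruned to ~0
```

Irrelevant weights are driven toward the `alpha_max` cap and their posterior means shrink to near zero; this evidence-driven pruning is the sparsity mechanism that NSBL extends beyond the Gaussian-likelihood setting.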
Abstract not provided.
Oceans Conference Record (IEEE)
This study presents a numerical model of a WEC array. The model will be used in subsequent work to study the ability of data assimilation to support power prediction from WEC arrays and WEC array design. In this study, we focus on design, modeling, and control of the WEC array. A case study is performed for a small remote Alaskan town. Using an efficient method for modeling the linear interactions within a homogeneous array, we produce a model and predictionless feedback controllers for the devices within the array. The model is applied to study the effects of spectral wave forecast errors on power output. The results of this analysis show that the power performance of the WEC array will be most strongly affected by errors in prediction of the spectral period, but that reductions in performance can realistically be limited to less than 10% based on typical data assimilation based spectral forecasting accuracy levels.
Abstract not provided.
Journal of Sound and Vibration
This paper focuses on the derivation of an analytical model of the aeroelastic dynamics of an elastically mounted flexible wing. The equations of motion obtained serve to help understand the behaviour of the aeroelastic wind tunnel setup in question, which consists of a rectangular wing with a uniform NACA 0012 airfoil profile, whose base is free to rotate rigidly about a longitudinal axis. Of particular interest are the structural geometric nonlinearities primarily introduced by the coupling between the rigid-body pitch degree-of-freedom and the continuous system. A system of partial differential equations (PDEs) coupled with an ordinary differential equation (ODE), describing axial-bending-bending-torsion-pitch motion, is derived using Hamilton's principle. A finite-dimensional approximation of the system of coupled differential equations is obtained using the Galerkin method, leading to a system of coupled nonlinear ODEs. Subsequently, these nonlinear ODEs are solved numerically using Houbolt's method. The results obtained are verified by comparison with the results of direct integration of the equations of motion using a finite difference scheme. Adopting a linear unsteady aerodynamic model, it is observed that the system undergoes coalescence flutter due to coupling between the mode dominated by rigid-body pitch rotation and the mode dominated by first flapwise bending. The behaviour of the limit cycle oscillations is primarily influenced by the structural geometric nonlinear terms in the coupled system of PDEs and ODE.
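Houbolt's method mentioned above is an implicit backward-difference time integrator for second-order structural dynamics. A minimal sketch for a linear system M u'' + C u' + K u = f(t) follows; the paper applies the scheme to the nonlinear Galerkin-reduced ODEs, so the linear setting, the function name, and the Taylor-series start-up steps here are illustrative assumptions.

```python
import numpy as np

def houbolt(M, C, K, f, u0, v0, dt, n_steps):
    """Houbolt backward-difference time integration of M u'' + C u' + K u = f(t)
    for a linear system; f(t) returns the load vector at time t."""
    u = np.zeros((n_steps + 1, len(u0)))
    u[0] = u0
    # Start-up: two explicit Taylor-series steps (Houbolt needs three back values)
    a0 = np.linalg.solve(M, f(0.0) - C @ v0 - K @ u[0])
    u[1] = u[0] + dt * v0 + 0.5 * dt**2 * a0
    v1 = v0 + dt * a0
    a1 = np.linalg.solve(M, f(dt) - C @ v1 - K @ u[1])
    u[2] = u[1] + dt * v1 + 0.5 * dt**2 * a1
    # Effective stiffness from the Houbolt difference formulas:
    #   u''_{n+1} = (2 u_{n+1} - 5 u_n + 4 u_{n-1} - u_{n-2}) / dt^2
    #   u'_{n+1}  = (11 u_{n+1} - 18 u_n + 9 u_{n-1} - 2 u_{n-2}) / (6 dt)
    Keff = 2.0 / dt**2 * M + 11.0 / (6.0 * dt) * C + K
    for n in range(2, n_steps):
        rhs = (f((n + 1) * dt)
               + M @ (5.0 * u[n] - 4.0 * u[n - 1] + u[n - 2]) / dt**2
               + C @ (18.0 * u[n] - 9.0 * u[n - 1] + 2.0 * u[n - 2]) / (6.0 * dt))
        u[n + 1] = np.linalg.solve(Keff, rhs)
    return u
```

Because the scheme is implicit and unconditionally stable for linear problems, a constant load drives the response to the static solution K⁻¹f regardless of step size, at the cost of some numerical damping of the transient.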
Computer Methods in Applied Mechanics and Engineering
To model and quantify the variability in plasticity and failure of additively manufactured metals due to imperfections in their microstructure, we have developed an uncertainty quantification methodology based on pseudo-marginal likelihood and embedded variability techniques. We account both for the porosity resolvable in computed tomography scans of the initial material and for the sub-threshold distribution of voids through a physically motivated model. Calibration of the model indicates that the sub-threshold population of defects dominates the yield and failure response. Finally, the technique also allows us to quantify the distribution of material parameters connected to microstructural variability created by the manufacturing process and, thereby, to make assessments of material quality and process control.
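The pseudo-marginal idea referenced above can be sketched on a toy latent-variable model: an unbiased Monte Carlo estimate of an intractable likelihood is plugged into Metropolis-Hastings, and reusing the current state's estimate (rather than recomputing it) leaves the exact posterior invariant. The model, the flat prior, and all names below are illustrative assumptions, not the calibration from the paper.

```python
import numpy as np

def pm_metropolis(y, tau, n_samples=4000, n_mc=30, step=0.3, seed=0):
    """Pseudo-marginal Metropolis-Hastings for y_i = theta + z_i + e_i,
    with latent z_i ~ N(0, tau^2) and noise e_i ~ N(0, 1). The marginal
    likelihood p(y_i | theta) is replaced by an unbiased MC estimate."""
    rng = np.random.default_rng(seed)

    def log_like_hat(theta):
        # Unbiased estimate: average the conditional density over z draws
        z = tau * rng.normal(size=(n_mc, 1))
        dens = np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2.0 * np.pi)
        return np.sum(np.log(dens.mean(axis=0)))

    theta, ll = 0.0, log_like_hat(0.0)
    chain = np.empty(n_samples)
    for i in range(n_samples):
        prop = theta + step * rng.normal()
        ll_prop = log_like_hat(prop)        # fresh estimate for the proposal
        # The current state's estimate ll is reused, not recomputed --
        # that is what makes the pseudo-marginal sampler exact
        if np.log(rng.uniform()) < ll_prop - ll:   # flat prior (assumption)
            theta, ll = prop, ll_prop
        chain[i] = theta
    return chain
```

The chain's stationary distribution is the same posterior that would result from the exact (here analytically available) marginal likelihood; noisier estimates only make the chain mix more slowly.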
Abstract not provided.
This report documents a statistical method for the "real-time" characterization of partially observed epidemics. Observations consist of daily counts of symptomatic patients diagnosed with the disease. Characterization, in this context, refers to estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as gross information on the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem, and is predicated on a model for the distribution of the incubation period. The model parameters are estimated as distributions using a Markov Chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. The method is applied to the COVID-19 pandemic of 2020, using data at the country, provincial (e.g., state), and regional (e.g., county) levels. The epidemiological model includes a stochastic component due to uncertainties in the incubation period. This model-form uncertainty is accommodated by a pseudo-marginal Metropolis-Hastings MCMC sampler, which produces posterior distributions that reflect this uncertainty. We approximate the discrepancy between the data and the epidemiological model using Gaussian and negative binomial error models; the latter was motivated by the over-dispersed count data. For small daily counts we find the performance of the calibrated models to be similar for the two error models. For large daily counts the negative-binomial approximation is numerically unstable, unlike the Gaussian error model. Application of the model at the country level (for the United States, Germany, Italy, etc.) generally provided accurate forecasts, as the data consisted of large counts which suppressed the day-to-day variations in the observations.
Further, the bulk of the data is sourced over the duration before the relaxation of the curbs on population mixing, and is not confounded by any discernible country-wide second wave of infections. At the state level, where reporting was poor or where there were few infections (e.g., New Mexico), the variance in the data posed some, though not insurmountable, difficulties, and forecasts were able to capture the data with large uncertainty bounds. The method was found to be sufficiently sensitive to discern the flattening of the infection and epidemic curves due to shelter-in-place orders after around the 90% quantile of the incubation distribution (about 10 days for COVID-19). The proposed model was also used at a regional level to compare the forecasts for the central and north-west regions of New Mexico. Modeling the data for these regions illustrated the different disease-spread dynamics captured by the model. While in the central region the daily counts peaked in late April, in the north-west region the ramp-up continued for approximately three more weeks.
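The negative-binomial error model discussed above can be illustrated with a toy calibration: a random-walk Metropolis sampler fits a bell-shaped incidence curve to over-dispersed daily counts. The Gaussian-shaped curve, the flat priors on log-parameters, and the proposal step sizes are illustrative assumptions; the report's actual model (incubation-period convolution, pseudo-marginal sampling) is more involved.

```python
import numpy as np
from math import lgamma, log

def nb_loglik(y, mu, k):
    """Negative-binomial log-likelihood with mean mu_t and dispersion k
    (variance mu + mu^2 / k), summed over the daily counts y_t."""
    ll = 0.0
    for yi, mi in zip(y, mu):
        ll += (lgamma(yi + k) - lgamma(k) - lgamma(yi + 1)
               + k * log(k / (k + mi)) + yi * log(mi / (k + mi)))
    return ll

def calibrate(y, t, n_samples=3000, seed=0):
    """Random-walk Metropolis over log-parameters of a bell-shaped
    incidence curve mu_t = A exp(-(t - t0)^2 / (2 s^2)); assumes mean(t) > 0
    so the log-parameterization of the start guess is valid."""
    rng = np.random.default_rng(seed)
    theta = np.log([max(y.max(), 1.0), t.mean(), np.ptp(t) / 4.0, 10.0])
    step = np.array([0.05, 0.02, 0.05, 0.2])    # per-parameter proposal scales

    def logpost(th):
        A, t0, s, k = np.exp(th)    # flat prior on the log scale (assumption)
        mu = A * np.exp(-0.5 * ((t - t0) / s) ** 2) + 1e-9
        return nb_loglik(y, mu, k)

    lp, chain = logpost(theta), np.empty((n_samples, 4))
    for i in range(n_samples):
        prop = theta + step * rng.normal(size=4)
        lp_prop = logpost(prop)
        if log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = np.exp(theta)
    return chain
```

The mean/dispersion parameterization used here makes the over-dispersion explicit: as k grows the model approaches a Poisson likelihood, while small k inflates the variance relative to the mean, which is what the count data motivate.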
The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient- and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.1.0 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
Transitional Markov Chain Monte Carlo (TMCMC) is a variant of a class of Markov Chain Monte Carlo algorithms known as tempering-based methods. In this report, the implementation of TMCMC in the Uncertainty Quantification Toolkit is investigated through the sampling of high-dimensional distributions, multi-modal distributions, and nonlinear manifolds. Furthermore, the Bayesian model evidence estimates obtained from TMCMC are tested on problems with known analytical solutions and shown to provide consistent results.
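A generic TMCMC loop of the kind described above, tempering from prior to posterior while accumulating the log model evidence, can be sketched as follows. This is not the UQTk implementation; the COV-based beta schedule, the 0.2-scaled proposal covariance, and the number of Metropolis sweeps are illustrative assumptions.

```python
import numpy as np

def tmcmc(log_like, log_prior, sample_prior, n=1500, seed=0):
    """Transitional MCMC: move prior samples through tempered targets
    p_j(x) proportional to prior(x) * like(x)^beta_j, 0 = beta_0 < ... < 1,
    accumulating a log model-evidence estimate along the way."""
    rng = np.random.default_rng(seed)
    x = sample_prior(rng, n)                        # (n, d) prior samples
    ll = np.apply_along_axis(log_like, 1, x)
    beta, log_ev = 0.0, 0.0
    while beta < 1.0:
        # Bisect for the next beta so the incremental weights have COV ~ 1
        lo, hi = beta, 1.0
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            w = np.exp((mid - beta) * (ll - ll.max()))
            lo, hi = (lo, mid) if w.std() / w.mean() > 1.0 else (mid, hi)
        new_beta = 1.0 if hi > 1.0 - 1e-6 else max(lo, beta + 1e-4)
        w = np.exp((new_beta - beta) * (ll - ll.max()))
        log_ev += np.log(w.mean()) + (new_beta - beta) * ll.max()
        # Resample by weight, then rejuvenate with a few Metropolis sweeps
        idx = rng.choice(n, size=n, p=w / w.sum())
        x, ll = x[idx].copy(), ll[idx].copy()
        cov = 0.04 * np.cov(x.T) + 1e-10 * np.eye(x.shape[1])  # 0.2^2 scaling
        for _ in range(3):
            for i in range(n):
                prop = rng.multivariate_normal(x[i], cov)
                dlp = log_prior(prop) - log_prior(x[i])
                if not np.isfinite(dlp):
                    continue                        # outside the prior support
                ll_prop = log_like(prop)
                if np.log(rng.uniform()) < new_beta * (ll_prop - ll[i]) + dlp:
                    x[i], ll[i] = prop, ll_prop
        beta = new_beta
    return x, log_ev
```

Because the per-stage mean weights telescope, their log-sum is a consistent estimate of the log evidence, which is the quantity whose TMCMC estimates were tested against analytical solutions in this report.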
Abstract not provided.