Publications

Results 26–50 of 112

Nonlinear sparse Bayesian learning for physics-based models

Journal of Computational Physics

Sandhu, Rimple; Khalil, Mohammad K.; Pettit, Chris; Poirel, Dominique; Sarkar, Abhijit

This paper addresses the issue of overfitting while calibrating unknown parameters of over-parameterized physics-based models with noisy and incomplete observations. A semi-analytical Bayesian framework of nonlinear sparse Bayesian learning (NSBL) is proposed to identify sparsity among model parameters during Bayesian inversion. NSBL offers significant advantages over the machine learning algorithm of sparse Bayesian learning (SBL) for physics-based models: 1) the likelihood function or the posterior parameter distribution need not be Gaussian, and 2) prior parameter knowledge is incorporated into sparse learning (i.e., not all parameters are treated as questionable). NSBL employs the concept of automatic relevance determination (ARD) to facilitate sparsity among questionable parameters through parameterized prior distributions. The analytical tractability of NSBL is enabled by employing Gaussian ARD priors and by building a Gaussian mixture-model approximation of the posterior parameter distribution that excludes the contribution of the ARD priors. Subsequently, type-II maximum likelihood is executed using Newton's method, whereby the evidence and its gradient and Hessian information are computed in a semi-analytical fashion. We show numerically and analytically that SBL is a special case of NSBL for linear regression models. A linear regression example involving multimodality in both the parameter posterior pdf and the model evidence is then considered to demonstrate the performance of NSBL in cases where SBL is inapplicable. Next, NSBL is applied to identify sparsity among the damping coefficients of a mass-spring-damper model of a shear building frame. These numerical studies demonstrate the robustness and efficiency of NSBL in alleviating overfitting during Bayesian inversion of nonlinear physics-based models.
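For the linear-regression special case mentioned in the abstract, classical SBL with Gaussian ARD priors can be sketched in a few lines. The example below is illustrative only (it is SBL, not the NSBL algorithm): a hypothetical two-column design matrix with one irrelevant basis function, pruned by MacKay-style type-II maximum-likelihood updates of the ARD precisions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression y = Phi @ w + noise, where the second basis column is
# irrelevant (true weight 0). Classical SBL places Gaussian ARD priors
# w_i ~ N(0, 1/alpha_i) and maximizes the evidence over the precisions alpha_i.
n, sigma2 = 200, 0.01
Phi = rng.normal(size=(n, 2))
y = Phi @ np.array([1.0, 0.0]) + np.sqrt(sigma2) * rng.normal(size=n)

alpha = np.ones(2)                      # ARD precisions (hyperparameters)
for _ in range(30):
    # Gaussian posterior over weights given the current ARD precisions
    Sigma = np.linalg.inv(np.diag(alpha) + Phi.T @ Phi / sigma2)
    mu = Sigma @ Phi.T @ y / sigma2
    gamma = 1.0 - alpha * np.diag(Sigma)        # well-determined parameter count
    alpha = np.clip(gamma / mu**2, 1e-6, 1e8)   # evidence-maximizing update

# The irrelevant weight's precision grows large and its estimate collapses to ~0.
print(mu, alpha)
```

The fixed-point update `alpha = gamma / mu**2` is the standard evidence-maximization step; the clip is only a numerical guard for fully pruned parameters.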

More Details

Modeling and predicting power from a WEC array

Oceans Conference Record (IEEE)

Coe, Ryan G.; Bacelli, Giorgio B.; Gaebele, Daniel; Cotten, Alfred; Mcnatt, Cameron; Wilson, David G.; Weaver, Wayne; Kasper, Jeremy L.; Khalil, Mohammad K.; Dallman, Ann R.

This study presents a numerical model of a WEC array. The model will be used in subsequent work to study the ability of data assimilation to support power prediction from WEC arrays and WEC array design. Here, we focus on the design, modeling, and control of the WEC array. A case study is performed for a small remote Alaskan town. Using an efficient method for modeling the linear interactions within a homogeneous array, we produce a model and predictionless feedback controllers for the devices within the array. The model is applied to study the effects of spectral wave forecast errors on power output. The results of this analysis show that the power performance of the WEC array is most strongly affected by errors in the prediction of the spectral period, but that reductions in performance can realistically be limited to less than 10% given typical accuracy levels of data-assimilation-based spectral forecasting.
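The WEC array model itself is not reproduced here, but the sensitivity to spectral-period errors has a simple illustration: the standard deep-water wave energy flux J = rho * g^2 * Hs^2 * Te / (64 * pi) is linear in the energy period Te, so a fractional period error maps one-to-one into a fractional flux error. The sea-state values below are hypothetical.

```python
import math

# Deep-water wave energy flux per unit crest length (standard formula):
#   J = (rho * g**2 / (64 * pi)) * Hs**2 * Te   [W/m]
# Illustrative only -- the paper's WEC array power model is not reproduced here.
def wave_energy_flux(hs_m, te_s, rho=1025.0, g=9.81):
    """Energy flux (W per metre of crest) for significant wave height Hs and energy period Te."""
    return rho * g**2 / (64.0 * math.pi) * hs_m**2 * te_s

hs, te = 2.0, 8.0                            # hypothetical sea state
j_true = wave_energy_flux(hs, te)
j_biased = wave_energy_flux(hs, 1.10 * te)   # 10% spectral-period forecast error

# Flux is linear in Te, so a 10% period error gives a 10% flux error.
rel_err = (j_biased - j_true) / j_true
print(f"{rel_err:.2%}")
```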

More Details

Aeroelastic oscillations of a pitching flexible wing with structural geometric nonlinearities: Theory and numerical simulation

Journal of Sound and Vibration

Robinson, Brandon; Da Costa, Leandro; Poirel, Dominique; Pettit, Chris; Khalil, Mohammad K.; Sarkar, Abhijit

This paper focuses on the derivation of an analytical model of the aeroelastic dynamics of an elastically mounted flexible wing. The equations of motion serve to help understand the behaviour of the aeroelastic wind tunnel setup in question, which consists of a rectangular wing with a uniform NACA 0012 airfoil profile, whose base is free to rotate rigidly about a longitudinal axis. Of particular interest are the structural geometric nonlinearities primarily introduced by the coupling between the rigid-body pitch degree-of-freedom and the continuous system. A system of partial differential equations (PDEs) coupled with an ordinary differential equation (ODE), describing axial-bending-bending-torsion-pitch motion, is derived using Hamilton's principle. A finite-dimensional approximation of the coupled differential equations is obtained using the Galerkin method, leading to a system of coupled nonlinear ODEs. These nonlinear ODEs are then solved numerically using Houbolt's method, and the results are verified against direct integration of the equations of motion using a finite difference scheme. Adopting a linear unsteady aerodynamic model, it is observed that the system undergoes coalescence flutter due to coupling between the mode dominated by rigid-body pitch rotation and the mode dominated by first flapwise bending. The behaviour of the limit cycle oscillations is primarily influenced by the structural geometric nonlinear terms in the coupled system of PDEs and ODE.
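Houbolt's method is a standard implicit backward-difference time integrator. As a minimal sketch (a single linear oscillator, not the Galerkin-reduced wing equations), the scheme enforces the equation of motion at t_{n+1} with x'' ~ (2x_{n+1} - 5x_n + 4x_{n-1} - x_{n-2})/h^2 and x' ~ (11x_{n+1} - 18x_n + 9x_{n-1} - 2x_{n-2})/(6h):

```python
import math

# Houbolt's implicit multistep scheme for m*x'' + c*x' + k*x = 0, applied to a
# single damped linear oscillator (illustration only; the paper applies the
# method to the Galerkin-reduced nonlinear wing equations).
m, c, k = 1.0, 0.2, (2.0 * math.pi) ** 2
x0, v0 = 1.0, 0.0
h, nsteps = 1.0e-3, 1000

wn = math.sqrt(k / m)
zeta = c / (2.0 * m * wn)
wd = wn * math.sqrt(1.0 - zeta**2)

def exact(t):
    """Closed-form free-vibration response of the damped oscillator."""
    env = math.exp(-zeta * wn * t)
    return env * (x0 * math.cos(wd * t) + (v0 + zeta * wn * x0) / wd * math.sin(wd * t))

# Houbolt needs three back values; seed them from the exact solution.
x = [exact(0.0), exact(h), exact(2.0 * h)]
lhs = 2.0 * m / h**2 + 11.0 * c / (6.0 * h) + k
for n in range(2, nsteps):
    rhs = (m * (5.0 * x[n] - 4.0 * x[n - 1] + x[n - 2]) / h**2
           + c * (18.0 * x[n] - 9.0 * x[n - 1] + 2.0 * x[n - 2]) / (6.0 * h))
    x.append(rhs / lhs)

print(x[-1], exact((len(x) - 1) * h))
```

The scheme is unconditionally stable with mild numerical damping, which is one reason it suits stiff structural dynamics problems.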

More Details

Modeling strength and failure variability due to porosity in additively manufactured metals

Computer Methods in Applied Mechanics and Engineering

Khalil, Mohammad K.; Teichert, Gregory H.; Alleman, Coleman A.; Heckman, Nathan H.; Jones, Reese E.; Garikipati, K.; Boyce, Brad B.

To model and quantify the variability in plasticity and failure of additively manufactured metals due to imperfections in their microstructure, we have developed an uncertainty quantification methodology based on pseudo-marginal likelihood and embedded variability techniques. We account for both the porosity resolvable in computed tomography scans of the initial material and the sub-threshold distribution of voids through a physically motivated model. Calibration of the model indicates that the sub-threshold population of defects dominates the yield and failure response. The technique also allows us to quantify the distribution of material parameters connected to microstructural variability created by the manufacturing process and, thereby, to make assessments of material quality and process control.
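Pseudo-marginal MCMC replaces an intractable marginal likelihood with an unbiased Monte Carlo estimate inside a Metropolis-Hastings accept/reject step, and still targets the exact posterior. A minimal sketch on a hypothetical latent-variable model (not the paper's porosity model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical latent-variable model for illustration: y_i = theta + z_i + eps_i
# with z_i, eps_i ~ N(0, 1), so the exact marginal likelihood is y_i ~ N(theta, 2).
# Pseudo-marginal MH replaces the (here pretended-intractable) marginal with an
# unbiased Monte Carlo estimate over the latent z.
theta_true = 1.0
y = theta_true + rng.normal(size=20) + rng.normal(size=20)

def loglike_hat(theta, n_z=64):
    """Unbiased MC estimate of prod_i p(y_i | theta), returned on the log scale."""
    z = rng.normal(size=(n_z, y.size))          # independent latents per data point
    dens = np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2 * np.pi)
    return np.sum(np.log(dens.mean(axis=0)))

def logprior(theta):
    return -0.5 * theta**2 / 100.0              # broad N(0, 10^2) prior

# Key pseudo-marginal rule: reuse the estimate at the current state rather than
# recomputing it, so the chain targets the exact posterior despite the noise.
theta, lp = 0.0, loglike_hat(0.0) + logprior(0.0)
chain = []
for _ in range(4000):
    prop = theta + 0.3 * rng.normal()
    lp_prop = loglike_hat(prop) + logprior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

print(np.mean(chain[1000:]))
```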

More Details

Characterization of Partially Observed Epidemics - Application to COVID-19

Safta, Cosmin S.; Ray, Jaideep R.; Laros, James H.; Catanach, Thomas A.; Chowdhary, Kamaljit S.; Debusschere, Bert D.; Galvan, Edgar; Geraci, Gianluca G.; Khalil, Mohammad K.; Portone, Teresa P.

This report documents a statistical method for the "real-time" characterization of partially observed epidemics. Observations consist of daily counts of symptomatic patients diagnosed with the disease. Characterization, in this context, refers to the estimation of epidemiological parameters that can be used to provide short-term forecasts of the ongoing epidemic, as well as gross information on the time-dependent infection rate. The characterization problem is formulated as a Bayesian inverse problem and is predicated on a model for the distribution of the incubation period. The model parameters are estimated as distributions using a Markov chain Monte Carlo (MCMC) method, thus quantifying the uncertainty in the estimates. The method is applied to the COVID-19 pandemic of 2020, using data at the country, provincial (e.g., state), and regional (e.g., county) levels. The epidemiological model includes a stochastic component due to uncertainties in the incubation period. This model-form uncertainty is accommodated by a pseudo-marginal Metropolis-Hastings MCMC sampler, which produces posterior distributions that reflect this uncertainty. We approximate the discrepancy between the data and the epidemiological model using Gaussian and negative binomial error models; the latter was motivated by the over-dispersed count data. For small daily counts we find the performance of the calibrated models to be similar for the two error models; for large daily counts the negative binomial approximation is numerically unstable, unlike the Gaussian error model. Application of the model at the country level (for the United States, Germany, Italy, etc.) generally provided accurate forecasts, as the data consisted of large counts which suppressed the day-to-day variations in the observations. Further, the bulk of the data was sourced over the duration before the relaxation of the curbs on population mixing, and is not confounded by any discernible country-wide second wave of infections. At the state level, where reporting was poor or where few infections were evinced (e.g., New Mexico), the variance in the data posed some, though not insurmountable, difficulties, and forecasts captured the data with large uncertainty bounds. The method was found to be sufficiently sensitive to discern the flattening of the infection and epidemic curves due to shelter-in-place orders after roughly the 90% quantile of the incubation distribution (about 10 days for COVID-19). The proposed model was also used at a regional level to compare forecasts for the central and north-west regions of New Mexico. Modeling the data for these regions illustrated the different disease-spread dynamics captured by the model: while in the central region the daily counts peaked in late April, in the north-west region the ramp-up continued for approximately three more weeks.
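The negative binomial error model's appeal for over-dispersed counts can be seen directly from its variance, mu + mu^2/k, versus mu for a Poisson-like model. A small sketch with hypothetical numbers (not the report's calibrated likelihood):

```python
import math

# Negative-binomial error model for over-dispersed daily counts, parameterized
# by mean mu and dispersion k, so that variance = mu + mu**2 / k.
# Sketch only -- the report's full epidemiological likelihood is not reproduced.
def nb_logpmf(x, mu, k):
    """log P(X = x) for a negative binomial with mean mu and dispersion k."""
    return (math.lgamma(x + k) - math.lgamma(k) - math.lgamma(x + 1)
            + k * math.log(k / (k + mu)) + x * math.log(mu / (k + mu)))

# Over-dispersed counts scatter far from the mean; the NB likelihood penalizes
# them far less than a Gaussian model with Poisson-like variance mu.
mu, k = 100.0, 5.0            # NB variance = 100 + 100**2 / 5 = 2100
counts = [40, 85, 160, 210]   # hypothetical over-dispersed daily counts

nb_ll = sum(nb_logpmf(c, mu, k) for c in counts)
gauss_ll = sum(-0.5 * math.log(2 * math.pi * mu) - (c - mu) ** 2 / (2 * mu)
               for c in counts)    # Gaussian with variance mu, for contrast
print(nb_ll, gauss_ll)
```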

More Details

Dakota, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis: Version 6.12 User's Manual

Adams, Brian M.; Bohnhoff, William J.; Dalbey, Keith D.; Ebeida, Mohamed S.; Eddy, John P.; Eldred, Michael S.; Hooper, Russell H.; Hough, Patricia D.; Hu, Kenneth H.; Jakeman, John D.; Khalil, Mohammad K.; Maupin, Kathryn A.; Monschke, Jason A.; Ridgway, Elliott M.; Rushdi, Ahmad R.; Seidl, Daniel T.; Stephens, John A.; Swiler, Laura P.; Winokur, Justin W.

The Dakota toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

More Details

Transitional Markov Chain Monte Carlo Sampler in UQTk

Safta, Cosmin S.; Khalil, Mohammad K.; Najm, H.N.

Transitional Markov Chain Monte Carlo (TMCMC) is a variant of a class of Markov Chain Monte Carlo algorithms known as tempering-based methods. In this report, the implementation of TMCMC in the Uncertainty Quantification Toolkit is investigated through the sampling of high-dimensional distributions, multi-modal distributions, and nonlinear manifolds. Furthermore, the Bayesian model evidence estimates obtained from TMCMC are tested on problems with known analytical solutions and shown to provide consistent results.
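As a sketch of the tempering idea (not the UQTk implementation), a minimal TMCMC with ESS-adapted temperature steps can be checked against a 1-D conjugate problem whose evidence is known analytically:

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal TMCMC sketch on a conjugate 1-D problem with known model evidence:
# prior N(0, 1), likelihood N(d | theta, sigma^2)  =>  Z = N(d | 0, 1 + sigma^2).
d, sigma, n = 1.0, 0.5, 5000

def loglike(theta):
    return -0.5 * (d - theta) ** 2 / sigma**2 - 0.5 * np.log(2 * np.pi * sigma**2)

def next_beta(ll, beta):
    """Largest tempering step that keeps the incremental-weight ESS above n/2."""
    def ess(b):
        w = np.exp((b - beta) * (ll - ll.max()))
        return w.sum() ** 2 / (w**2).sum()
    if ess(1.0) >= n / 2:
        return 1.0
    lo, hi = beta, 1.0
    for _ in range(50):                         # bisect on the next temperature
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if ess(mid) >= n / 2 else (lo, mid)
    return lo

theta = rng.normal(size=n)                      # start from the prior (beta = 0)
log_z, beta = 0.0, 0.0
while beta < 1.0:
    ll = loglike(theta)
    b_new = next_beta(ll, beta)
    logw = (b_new - beta) * ll                  # incremental importance weights
    log_z += logw.max() + np.log(np.mean(np.exp(logw - logw.max())))
    w = np.exp(logw - logw.max())
    theta = theta[rng.choice(n, size=n, p=w / w.sum())]   # resample particles
    for _ in range(3):   # Metropolis moves targeting prior * likelihood^b_new
        prop = theta + 0.5 * rng.normal(size=n)
        dlp = (-0.5 * prop**2 + b_new * loglike(prop)
               - (-0.5 * theta**2 + b_new * loglike(theta)))
        theta = np.where(np.log(rng.uniform(size=n)) < dlp, prop, theta)
    beta = b_new

z_exact = np.exp(-0.5 * d**2 / (1 + sigma**2)) / np.sqrt(2 * np.pi * (1 + sigma**2))
print(np.exp(log_z), z_exact)
```

The evidence estimate is the product of the stage-wise mean incremental weights, accumulated here in `log_z`.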

More Details

UQTk User Manual (V.3.1.0)

Sargsyan, Khachik S.; Safta, Cosmin S.; Johnston, Katherine J.; Khalil, Mohammad K.; Chowdhary, Kamaljit S.; Rai, Prashant R.; Casey, Tiernan A.; Zeng, Xiaoshu; Debusschere, Bert D.

The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.1.0 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
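As a small illustration of non-intrusive propagation (a generic 1-D Legendre chaos sketch, not UQTk's API), projecting a model onto Legendre polynomials by Gauss quadrature yields the output mean and variance directly from the chaos coefficients:

```python
import numpy as np
from numpy.polynomial import legendre

# Non-intrusive 1-D Legendre polynomial chaos: project y = f(x), x ~ U(-1, 1),
# onto Legendre polynomials via Gauss-Legendre quadrature, then read the mean
# and variance off the coefficients. f = exp is a stand-in model.
f = np.exp
order = 8
nodes, weights = legendre.leggauss(order + 1)

# c_k = (2k+1)/2 * integral_{-1}^{1} f(x) P_k(x) dx  =>  f(x) ~ sum_k c_k P_k(x)
coeffs = np.array([
    (2 * k + 1) / 2.0 * np.sum(weights * f(nodes) * legendre.Legendre.basis(k)(nodes))
    for k in range(order + 1)
])

# Under the uniform measure: E[P_0] = 1, E[P_k>0] = 0, E[P_k^2] = 1/(2k+1).
mean = coeffs[0]
var = np.sum(coeffs[1:] ** 2 / (2 * np.arange(1, order + 1) + 1))

print(mean, var)   # analytic: sinh(1), sinh(2)/2 - sinh(1)**2
```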

More Details

Wave data assimilation in support of wave energy converter power prediction: Yakutat, Alaska case study

Proceedings of the Annual Offshore Technology Conference

Dallman, Ann R.; Khalil, Mohammad K.; Raghukumar, Kaus; Jones, Craig; Kasper, Jeremy; Flanary, Christopher; Chang, Grace; Roberts, Jesse D.

Integration of renewable power sources into grids remains an active research and development area, particularly for less developed renewable energy technologies such as wave energy converters (WECs). WECs are projected to have strong early market penetration for remote communities, which serve as natural microgrids. Hence, accurate wave predictions to manage the interactions of a WEC array with microgrids are especially important. Recently developed, low-cost wave measurement buoys allow for operational assimilation of wave data at remote locations where real-time data have previously been unavailable. This work includes the development and assessment of a wave modeling framework with real-time data assimilation capabilities for WEC power prediction. The availability of real-time wave spectral components from low-cost wave measurement buoys allows for operational data assimilation with the Ensemble Kalman filter technique, whereby measured wave conditions within the numerical wave forecast model domain are assimilated onto the combined set of internal and boundary grid points while taking into account model and observation error covariances. The updated model state and boundary conditions allow for more accurate wave characteristic predictions at the locations of interest. Initial deployment data indicated that measured wave data from one buoy that were assimilated into the wave modeling framework resulted in improved forecast skill for a case where a traditional numerical forecast model (e.g., Simulating WAves Nearshore; SWAN) did not well represent the measured conditions. On average, the wave power forecast error was reduced from 73% to 43% using data assimilation modeling with real-time wave observations.
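The stochastic ensemble Kalman filter analysis step at the core of such assimilation is compact. The sketch below is generic (a two-component toy state, not the SWAN wave-model state): the forecast ensemble is pulled toward perturbed observations through the sample-covariance Kalman gain.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stochastic ensemble Kalman filter analysis step (generic sketch; the paper
# assimilates buoy wave spectra into a SWAN model state, not reproduced here).
def enkf_update(ensemble, H, y_obs, obs_var):
    """ensemble: (n_ens, n_state); H: (n_obs, n_state); y_obs: (n_obs,)."""
    n_ens = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    P = anomalies.T @ anomalies / (n_ens - 1)          # sample forecast covariance
    R = obs_var * np.eye(len(y_obs))                   # observation error covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    # Perturbed observations keep the analysis-ensemble spread consistent.
    y_pert = y_obs + np.sqrt(obs_var) * rng.normal(size=(n_ens, len(y_obs)))
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

# Two-component toy state, first component observed.
prior = rng.normal(loc=[0.0, 0.0], scale=2.0, size=(500, 2))
H = np.array([[1.0, 0.0]])
post = enkf_update(prior, H, y_obs=np.array([1.5]), obs_var=0.25)

print(prior[:, 0].mean(), post[:, 0].mean())
```

With prior variance 4 and observation variance 0.25, the analysis mean of the observed component lands near the precision-weighted value 1.5 * 4 / 4.25.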

More Details

Uncertainty Quantification of Microstructural Material Variability Effects

Jones, Reese E.; Boyce, Brad B.; Frankel, Ari L.; Heckman, Nathan H.; Khalil, Mohammad K.; Ostien, Jakob O.; Rizzi, Francesco N.; Tachida, Kousuke K.; Teichert, Gregory H.; Templeton, Jeremy A.

This project has developed models of performance variability to enable robust design and certification. Material variability originating from microstructure has significant effects on component behavior and creates uncertainty in material response. The outcomes of this project are uncertainty quantification (UQ)-enabled analysis of material variability effects on performance, and methods to evaluate the consequences of microstructural variability on material response in general. Material variability originating from heterogeneous microstructural features, such as grain and pore morphologies, has significant effects on component behavior and creates uncertainty around performance. Current engineering material models typically do not incorporate microstructural variability explicitly; rather, functional forms are chosen based on intuition and parameters are selected to reflect mean behavior. Conversely, mesoscale models that capture the microstructural physics, and its inherent variability, are impractical to utilize at the engineering scale. Current efforts therefore ignore physical characteristics of systems that may be the predominant factors for quantifying system reliability. To address this gap, we have developed explicit connections between models of microstructural variability and component/system performance. Our focus on variability of mechanical response due to grain and pore distributions enabled us to fully probe these influences on performance and to develop a methodology to propagate input variability to output performance. This project is at the forefront of data science and material modeling. We adapted and innovated from progressive techniques in machine learning and uncertainty quantification to develop a new, physically based methodology to address the core issues of the Engineering Materials Reliability (EMR) research challenge in modeling the constitutive response of materials with significant inherent variability and length-scales.

More Details