Publications

Results 101–125 of 228

A statistical approach for isolating fossil fuel emissions in atmospheric inverse problems

Journal of Geophysical Research

Yadav, Vineet; Michalak, Anna M.; Ray, Jaideep; Shiga, Yoichi P.

Independent verification and quantification of fossil fuel (FF) emissions constitutes a considerable scientific challenge. By coupling atmospheric observations of CO2 with models of atmospheric transport, inverse models offer the possibility of overcoming this challenge. However, disaggregating the biospheric and FF flux components of terrestrial fluxes from CO2 concentration measurements has proven difficult, due to observational and modeling limitations. In this study, we propose a statistical inverse modeling scheme for disaggregating wintertime fluxes on the basis of their unique error covariances and covariates, where these covariances and covariates are representative of the underlying processes affecting FF and biospheric fluxes. The application of the method is demonstrated with one synthetic and two real-data prototypical inversions using in situ CO2 measurements over North America. Inversions are performed only for the month of January, as the predominance of the biospheric CO2 signal relative to the FF CO2 signal and observational limitations preclude disaggregation of the fluxes in other months. The quality of disaggregation is assessed primarily through examination of the a posteriori covariance between disaggregated FF and biospheric fluxes at regional scales. Findings indicate that the proposed method is able to robustly disaggregate fluxes regionally at monthly temporal resolution, with a posteriori cross covariance lower than 0.15 µmol m⁻² s⁻¹ between FF and biospheric fluxes. Error covariance models and covariates based on temporally varying FF inventory data provide a more robust disaggregation than static proxies (e.g., nightlight intensity and population density). However, the synthetic-data case study shows that disaggregation is possible even in the absence of detailed temporally varying FF inventory data.
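The central idea of the abstract, letting two flux components share one transport operator while carrying distinct prior error covariances, can be sketched in a toy linear Gaussian inversion. Everything below (dimensions, operators, covariance values) is invented for illustration and is not the paper's setup:

```python
import numpy as np

# Illustrative sketch, not the paper's code: disaggregate two flux components
# in a linear Gaussian inverse problem by giving each component its own prior
# error covariance. Observations constrain only the SUM of the components.
rng = np.random.default_rng(0)

n = 10                                 # flux grid cells
m = 30                                 # observations
H = rng.normal(size=(m, n))            # toy transport operator (shared)
G = np.hstack([H, H])                  # maps stacked state [s_ff; s_bio] -> y

# Distinct prior covariances encode the distinct underlying processes:
Q_ff = 0.1 * np.eye(n)                          # FF fluxes: small variance
Q_bio = np.diag(1.0 + 0.5 * np.arange(n) / n)   # biosphere: larger variance
Q = np.block([[Q_ff, np.zeros((n, n))],
              [np.zeros((n, n)), Q_bio]])
R = 0.05 * np.eye(m)                   # model-data mismatch covariance

s_true = np.concatenate([rng.normal(0, 0.3, n), rng.normal(0, 1.0, n)])
y = G @ s_true + rng.normal(0, 0.05 ** 0.5, m)

# Standard Gaussian posterior for the stacked state:
K = Q @ G.T @ np.linalg.inv(G @ Q @ G.T + R)    # gain
s_post = K @ y                                  # posterior mean
P_post = Q - K @ G @ Q                          # posterior covariance

# A posteriori cross-covariance between the FF and biospheric estimates,
# the diagnostic the abstract uses to judge disaggregation quality:
cross = P_post[:n, n:]
print("max |cross-covariance|:", np.abs(cross).max())
```

The off-diagonal block of the posterior covariance plays the role of the paper's regional a posteriori cross covariance: the smaller it is, the more cleanly the two components have been separated.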

More Details

Online mapping and forecasting of epidemics using open-source indicators

Ray, Jaideep; Lefantzi, Sophia; Bauer, Joshua B.; Khalil, Mohammad; Rothfuss, Andrew J.; Cauthen, Katherine R.; Finley, Patrick D.; Smith, Halley

Open-source indicators have been proposed as a way of tracking and forecasting disease outbreaks. Some, such as meteorological data, are readily available as reanalysis products. Others, such as those derived from our online behavior (web searches, media articles, etc.), are gathered easily and are more timely than public health reporting. In this study we investigate how these datastreams may be combined to provide useful epidemiological information. The investigation is performed by building data assimilation systems to track influenza in California and dengue in India. The first does not suffer from incomplete data and was chosen to explore disease modeling needs. The second explores the case where observational data are sparse and disease modeling complexities are beside the point. The two test cases thus sit at opposite ends of the disease-tracking spectrum. We find that data assimilation systems that produce disease activity maps can be constructed, and that combining multiple open-source datastreams is a necessity, as no one stream individually is very informative. The data assimilation systems have very little in common except that they contain disease models, calibration algorithms, and some ability to impute missing data. Thus, while the data assimilation systems share the goal of accurate forecasting, each is designed to compensate for the shortcomings of its particular datastreams, and we expect them to be disease- and location-specific.
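The claim that several individually uninformative datastreams become useful in combination can be illustrated with a minimal scalar Kalman filter. This is a hypothetical model, not either of the paper's systems; the latent state, noise levels, and "streams" are all assumptions:

```python
import numpy as np

# Minimal sketch: a scalar Kalman filter tracks latent disease activity x_t,
# assimilating one or two noisy open-source indicators per step. Each stream
# alone is noisy; assimilating both tightens the estimate.
rng = np.random.default_rng(1)

T = 52                                 # weeks
x_true = np.zeros(T)
for t in range(1, T):                  # latent activity: slow random walk
    x_true[t] = 0.95 * x_true[t - 1] + rng.normal(0, 0.3)

# Two noisy indicators of the same latent state (think: web searches
# and a meteorological proxy), each with large observation variance:
r1, r2 = 2.0, 3.0
y1 = x_true + rng.normal(0, np.sqrt(r1), T)
y2 = x_true + rng.normal(0, np.sqrt(r2), T)

def filter_streams(obs_list, obs_vars, q=0.3 ** 2, a=0.95):
    """Kalman filter that assimilates every stream in obs_list each step."""
    x, p = 0.0, 1.0
    est = []
    for t in range(T):
        x, p = a * x, a * a * p + q            # predict
        for y, r in zip(obs_list, obs_vars):   # sequential scalar updates
            k = p / (p + r)
            x, p = x + k * (y[t] - x), (1 - k) * p
        est.append(x)
    return np.array(est)

both = filter_streams([y1, y2], [r1, r2])
one = filter_streams([y1], [r1])
print("RMSE, one stream :", np.sqrt(np.mean((one - x_true) ** 2)))
print("RMSE, two streams:", np.sqrt(np.mean((both - x_true) ** 2)))
```

The sequential-update form makes the design point concrete: adding a stream never increases the filter's posterior variance, which is the mechanism behind "combining multiple datastreams is a necessity."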

More Details

Imputing data that are missing at high rates using a boosting algorithm

JSM Proceedings

Cauthen, Katherine R.; Lambert, Gregory; Ray, Jaideep; Lefantzi, Sophia

Traditional multiple imputation approaches may perform poorly for datasets with high rates of missingness unless a large number of imputations, m, is used. This paper implements an alternative machine-learning-based approach to imputing data that are missing at high rates. Here, we use boosting to create a strong learner from a weak learner fitted to a dataset missing many observations. This approach may be applied to a variety of types of learners (models). The approach is demonstrated by application to a spatiotemporal dataset for predicting dengue outbreaks in India from meteorological covariates. A Bayesian spatiotemporal CAR model is boosted to produce imputations, and the overall RMSE from a k-fold cross-validation is used to assess imputation accuracy.
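The boosting-for-imputation idea, fit a weak learner to the observed rows only, boost it into a strong learner, then predict the missing rows, can be sketched with a one-split regression stump as the weak learner. The paper boosts a Bayesian spatiotemporal CAR model; the stump, the synthetic covariate, and the 60% missingness rate below are stand-in assumptions:

```python
import numpy as np

# Illustrative sketch, not the paper's CAR model: boost a weak learner
# (a regression stump) on the observed rows, then impute the missing rows.
rng = np.random.default_rng(2)

x = rng.uniform(0, 10, 400)            # covariate (e.g. a meteorological field)
y = np.sin(x) + rng.normal(0, 0.2, 400)
miss = rng.random(400) < 0.6           # 60% of responses missing
x_obs, y_obs = x[~miss], y[~miss]

def fit_stump(x, r):
    """Best single-split stump minimizing squared error on residuals r."""
    best = (np.inf, None)
    for s in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left = x <= s
        ml, mr = r[left].mean(), r[~left].mean()
        sse = ((r[left] - ml) ** 2).sum() + ((r[~left] - mr) ** 2).sum()
        if sse < best[0]:
            best = (sse, (s, ml, mr))
    return best[1]

def boost(x, y, rounds=200, lr=0.1):
    """Gradient boosting on squared error: fit each stump to the residuals."""
    pred, stumps = np.zeros_like(y), []
    for _ in range(rounds):
        s, ml, mr = fit_stump(x, y - pred)
        stumps.append((s, ml, mr))
        pred += lr * np.where(x <= s, ml, mr)
    return stumps

def predict(stumps, x, lr=0.1):
    out = np.zeros_like(x)
    for s, ml, mr in stumps:
        out += lr * np.where(x <= s, ml, mr)
    return out

model = boost(x_obs, y_obs)
imputed = predict(model, x[miss])      # imputations for the missing rows
rmse = np.sqrt(np.mean((imputed - np.sin(x[miss])) ** 2))
print("imputation RMSE vs. noiseless truth:", rmse)
```

Held-out RMSE, as here, mirrors the paper's k-fold cross-validation check of imputation accuracy.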

More Details

Bayesian parameter estimation of a k-ϵ model for accurate jet-in-crossflow simulations

Journal of Aircraft

Ray, Jaideep; Lefantzi, Sophia; Arunajatesan, Srinivasan; Dechant, Lawrence

Reynolds-Averaged Navier-Stokes models are not very accurate for high-Reynolds-number compressible jet-in-crossflow interactions. The inaccuracy arises from the use of inappropriate model parameters and model-form errors in the Reynolds-Averaged Navier-Stokes model. In this work, the hypothesis is pursued that Reynolds-Averaged Navier-Stokes predictions can be significantly improved by using parameters inferred from experimental measurements of a supersonic jet interacting with a transonic crossflow. A Bayesian inverse problem is formulated to estimate three Reynolds-Averaged Navier-Stokes parameters (Cμ, Cϵ2, Cϵ1), and a Markov chain Monte Carlo method is used to develop a probability density function for them. The cost of the Markov chain Monte Carlo is addressed by developing statistical surrogates for the Reynolds-Averaged Navier-Stokes model. It is found that only a subset R of the (Cμ, Cϵ2, Cϵ1) space supports realistic flow simulations. R is used as a prior belief when formulating the inverse problem and is enforced with a classifier in the Markov chain Monte Carlo solution. The calibrated parameters are found to improve predictions of the entire flowfield substantially when compared with the nominal/literature values of (Cμ, Cϵ2, Cϵ1); furthermore, this improvement holds for interactions at other Mach numbers and jet strengths for which experimental data are available for comparison. The residual error, an approximation of the model-form error, is also quantified; it is most easily measured in terms of turbulent stresses.
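Two ingredients of the abstract, a cheap surrogate in place of the expensive flow model and a classifier enforcing the feasible region R as the prior, can be sketched in a toy Metropolis sampler. The polynomial "surrogate", the box-shaped R, and all numbers below are invented for illustration, not the paper's surrogates or region:

```python
import numpy as np

# Hedged sketch of the calibration idea: random-walk Metropolis over two
# parameters, with (a) an inexpensive surrogate standing in for the expensive
# flow model and (b) a feasibility region R enforced as the prior support.
rng = np.random.default_rng(3)

def surrogate(theta):
    """Cheap polynomial stand-in for the flow model's predicted observable."""
    c1, c2 = theta
    return 1.5 * c1 + 0.8 * c2 + 0.3 * c1 * c2

def in_region_R(theta):
    """Classifier for the feasible region R (a simple box, for illustration)."""
    c1, c2 = theta
    return 0.0 < c1 < 2.0 and 0.0 < c2 < 3.0

theta_true = np.array([0.9, 1.9])
y_obs = surrogate(theta_true) + rng.normal(0, 0.05)   # synthetic measurement
sigma = 0.05                                          # measurement noise std

def log_post(theta):
    if not in_region_R(theta):         # prior support is R only:
        return -np.inf                 # proposals outside R are rejected
    return -0.5 * ((surrogate(theta) - y_obs) / sigma) ** 2

theta = np.array([1.0, 1.0])           # start inside R
lp = log_post(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.15, 2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[1000:]         # discard burn-in

print("posterior mean:", chain.mean(axis=0))
print("all samples in R:", all(in_region_R(t) for t in chain))
```

Because the classifier returns -inf log-density outside R, every retained sample lies in the feasible region, mirroring how the paper keeps the chain in the part of parameter space that supports realistic flow simulations.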

More Details

A robust technique to make a 2D advection solver tolerant to soft faults

Procedia Computer Science

Strazdins, Peter; Harding, Brendan; Lee, Chung; Mayo, Jackson R.; Ray, Jaideep; Armstrong, Robert C.

We present a general technique, called robust stencils, for making partial differential equation solvers tolerant to soft faults, i.e. bit flips arising in memory or CPU calculations. We show how it can be applied to a two-dimensional Lax-Wendroff solver. The resulting 2D robust stencils are derived using an orthogonal application of their 1D counterparts, and combinations of 3 to 5 base stencils can then be created. We describe how these are implemented in a parallel advection solver. Various robust stencil combinations are explored, representing tradeoffs between performance and robustness. The results indicate that the 3-stencil robust combinations are slightly faster on large parallel workloads than Triple Modular Redundancy (TMR), with one third of TMR's memory footprint; we expect the improvement to be significant if suitable optimizations are performed. Because faults are avoided each time new points are computed, the proposed stencils are also as robust to faults as TMR over a large range of error rates. The technique can be generalized to 3D (or higher dimensions) with similar benefits.
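The robust-stencil idea, advance each point with several different stencils of the same order and vote on the result, can be sketched in 1D (the paper works in 2D via orthogonal application). The particular trio of stencils, grid size, and fault-injection point below are assumptions for illustration:

```python
import numpy as np

# Hedged 1D sketch of robust stencils for linear advection u_t + c u_x = 0:
# advance each point with three different second-order stencils and take the
# pointwise median. A soft fault corrupting one stencil's result is outvoted.
n, c = 200, 1.0
dx = 1.0 / n
dt = 0.4 * dx / c
s = c * dt / dx                        # CFL number

x = np.linspace(0, 1, n, endpoint=False)
u = np.exp(-200 * (x - 0.5) ** 2)      # initial Gaussian pulse (periodic grid)

def lw(u, s):                          # Lax-Wendroff on spacing dx
    up, um = np.roll(u, -1), np.roll(u, 1)
    return u - 0.5 * s * (up - um) + 0.5 * s * s * (up - 2 * u + um)

def lw_wide(u, s):                     # Lax-Wendroff on spacing 2*dx
    up, um = np.roll(u, -2), np.roll(u, 2)
    sw = s / 2
    return u - 0.5 * sw * (up - um) + 0.5 * sw * sw * (up - 2 * u + um)

def bw(u, s):                          # Beam-Warming (upwind, second order)
    um1, um2 = np.roll(u, 1), np.roll(u, 2)
    return (u - 0.5 * s * (3 * u - 4 * um1 + um2)
              + 0.5 * s * s * (u - 2 * um1 + um2))

for step in range(100):
    cands = np.stack([lw(u, s), lw_wide(u, s), bw(u, s)])
    if step == 50:                     # inject a "bit flip" into one stencil
        cands[1, n // 2] = 1e30
    u = np.median(cands, axis=0)       # voting masks the corrupted value

print("max |u| after faulted run:", np.abs(u).max())
```

All three stencils agree to truncation-error accuracy on clean data, so the median is as accurate as any one of them, while a single corrupted value per point is always outvoted; this is the same masking guarantee TMR provides, but with distinct stencils rather than triplicated identical computations.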

More Details