Publications

2 Results


Analysis of Neural Networks as Random Dynamical Systems

Hudson, Joshua L.; Diaz-Ibarra, Oscar H.; D'Elia, Marta; Najm, Habib N.; Rosso, Haley; Ruthotto, Lars; Sargsyan, Khachik

In this report we present our findings and outcomes of the NNRDS (analysis of Neural Networks as Random Dynamical Systems) project. The work is largely motivated by the analogy between a large class of neural networks (NNs) and discretized ordinary differential equation (ODE) schemes. Namely, residual NNs, or ResNets, can be viewed as a discretization of neural ODEs (NODEs), where the NN depth plays the role of the time evolution. We employ several legacy tools from ODE theory, such as stiffness, nonlocality, and autonomicity, to enable regularization of ResNets, thus improving their generalization capabilities. Furthermore, armed with NN analysis tools borrowed from ODE theory, we are able to efficiently augment NN predictions with uncertainty estimates, overcoming well-known dimensionality challenges and adding a degree of trust to NN predictions. Finally, we have developed a Python library, QUiNN (Quantification of Uncertainties in Neural Networks), that incorporates improved-architecture ResNets in addition to classical feed-forward NNs, and contains wrappers for PyTorch NN models enabling several major classes of uncertainty quantification methods for NNs. In addition to synthetic problems, we demonstrate the methods on datasets from climate modeling and materials science.
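The ResNet/neural-ODE analogy in the abstract can be sketched concretely: a residual update x_{k+1} = x_k + h·f(x_k, θ_k) is a forward-Euler step of the ODE dx/dt = f(x, θ(t)), with depth playing the role of time. The sketch below is illustrative only; the function names (`f`, `resnet_forward`) and the scalar toy layer are assumptions for exposition, not the QUiNN API.

```python
# Sketch: a ResNet forward pass as forward-Euler integration of a neural ODE.
# Each residual layer x <- x + h * f(x, theta) is one Euler step of size h;
# depth-many layers integrate the ODE over "time" h * depth.
# The toy layer f(x; theta) = tanh(theta * x) stands in for a real NN block.
import math

def f(x, theta):
    """Toy scalar 'layer' standing in for a neural-network block."""
    return math.tanh(theta * x)

def resnet_forward(x0, thetas, h=0.1):
    """Residual updates == forward-Euler steps; thetas holds per-layer weights."""
    x = x0
    for theta in thetas:  # one layer <-> one time step
        x = x + h * f(x, theta)
    return x
```

Viewing depth as time is what lets ODE-theoretic notions such as stiffness carry over to regularizing the network.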


Quantifying model prediction sensitivity to model-form uncertainty

Portone, Teresa; White, Rebekah D.; Rosso, Haley; Bandy, Rileigh J.; Hart, Joseph L.

Computational and mathematical models are essential to understanding complex systems and phenomena. However, when developing such models, limited knowledge and/or resources necessitate the use of simplifying assumptions. It is therefore crucial to quantify the impact of such simplifying assumptions on the reliability and accuracy of resulting model predictions. This work develops a first-of-its-kind approach to quantify the impact of physics modeling assumptions on predictions. Here, we leverage the emerging field of model-form uncertainty (MFU) representations, which are parameterized modifications to modeling assumptions, in combination with grouped Sobol’ indices to quantitatively measure an assumption’s importance. Specifically, we compute the grouped Sobol’ index for the MFU representation’s parameters as a single importance measure of the assumption for which the MFU representation characterizes uncertainty. To ensure this approach is robust to the subjective choice of how to parameterize an MFU representation, we establish bounds on the difference between sensitivity results for two different MFU representations based on differences in model prediction statistics. The capabilities associated with this approach are demonstrated on three exemplar problems: an upscaled subsurface contaminant transport problem, ablation modeling for hypersonic flight, and nuclear waste repository modeling. We found that our grouped approach is able to assess the impact of modeling assumptions on predictions and offers computational advantages over classical Sobol’ index computation while providing more interpretable results.
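A grouped Sobol’ index treats all parameters of an MFU representation as a single block and measures the fraction of output variance that block explains. A minimal sketch of the standard Monte Carlo pick-freeze estimator is below, assuming a toy linear model with independent standard-normal inputs; the model, the group choice, and the function names are illustrative, not from the paper.

```python
# Sketch: grouped first-order (closed) Sobol' index via the pick-freeze
# Monte Carlo estimator. For the toy model y = x1 + x2 + 0.5*x3 with
# independent standard-normal inputs, the closed index of the group
# {x1, x2} is (1 + 1) / (1 + 1 + 0.25) = 8/9 ~= 0.889.
import random

def model(x):
    """Toy additive model; stands in for an expensive simulation."""
    return x[0] + x[1] + 0.5 * x[2]

def grouped_sobol(model, dim, group, n=50_000, seed=0):
    """Estimate the closed Sobol' index of the parameter set `group`."""
    rng = random.Random(seed)
    A = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    B = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    # "Freeze" the grouped coordinates from A; resample the rest from B.
    C = [[a[j] if j in group else b[j] for j in range(dim)]
         for a, b in zip(A, B)]
    yA = [model(x) for x in A]
    yC = [model(x) for x in C]
    m = sum(yA) / n
    var = sum((y - m) ** 2 for y in yA) / n
    cov = sum((ya - m) * (yc - m) for ya, yc in zip(yA, yC)) / n
    return cov / var  # closed index of the whole group in one number
```

One such evaluation per assumption (rather than one per parameter) is what gives the grouped approach its computational advantage over classical per-parameter Sobol’ index computation.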
