Non-conformity Scores for High-Quality Uncertainty Quantification from Conformal Prediction
High-quality uncertainty quantification (UQ) is a critical component of enabling trust in deep learning (DL) models and is especially important when DL models are deployed in high-consequence applications. Conformal prediction (CP) methods represent an emerging nonparametric approach for producing UQ that is easily interpretable and, under the weak assumption of data exchangeability, provides a finite-sample guarantee regarding UQ quality. This report describes the research outputs of an Exploratory Express Laboratory Directed Research and Development (LDRD) project at Sandia National Laboratories that focused on how best to implement CP methods for DL models. The report introduces new methodology for obtaining high-quality UQ from DL models using CP methods, describes a novel system for assessing UQ quality, and provides experimental results that demonstrate the quality of the new methodology and the utility of the UQ quality assessment system. The report concludes with avenues for future research and a discussion of potential impacts at Sandia and in the wider research community.
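To make the coverage guarantee mentioned above concrete, the following is a minimal sketch of split (inductive) conformal prediction for regression, the standard CP variant. The synthetic data, the stand-in `model` function, and all parameter values are illustrative assumptions, not the methodology developed in this project; the key ingredients are the non-conformity score (here, the absolute residual) computed on a held-out calibration set and the finite-sample-corrected quantile used to form prediction intervals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained DL regressor: predicts the true mean.
def model(x):
    return 2.0 * x

# Synthetic calibration and test data (assumed for illustration only).
n_cal, n_test = 500, 500
x_cal = rng.uniform(0.0, 1.0, n_cal)
y_cal = 2.0 * x_cal + rng.normal(0.0, 0.1, n_cal)
x_test = rng.uniform(0.0, 1.0, n_test)
y_test = 2.0 * x_test + rng.normal(0.0, 0.1, n_test)

alpha = 0.1  # target miscoverage; intervals should cover >= 90% of test points

# Non-conformity score: absolute residual on the held-out calibration set.
scores = np.abs(y_cal - model(x_cal))

# Conformal quantile with the finite-sample correction ceil((n+1)(1-alpha))/n.
q_level = np.ceil((n_cal + 1) * (1.0 - alpha)) / n_cal
qhat = np.quantile(scores, q_level, method="higher")

# Prediction interval: point prediction +/- qhat.
lower = model(x_test) - qhat
upper = model(x_test) + qhat

# Empirical coverage should be close to (and at least) 1 - alpha on average.
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage: {coverage:.3f}")
```

Under exchangeability of the calibration and test data, intervals built this way cover the true response with probability at least 1 - alpha, regardless of how good or bad the underlying model is; a poor model simply yields wider intervals.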