09A: Uncertainty Quantification

Materials:

Date: Tuesday, 24-Sep-2024

Pre-work:

  1. [blog] Expected Calibration Error
  2. [paper] Calibration in Deep Learning: A Survey of the State-of-the-Art
  3. [paper] On Calibration of Modern Neural Networks
  4. [tutorial] Introduction to Uncertainty in Deep Learning

In-Class

  1. [video] A gentle introduction to Conformal Prediction and Distribution-free Uncertainty Quantification
  2. [colab] Colab notebook from DEEL-PUNCC

Post-class

  1. [paper] A Tutorial on Conformal Prediction
  2. [paper] Towards Reliability using Pretrained Large Model Extensions
  3. [tools] awesome-conformal-prediction - a curated collection of Conformal Prediction resources, including implementations.
  4. [tools] crepes - Conformal Classifiers, Regressors, and Predictive Systems.
  5. [tools] TorchCP - a Python toolbox for Conformal Prediction research on deep learning models, built on PyTorch.
  6. [tools] MAPIE - a Python toolbox for Conformal Prediction.
  7. [tools] DEEL-PUNCC - a Python toolbox for Conformal Prediction from DEEL.ai, a project for Dependable, Certifiable, Explainable AI for Critical Systems. Check out the sister projects from DEEL: INFLUENCIAE for bias, oodeel for OOD detection, and xplique for XAI.

Notes

  1. Deep learning models are often poorly calibrated: they can make predictions that are confident but wrong (see the ECE sketch after this list).
  2. Conformal Prediction (CP) provides rigorous statistical guarantees by predicting sets rather than points. In a regression problem, one predicts an interval with a guaranteed coverage probability; in a classification problem, CP may predict more than one class label (a minimal split-conformal sketch also follows this list).
  3. CP is model-agnostic and works for a variety of tasks, including regression, multi-class classification, multi-label prediction, and time series; it is also being explored for LLMs (Conformal Language Modeling), although that remains an active research topic.
  4. CP is a post-hoc technique, applied on top of an already trained model, and should be used in every project.
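
A minimal NumPy sketch of the standard Expected Calibration Error (equal-width confidence bins, weighted |accuracy - confidence| gap), in the spirit of the ECE pre-work reading. The function name and the toy conf/acc arrays are illustrative, not taken from the readings.

```python
import numpy as np

def expected_calibration_error(confidences, accuracies, n_bins=15):
    """Weighted average of |accuracy - confidence| over equal-width confidence bins."""
    confidences = np.asarray(confidences, dtype=float)
    accuracies = np.asarray(accuracies, dtype=float)
    bin_edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)  # samples whose top-class confidence falls in this bin
        if in_bin.any():
            gap = abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap  # weight the gap by the fraction of samples in the bin
    return ece

# Toy example of an overconfident model: high confidence, mediocre accuracy.
conf = np.array([0.95, 0.90, 0.92, 0.99, 0.85])
acc = np.array([1, 0, 1, 0, 1])
print(f"ECE = {expected_calibration_error(conf, acc):.3f}")
```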
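
And a minimal sketch of split (inductive) conformal prediction for classification, along the lines of the in-class tutorial: train a model, compute nonconformity scores (1 - probability of the true class) on a held-out calibration set, and build prediction sets from the conformal quantile. The dataset, model, alpha, and split sizes are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

alpha = 0.1  # target miscoverage: sets should contain the true label >= 90% of the time

X, y = load_iris(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_cal, X_test, y_cal, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# CP is post-hoc: the model is trained as usual, then calibrated afterwards.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# 1. Nonconformity score on the calibration set: 1 - probability of the true class.
cal_probs = model.predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

# 2. Conformal quantile with the finite-sample correction.
n = len(cal_scores)
q_level = np.ceil((n + 1) * (1 - alpha)) / n
qhat = np.quantile(cal_scores, q_level, method="higher")

# 3. Prediction sets: every class whose score is below the threshold.
test_probs = model.predict_proba(X_test)
prediction_sets = test_probs >= 1.0 - qhat  # boolean matrix (n_test, n_classes)

coverage = prediction_sets[np.arange(len(y_test)), y_test].mean()
print(f"empirical coverage: {coverage:.2f}, avg set size: {prediction_sets.sum(axis=1).mean():.2f}")
```

With alpha = 0.1, the empirical coverage on the test split should come out near (or above) 90%, while the average set size reflects how uncertain the underlying model is.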