Sampling Methods for Uncertainty Quantification
Department of Computer Science
University of Central Florida
Orlando, FL, USA
List of Papers
Select one paper from the list below for your presentation. Papers are organized by course module. Each presentation should cover: (1) the paper’s key contribution, (2) its connections to course material (citing specific lectures), and (3) a critical analysis of its strengths and limitations.
Module 1: Foundations
- On the statistical formalism of uncertainty quantification [pdf]
- Inverse problems: a Bayesian perspective [pdf]
- Position: there is no free Bayesian uncertainty quantification [pdf]
- Uncertainty quantification for machine learning: one size does not fit all [pdf]
- A tutorial on conformal prediction for reliable machine learning [pdf]
- Computational optimal transport [pdf]
- Rethinking aleatoric and epistemic uncertainty [pdf]
Module 2: MCMC
- Debiasing with couplings of Markov chains [pdf]
- Interacting Langevin diffusions: gradient structure and ensemble Kalman sampler [pdf]
- Continuously tempered Hamiltonian Monte Carlo [pdf]
- Optimal tuning of the hybrid Monte Carlo algorithm [pdf]
- MCMC methods for functions: modifying old algorithms to make them faster [pdf]
- Sequential Monte Carlo samplers [pdf]
- Sampling via gradient flows in the space of probability measures [pdf]
- Posterior sampling based on gradient flows [pdf]
Module 3: Variational Inference
- Variational inference for uncertainty quantification: an analysis of trade-offs [pdf]
- Do Bayesian neural networks actually behave like Bayesian models? [pdf]
- ELBO surgery: yet another way to carve up the variational evidence lower bound [pdf]
- Black box variational inference [pdf]
- Learning to draw samples with amortized Stein variational gradient descent [pdf]
- Information geometry of variational Bayes [pdf]
- An approximation theory framework for measure-transport sampling algorithms [pdf]
- Probabilistic inference and learning with Stein’s method [pdf]
- An introduction to sampling via measure transport [pdf]
- Adaptive symmetrization of the KL divergence [pdf]
- Frequentist validity of epistemic uncertainty estimators [pdf]
Module 4: Deep Generative Models
- Normalizing flows are capable generative models [pdf]
- Neural spline flows [pdf]
- Dynamical regimes of diffusion models [pdf]
- Closed-form diffusion models [pdf]
- Understanding diffusion objectives as the ELBO with simple data augmentation [pdf]
- Convergence of denoising diffusion models under the manifold hypothesis [pdf]
- Variational perspective on diffusion-based generative models and score matching [pdf]
- Consistency models [pdf]
- Elucidating the design space of diffusion-based generative models [pdf]
- Stochastic interpolants: a unifying framework for flows and diffusions [pdf]
- On the closed-form of flow matching: generalization does not arise from target stochasticity [pdf]
- Flow matching on general geometries [pdf]
- Scaling rectified flow transformers for high-resolution image synthesis [pdf]
- Input convex neural networks [pdf]
- Convex potential flows: universal probability distributions with optimal transport and convex optimization [pdf]
- Conditionally strongly log-concave generative models [pdf]
- Iterated denoising energy matching for sampling from Boltzmann densities [pdf]
- Energy-based transformers are scalable learners and thinkers [pdf]
Module 5: Simulation-Based Inference
- All-in-one simulation-based inference [pdf]
- Robust amortized Bayesian inference with self-consistency losses on unlabeled data [pdf]
- Contrastive neural ratio estimation [pdf]
- Diffusion models in simulation-based inference: a tutorial review [pdf]
- Fast and robust simulation-based inference with optimization Monte Carlo [pdf]
- Unifying summary statistic selection for approximate Bayesian computation [pdf]
- Compositional score modeling for simulation-based inference [pdf]
- Sensitivity-aware amortized Bayesian inference [pdf]
- Detecting model misspecification in Bayesian inverse problems via variational gradient descent [pdf]
- Sampling through iterated approximation: gradient-free and multi-fidelity Bayesian inference via transport [pdf]
- Step-DAD: semi-amortized policy-based Bayesian experimental design [pdf]
- Bayesian optimal experimental design via contrastive diffusions [pdf]
- Bayesian experimental design: a review [pdf]
- Generative modelling meets Bayesian inference: a new paradigm for inverse problems [pdf]
- Calibrated inference: statistical inference that accounts for both sampling uncertainty and distributional uncertainty [pdf]
- Ensemble Kalman diffusion guidance: a derivative-free method for inverse problems [pdf]
- Energy-based models for inverse imaging problems [pdf]