Projects
Meta-Uncertainty in Bayesian Model Comparison
Paper (AISTATS 2023) | Code | Project website | Poster | Presentation (15min)
Meta-Uncertainty is a fully probabilistic framework for quantifying the uncertainty over Bayesian posterior model probabilities (PMPs) using meta-models. Meta-models combine simulated and observed data into a predictive distribution for new PMPs, which helps reduce overconfidence and supports estimating the PMPs to be expected in future replication studies.
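For orientation, here is a minimal sketch of the quantity the meta-models operate on: posterior model probabilities computed from log marginal likelihoods and model priors. This is standard Bayesian model comparison with hypothetical numbers, not the meta-model machinery from the paper.

```python
import numpy as np

def posterior_model_probabilities(log_marginal_likelihoods, prior_probs=None):
    """Compute Bayesian posterior model probabilities (PMPs).

    PMP_j = p(y | M_j) p(M_j) / sum_k p(y | M_k) p(M_k),
    evaluated in log space for numerical stability.
    """
    log_ml = np.asarray(log_marginal_likelihoods, dtype=float)
    if prior_probs is None:
        prior_probs = np.full(log_ml.shape, 1.0 / log_ml.size)  # uniform model prior
    log_unnorm = log_ml + np.log(prior_probs)
    log_unnorm -= log_unnorm.max()  # guard against underflow/overflow
    pmp = np.exp(log_unnorm)
    return pmp / pmp.sum()

# Example: three candidate models with hypothetical log marginal likelihoods
print(posterior_model_probabilities([-104.2, -101.7, -109.5]))
```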
JANA: Jointly Amortized Neural Approximation of Complex Bayesian Models
Paper (arXiv) | Python library
JANA is a new method for amortized Bayesian inference in models with intractable likelihood functions and posterior densities. It jointly trains three neural networks to learn both an approximate posterior and a surrogate model of the likelihood, enabling amortized estimation of marginal likelihoods and posterior predictive distributions.
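As a rough sketch of the joint amortization idea only, the code below fits an approximate posterior and a surrogate likelihood simultaneously on simulated parameter-data pairs. The toy Gaussian estimators and the one-dimensional simulator are illustrative assumptions, not the paper's actual networks.

```python
import torch
import torch.nn as nn

class CondGaussian(nn.Module):
    """Predicts mean and log-std of a diagonal Gaussian given a conditioning input."""
    def __init__(self, cond_dim, target_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * target_dim),
        )

    def log_prob(self, target, cond):
        mean, log_std = self.net(cond).chunk(2, dim=-1)
        return torch.distributions.Normal(mean, log_std.exp()).log_prob(target).sum(-1)

def simulate(n):
    """Toy simulator: theta ~ N(0, 1), x ~ N(theta, 0.5)."""
    theta = torch.randn(n, 1)
    x = theta + 0.5 * torch.randn(n, 1)
    return theta, x

posterior_net = CondGaussian(cond_dim=1, target_dim=1)   # approximate posterior q(theta | x)
likelihood_net = CondGaussian(cond_dim=1, target_dim=1)  # surrogate likelihood l(x | theta)
optim = torch.optim.Adam(
    list(posterior_net.parameters()) + list(likelihood_net.parameters()), lr=1e-3
)

for step in range(2000):
    theta, x = simulate(256)
    # Joint objective: maximize both conditional log-densities on simulated pairs.
    loss = -(posterior_net.log_prob(theta, x) + likelihood_net.log_prob(x, theta)).mean()
    optim.zero_grad()
    loss.backward()
    optim.step()
```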
Detecting Model Misspecification in Amortized Simulation-Based Inference with Neural Networks
Paper (arXiv) | Code | Poster
Novel neural network-based architectures enable amortized Bayesian inference in settings where the likelihood function is only implicitly defined by a simulation program. But how faithful is such inference when the simulations represent reality only inaccurately? This paper shows how imposing a probabilistic structure on the latent data summary space can help detect potentially catastrophic misspecifications during inference.
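One way such a check could look, assuming a summary network has already mapped data sets to latent summary vectors trained to follow a standard Gaussian: compare the observed summaries against draws from that reference distribution with a sample-based discrepancy such as maximum mean discrepancy (MMD). The kernel bandwidth, dimensions, and shift below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    """RBF kernel matrix between the rows of a and b."""
    sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd_squared(x, y, bandwidth=1.0):
    """Biased estimate of the squared maximum mean discrepancy between samples x and y."""
    k_xx = gaussian_kernel(x, x, bandwidth).mean()
    k_yy = gaussian_kernel(y, y, bandwidth).mean()
    k_xy = gaussian_kernel(x, y, bandwidth).mean()
    return k_xx + k_yy - 2.0 * k_xy

rng = np.random.default_rng(1)
summary_dim = 4
reference = rng.standard_normal((500, summary_dim))     # latent distribution the summaries were trained to match
summaries_ok = rng.standard_normal((100, summary_dim))   # stand-in for summaries of well-specified data
summaries_shifted = summaries_ok + 2.0                   # stand-in for summaries of misspecified data

print(mmd_squared(summaries_ok, reference))       # small: no alarm
print(mmd_squared(summaries_shifted, reference))  # large: flags potential misspecification
```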