Jeremias Knoblauch


Abstract

In this talk, I advocate for an optimization-centric view of Bayesian inference and introduce a generalization of it. Our inspiration is the representation of Bayes' rule as an infinite-dimensional optimization problem (Csiszár, 1975; Donsker and Varadhan, 1975; Zellner, 1988). First, we use it to prove an optimality result for standard Variational Inference (VI): under an optimization-centric view, the standard VI posterior is preferable to alternative approximations of the Bayesian posterior. Next, we argue for generalizing standard Bayesian inference. The need for this arises in situations of severe misalignment between reality and three assumptions underlying standard Bayesian inference: (1) well-specified priors, (2) well-specified likelihoods, and (3) the availability of infinite computing power. Our generalization addresses these shortcomings with three arguments and is called the Rule of Three (RoT). We recover existing posteriors as special cases, including the Bayesian posterior and its approximation by standard VI. In contrast, approximations based on alternative ELBO-like objectives violate the axioms underlying the RoT. Finally, we study a special case of the RoT that we call Generalized Variational Inference (GVI). GVI posteriors are a large and tractable family of belief distributions specified by three arguments: a loss, a divergence, and a variational family. GVI posteriors have appealing properties, including consistency and an interpretation as an approximate ELBO.
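For orientation, the optimization-centric representation referred to above can be sketched as follows (the notation here is illustrative rather than taken from the talk itself). Bayes' rule arises as the solution of an infinite-dimensional optimization problem over the set $\mathcal{P}(\Theta)$ of probability measures on the parameter space,
\[
q^{*}(\theta) \;=\; \underset{q \in \mathcal{P}(\Theta)}{\arg\min} \;\Big\{ \mathbb{E}_{q(\theta)}\Big[ \textstyle\sum_{i=1}^{n} -\log p(x_i \mid \theta) \Big] \;+\; \mathrm{KL}\big(q \,\|\, \pi\big) \Big\},
\]
whose minimizer is the Bayesian posterior; standard VI restricts the feasible set to a variational family $\mathcal{Q} \subseteq \mathcal{P}(\Theta)$. The RoT generalizes the three ingredients of this problem, producing belief distributions of the form
\[
q^{*}(\theta) \;=\; \underset{q \in \Pi}{\arg\min} \;\Big\{ \mathbb{E}_{q(\theta)}\Big[ \textstyle\sum_{i=1}^{n} \ell(\theta, x_i) \Big] \;+\; D\big(q \,\|\, \pi\big) \Big\},
\]
with a loss $\ell$, a divergence $D$, and a feasible set $\Pi$; GVI corresponds to the special case in which $\Pi$ is a tractable variational family.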

Our speaker

Jeremias Knoblauch is a doctoral researcher working at the nexus of computer science and statistics within the Oxford-Warwick Statistics Programme. He is also currently a visiting researcher at the Alan Turing Institute and a Facebook Fellow. His current research efforts concern the optimization-centric generalization of Bayesian inference, with a focus on applications in statistical Machine Learning.