Generalized Bayesian Inference for Discrete Intractable Likelihood

Abstract

Discrete state spaces represent a major computational challenge to statistical inference, since the computation of normalization constants requires summation over large or possibly infinite sets, which can be impractical. This article addresses this computational challenge through the development of a novel generalized Bayesian inference procedure suitable for discrete intractable likelihood. Inspired by recent methodological advances for continuous data, the main idea is to update beliefs about model parameters using a discrete Fisher divergence, in lieu of the problematic intractable likelihood. The result is a generalized posterior that can be sampled from using standard computational tools, such as Markov chain Monte Carlo, circumventing the intractable normalizing constant. The statistical properties of the generalized posterior are analyzed, with sufficient conditions for posterior consistency and asymptotic normality established. In addition, a novel and general approach to calibration of generalized posteriors is proposed. Applications are presented on lattice models for discrete spatial data and on multivariate models for count data, where in each case the methodology facilitates generalized Bayesian inference at low computational cost. Supplementary materials for this article are available online.
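The key point of the abstract is that the discrete Fisher divergence depends on the model only through pmf *ratios*, so the intractable normalizing constant cancels and the resulting generalized posterior can be sampled with standard MCMC. The sketch below illustrates this on a toy Poisson model with unnormalized pmf p(x) ∝ λ^x / x! (whose normalizer we pretend is unknown). The specific loss form, the β = 1 weight, the normal prior on log λ, and the random-walk Metropolis sampler are all illustrative assumptions, not the paper's exact construction; conventions for the discrete Fisher divergence vary.

```python
import numpy as np

def dfd_loss(lam, x):
    """Empirical discrete-Fisher-divergence-style loss for a Poisson model
    with unnormalized pmf p(x) ∝ lam**x / x!.  Only pmf ratios appear, so
    the normalizing constant exp(-lam) is never needed:
        p(x-1)/p(x) = x/lam,   p(x)/p(x+1) = (x+1)/lam.
    Illustrative estimable form (conventions vary across papers):
        mean over data of  (p(x-1)/p(x))**2 - 2 * p(x)/p(x+1).
    """
    return np.mean((x / lam) ** 2 - 2.0 * (x + 1) / lam)

def generalized_posterior_sample(x, beta=1.0, n_iter=5000, step=0.2, seed=0):
    """Random-walk Metropolis on log(lam) targeting the generalized posterior
        pi_beta(lam) ∝ prior(lam) * exp(-beta * n * DFD_n(lam)),
    with a standard normal prior on log(lam) (an illustrative choice)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    log_lam = np.log(np.mean(x) + 1.0)  # crude initialization

    def log_target(ll):
        lam = np.exp(ll)
        # log prior (normal on log lam) minus beta * n * empirical loss;
        # note no normalizing constant of the model appears anywhere
        return -0.5 * ll ** 2 - beta * n * dfd_loss(lam, x)

    samples = []
    cur = log_target(log_lam)
    for _ in range(n_iter):
        prop = log_lam + step * rng.normal()
        lp = log_target(prop)
        if np.log(rng.uniform()) < lp - cur:  # Metropolis accept/reject
            log_lam, cur = prop, lp
        samples.append(np.exp(log_lam))
    return np.array(samples[n_iter // 2:])  # discard burn-in

# Synthetic data from Poisson(4); the generalized posterior should
# concentrate near the true rate as n grows.
rng = np.random.default_rng(42)
x = rng.poisson(4.0, size=200)
samples = generalized_posterior_sample(x)
print(round(samples.mean(), 2))
```

Because the loss uses only the ratios p(x−1)/p(x) and p(x)/p(x+1), the same recipe applies unchanged when the normalizing constant genuinely cannot be computed, which is the setting the paper targets.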

Publication
Journal of the American Statistical Association
Jeremias Knoblauch
Associate Professor and EPSRC Fellow in Machine Learning & Statistics

My research interests include robust Bayesian methods, generalised and post-Bayesian methodology, variational methods, and simulators.