Abstract
In this talk, I summarise Generalised Variational Inference (GVI), a family of methods geared towards scalable and robust inference in Bayesian machine learning models. I cover its relationship to other variational and Bayes-like methods, its modularity, and the ways it can be used to address various challenges in machine learning, as well as its applications to Bayesian Neural Networks, (Deep) Gaussian Processes, and Bayesian On-line Changepoint Detection in the presence of outliers.
Date
Sep 29, 2020 2:00 PM
Event
Research talks at various organisations, including Columbia University, New York University, Cornell University, The University of Oxford, Google Brain, Facebook Research at Menlo Park, Sheffield University, Lancaster University …