Adaptive, Robust and Scalable Bayesian Filtering for Online Learning

venue: PhD thesis
date: May 2025
Authors: Gerardo Duran-Martin

Abstract

In this thesis, we introduce Bayesian filtering as a principled framework for tackling diverse sequential machine learning problems, including online (continual) learning, prequential (one-step-ahead) forecasting, and contextual bandits. To this end, the thesis addresses key challenges in applying Bayesian filtering to these problems: adaptivity to non-stationary environments, robustness to model misspecification and outliers, and scalability to the high-dimensional parameter spaces of deep neural networks. We develop novel tools within the Bayesian filtering framework to address each of these challenges: (i) a modular framework that enables the development of adaptive approaches to online learning; (ii) a novel, provably robust filter based on Generalised Bayes, with computational cost similar to that of standard filters; and (iii) a set of tools for sequentially updating model parameters using approximate second-order optimisation methods that exploit the overparametrisation of high-dimensional parametric models such as neural networks. Theoretical analysis and empirical results demonstrate the improved performance of our methods in dynamic, high-dimensional, and misspecified settings.
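To make the framing concrete, here is a minimal sketch (not taken from the thesis) of Bayesian filtering applied to online learning: the weights of a linear regression model are treated as the latent state of a linear-Gaussian state-space model with static dynamics, so each incoming data point yields a closed-form Kalman-filter posterior update. All function and variable names here are illustrative.

```python
import numpy as np

def kalman_update(mu, Sigma, x, y, obs_var=1.0):
    """One Bayesian filtering step for online linear regression.

    With static dynamics this is recursive least squares: the Gaussian
    posterior over the weights is updated in closed form per observation.
    """
    # Predictive variance of y given x under the current posterior
    s = x @ Sigma @ x + obs_var
    # Kalman gain: how far to move the mean toward the new observation
    k = Sigma @ x / s
    # Posterior mean and covariance after observing (x, y)
    mu_new = mu + k * (y - x @ mu)
    Sigma_new = Sigma - np.outer(k, Sigma @ x)
    return mu_new, Sigma_new

# Stream data from a known linear model and recover its weights online
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
mu, Sigma = np.zeros(2), 10.0 * np.eye(2)  # broad Gaussian prior
for _ in range(500):
    x = rng.normal(size=2)
    y = x @ w_true + 0.1 * rng.normal()
    mu, Sigma = kalman_update(mu, Sigma, x, y, obs_var=0.01)
print(np.round(mu, 2))  # posterior mean close to w_true
```

The thesis extends this basic recursion along the three axes named above: adapting it to non-stationary streams, robustifying the update against outliers via Generalised Bayes, and scaling it to neural-network-sized parameter vectors.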

Citation

@phdthesis{duran2025adaptive,
  title={Adaptive, Robust, and Scalable Bayesian Filtering for Online Learning},
  author={Duran-Martin, Gerardo},
  year={2025},
  school={Queen Mary University of London}
}