Intuitively, a dynamical system is any observable quantity that changes over time according to some fixed rule. Building models to understand, predict, and control dynamical systems has been a field of study for many years, resulting in a large and diverse array of distinct models. Each of these models offers its own unique advantages and disadvantages, and choosing the best model for a given task is a difficult problem in which a variety of tradeoffs must be considered. In this work we explore the complex web of relationships between these models, and use the insights gained to derive new connections. Our goal is to unify these many diverse models into sophisticated hybrid models which offer the best of all worlds. In particular, we focus on unifying the two main categories of models: Bayes Filters and Recurrent Neural Networks. Bayes Filters model dynamical systems as probability distributions and offer a wealth of statistical insight. In contrast, Recurrent Neural Networks are complex functions designed to produce accurate predictions, but they lack the statistical theory of Bayes Filters. By drawing on insights from each of these fields, we develop new models which combine an axiomatic statistical theory with rich functional forms, are widely applicable, and offer state-of-the-art performance.
Geoff Gordon (Chair)
Byron Boots (Georgia Institute of Technology)
Arthur Gretton (University College London)