CALD Seminar

  • Tom Minka

Expectation Propagation for Approximate Bayesian Inference

Bayesian inference is often computationally expensive, so fast approximations have great potential. Recently, two fast algorithms have been shown to be effective for Bayesian inference: loopy belief propagation and assumed-density filtering. I will show how these two algorithms can be unified and generalized into a powerful framework for approximate inference. This framework, "expectation propagation," enjoys the best of both worlds: like assumed-density filtering, it works on hybrid discrete/continuous belief networks and allows approximations that are not completely factorized. But like belief propagation, it uses iterative refinement, not just a single forward-backward pass. I will demonstrate this method on a difficult but important model: the Bayes point machine, a Bayesian alternative to the support vector machine.
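The abstract contains no code, but the iterative-refinement loop it describes can be sketched concretely. Below is a minimal, illustrative Python/numpy implementation of EP on the "clutter problem" (estimating a Gaussian mean when observations are contaminated by outlier noise), the standard toy example for this method; the function name ep_clutter and the parameter values w, a, b are illustrative assumptions, not taken from the talk. Each pass removes one approximate factor (the "cavity" step), matches moments against the exact factor, and refines the site, repeating until the approximation stabilizes rather than stopping after one forward-backward sweep.

    import numpy as np

    def norm_pdf(x, mean, var):
        # Density of a univariate Gaussian N(x; mean, var).
        return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

    def ep_clutter(x, w=0.25, a=10.0, b=100.0, iters=20):
        # EP sketch for the clutter problem: infer the mean theta of a
        # unit-variance Gaussian whose observations are corrupted, with
        # probability w, by a N(0, a) clutter component, under a N(0, b)
        # prior on theta. Returns the mean and variance of q(theta).
        n = len(x)
        # Site (approximate factor) parameters in natural form:
        # precision tau and precision-times-mean nu, initialized flat.
        tau = np.zeros(n)
        nu = np.zeros(n)
        # The global Gaussian approximation starts at the prior.
        tau_q, nu_q = 1.0 / b, 0.0
        for _ in range(iters):                 # iterative refinement
            for i in range(n):
                # 1. Cavity: divide site i out of the approximation.
                tau_c = tau_q - tau[i]
                nu_c = nu_q - nu[i]
                if tau_c <= 0:                 # skip ill-posed cavities
                    continue
                m_c, v_c = nu_c / tau_c, 1.0 / tau_c
                # 2. Moments of the tilted distribution: exact factor
                # times cavity, available in closed form for this model.
                Z = ((1 - w) * norm_pdf(x[i], m_c, v_c + 1.0)
                     + w * norm_pdf(x[i], 0.0, a))
                r = (1 - w) * norm_pdf(x[i], m_c, v_c + 1.0) / Z
                m_new = m_c + r * v_c * (x[i] - m_c) / (v_c + 1.0)
                v_new = (v_c - r * v_c**2 / (v_c + 1.0)
                         + r * (1 - r)
                         * (v_c * (x[i] - m_c) / (v_c + 1.0)) ** 2)
                # 3. Moment-match q, then refine site i as q / cavity.
                tau_q, nu_q = 1.0 / v_new, m_new / v_new
                tau[i] = tau_q - tau_c
                nu[i] = nu_q - nu_c
        return nu_q / tau_q, 1.0 / tau_q

    # Synthetic data: true mean 2.0, 25% clutter.
    rng = np.random.default_rng(0)
    data = np.where(rng.random(50) < 0.25,
                    rng.normal(0.0, np.sqrt(10.0), 50),
                    rng.normal(2.0, 1.0, 50))
    print(ep_clutter(data))

Note how the outer loop revisits every site repeatedly: running a single pass would reduce this sketch to assumed-density filtering, whose answer depends on the order the data arrive; the repeated refinement is what EP adds.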
For More Information, Please Contact: 
Catherine Copetas, copetas@cs.cmu.edu