My academic research writings and publications

by Frederick Eaton

As a sometime computer scientist, I have mostly studied approximate inference algorithms, such as Belief Propagation, applied to probabilistic graphical models.

I completed my PhD in the Computational and Biological Learning Lab in the Department of Engineering at the University of Cambridge in England, graduating on January 21, 2012. I have continued writing papers sporadically since graduating. For some reason all of my papers are authored as “Frederik” with no “c”, same as my email address frederik at ofb dot net. My favorite paper is “Model reductions for inference”.


FH Eaton. Belief propagation generalizes backpropagation. arXiv preprint

This paper shows that Loopy Belief Propagation, perhaps the most widely used approximate inference algorithm, performs operations analogous to those of the Backpropagation algorithm, once the neural network input to Backpropagation is converted into a probabilistic graphical model to which Belief Propagation can be applied.

This is a purely theoretical paper, with numerical experiments left to future work. It continues the theme of approximate inference from my doctoral research.
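For readers unfamiliar with Belief Propagation itself, here is a minimal sketch of the sum-product message updates on a small pairwise model. All names and potentials are illustrative inventions, not code or constructions from the paper; the model is a three-variable chain, a tree, so BP is exact here and can be checked against brute-force marginals (on loopy graphs the same updates give an approximation).

```python
from math import prod
import itertools

# Chain of three binary variables: x0 -- x1 -- x2 (a tree, so BP is
# exact here; on loopy graphs the same updates give an approximation).
unary = {0: [1.0, 2.0], 1: [1.0, 1.0], 2: [3.0, 1.0]}
pair = {(0, 1): [[2.0, 1.0], [1.0, 2.0]],
        (1, 2): [[1.0, 3.0], [3.0, 1.0]]}
neighbors = {0: [1], 1: [0, 2], 2: [1]}

def psi(i, j, xi, xj):
    # pairwise potential, looked up in either orientation
    return pair[(i, j)][xi][xj] if (i, j) in pair else pair[(j, i)][xj][xi]

def bp_marginals(iters=20):
    # m[(i, j)] is the (normalized) message from variable i to variable j
    m = {(i, j): [1.0, 1.0] for i in neighbors for j in neighbors[i]}
    for _ in range(iters):
        for (i, j) in list(m):
            new = [sum(unary[i][xi] * psi(i, j, xi, xj)
                       * prod(m[(k, i)][xi] for k in neighbors[i] if k != j)
                       for xi in (0, 1))
                   for xj in (0, 1)]
            z = sum(new)
            m[(i, j)] = [v / z for v in new]
    beliefs = {}
    for j in neighbors:
        b = [unary[j][xj] * prod(m[(i, j)][xj] for i in neighbors[j])
             for xj in (0, 1)]
        z = sum(b)
        beliefs[j] = [v / z for v in b]
    return beliefs

def exact_marginals():
    # brute-force enumeration of the joint distribution, for comparison
    w = {x: prod(unary[i][x[i]] for i in range(3))
            * psi(0, 1, x[0], x[1]) * psi(1, 2, x[1], x[2])
         for x in itertools.product((0, 1), repeat=3)}
    Z = sum(w.values())
    return {j: [sum(v for x, v in w.items() if x[j] == s) / Z
                for s in (0, 1)]
            for j in range(3)}
```

On this tree the BP beliefs coincide with the exact marginals after a few sweeps.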


FH Eaton, Z Ghahramani. Model reductions for inference: generality of pairwise, binary, and planar factor graphs. Neural Computation May 2013. Vol 25, No. 5, pp. 1213–1260

A much expanded version of the background chapter of my PhD thesis, with additional theorems.

This paper proves several results concerning the representability of statistical models using factor graphs under constraints on topology (planar or non-planar), factor size (pairwise or n-wise), and variable domain size (binary or n-ary). It characterizes the expressive power of planar binary pairwise graphs, proving that all strictly positive factor graphs can be reduced to this form. This result extends my PhD thesis, which had earlier pointed out that some (non-strictly-positive) factor graphs cannot be converted to binary pairwise form, owing to the special “median graph” structure maintained by the set of solutions to a 2-SAT problem. The notion of graph conversion or reduction used here is not new, but it had never been formalized, so the first part of the paper attempts to lay suitable foundations by defining what we think is meant by it.
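To give a flavor of what a “reduction” does, here is a toy auxiliary-variable construction (a standard textbook-style device, not the paper's own reduction) that replaces one 3-wise factor with a unary factor plus pairwise indicator factors, preserving the partition function. Note the indicator factors contain zeros, so this simple version is not strictly positive; producing strictly positive (and planar, and binary) reductions is exactly where the paper's harder results lie. The factor f below is an arbitrary invented example.

```python
import itertools

def f(x0, x1, x2):
    # an arbitrary strictly positive 3-wise factor over binary variables
    return 1.0 + x0 + 2 * x1 + 4 * x2 + x0 * x1 * x2

def Z_original():
    # partition function of the model with the single 3-wise factor
    return sum(f(*x) for x in itertools.product((0, 1), repeat=3))

def Z_reduced():
    # auxiliary variable a ranges over the 8 joint states of (x0, x1, x2);
    # a unary factor on a carries f's value, and a pairwise indicator
    # factor on (a, x_i) forces each x_i to agree with the i-th bit of a.
    total = 0.0
    for a in range(8):
        bits = ((a >> 0) & 1, (a >> 1) & 1, (a >> 2) & 1)
        for x in itertools.product((0, 1), repeat=3):
            w = f(*bits)                               # unary factor on a
            for i in range(3):
                w *= 1.0 if x[i] == bits[i] else 0.0   # pairwise (a, x_i)
            total += w
    return total
```

For each value of a, exactly one assignment of (x0, x1, x2) survives the indicators, so the reduced model sums to the same partition function as the original.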


FH Eaton. Guided inference: a protocol for teaching approximations.

Essentially a chapter from my PhD thesis, which unfortunately was never accepted by any machine learning conference (I submitted it to five of them), so I am releasing it here as a preprint.

Introduces a simple framework called “guided inference” for exchanging information between two approximate inference algorithms, to model a kind of “learning to do inference”. Explores this framework in a toy setting, where the “teacher” is exact and the “student” is in some sense optimal. Shows empirically that the conditional game (below) is close to the best and most efficient method for choosing example states to transmit to the student.


FH Eaton. Combining Approximations for Inference (PhD thesis)

My PhD thesis, which explores the approximate inference problem from a functional perspective, asking what are the ways of combining two approximations and how these could be used to build new algorithms.


FH Eaton. A conditional game for comparing approximations. In Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics (AISTATS 2011). Notable Paper Award

Defines a game which can be used to compare the accuracy of two approximations to the marginal probabilities of a statistical model. Apparently this is the first method that anyone has proposed for making such comparisons. I was honored to be able to present the paper at AISTATS 2011.


FH Eaton, Z Ghahramani. Choosing a Variable to Clamp: Approximate Inference Using Conditioned Belief Propagation. In Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics (AISTATS 2009)

Proposes an algorithm for applying Belief Propagation to a model using divide-and-conquer, by recursively conditioning on specific variables. This algorithm can be seen as an approximate version of cutset conditioning. The paper also introduces “BBP” or “Back Belief Propagation”, an application of back-propagation to belief propagation, which is used for the purpose of choosing a variable on which to condition.
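The identity behind conditioning is simple: the marginal of x_j is a mixture of conditioned marginals, weighted by the probability of each clamped value. In this illustrative sketch brute-force enumeration stands in for the conditioned Belief Propagation runs of the paper, and the small model is invented for the example:

```python
import itertools

# A small invented unnormalized model over three binary variables.
def weight(x):
    x0, x1, x2 = x
    return (1.0 + 2 * x0) * (1.0 + x0 * x1) * (2.0 - x1 * x2) * (1.0 + x2)

states = list(itertools.product((0, 1), repeat=3))
Z = sum(weight(x) for x in states)

def marginal(j):
    # direct marginal p(x_j), for comparison
    return [sum(weight(x) for x in states if x[j] == v) / Z for v in (0, 1)]

def marginal_by_clamping(j, i=0):
    # run "inference" once per clamped value c of x_i, then mix the
    # conditioned marginals p(x_j | x_i = c) with weights p(x_i = c)
    result = [0.0, 0.0]
    for c in (0, 1):
        sub = [x for x in states if x[i] == c]
        Zc = sum(weight(x) for x in sub)          # = p(x_i = c) * Z
        for v in (0, 1):
            cond = sum(weight(x) for x in sub if x[j] == v) / Zc
            result[v] += (Zc / Z) * cond
    return result
```

With exact sub-inference the mixture recovers the marginal exactly; when approximate BP replaces the sub-inference, clamping a well-chosen variable (the job of BBP in the paper) tends to improve the approximation.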

After publishing this paper, I discovered that the new “Back Belief Propagation” algorithm, which my paper derives by applying back-propagation (i.e. reverse-mode automatic differentiation) to belief propagation, is essentially the same algorithm that Welling and Teh derived in 2004 in Linear Response Algorithms for Approximate Inference in Graphical Models, where forward-mode automatic differentiation is applied to belief propagation. The equivalence comes from the fact that the Hessian of the Bethe free energy is symmetric, and is described in more detail here.
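The symmetry fact itself is easy to illustrate numerically: when a vector-valued map is the gradient of a scalar function (as the relevant sensitivities relate to the Bethe free energy), its Jacobian is a Hessian and hence symmetric, so a forward-mode column equals the corresponding reverse-mode row. This toy check uses an arbitrary invented scalar function and finite differences as a stand-in for automatic differentiation:

```python
def g(t):
    # arbitrary smooth scalar function of two parameters
    return t[0] ** 3 * t[1] + t[1] ** 2

def grad_g(t, h=1e-5):
    # central-difference gradient of g (stand-in for reverse-mode AD)
    return [(g([t[0] + h, t[1]]) - g([t[0] - h, t[1]])) / (2 * h),
            (g([t[0], t[1] + h]) - g([t[0], t[1] - h])) / (2 * h)]

def jacobian(F, t, h=1e-4):
    # column-by-column Jacobian of a vector map F (stand-in for
    # forward-mode AD: each column is one directional derivative)
    n = len(t)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        tp, tm = list(t), list(t)
        tp[j] += h
        tm[j] -= h
        Fp, Fm = F(tp), F(tm)
        for i in range(n):
            J[i][j] = (Fp[i] - Fm[i]) / (2 * h)
    return J
```

Because F = grad g, the computed Jacobian is the Hessian of g, and its off-diagonal entries agree; this is the structural reason the forward-mode and reverse-mode derivations of the linear-response quantities coincide.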


FH Eaton. Statically typed linear algebra in Haskell. In Proceedings of the 2006 ACM SIGPLAN workshop on Haskell (Haskell '06).

This conference abstract explores the possibility of creating a statically typed linear algebra library in the programming language Haskell, that is, a library for numerical mathematics in which the conformability of operand dimensions for operations like matrix multiplication is checked at compile time rather than at runtime.

I have been interested in programming languages for a long time, and I made this small contribution before starting my PhD.