CRACS Seminar by Jesse Davis (KU Leuven)

09 February 2012

Recent Advances in Markov Network Structure Learning

A Markov network is an undirected graphical model for compactly representing a joint probability distribution over a set of random variables. However, learning the structure of a Markov network from data is extremely difficult. When learning structure, scoring each candidate structure requires first learning its optimal weights. Weight learning cannot be done in closed form, and requires inference as a subroutine. In turn, inference is intractable.
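To make the chain of difficulty concrete, here is a minimal sketch (not from the talk; all names and features are illustrative) of a log-linear Markov network over binary variables. The probability of an assignment is proportional to exp(sum of weighted features), and normalizing requires summing over all 2^n assignments; that exponential cost is why inference, and hence weight learning, and hence structure scoring, is intractable in general.

```python
import itertools
import math

def unnormalized_prob(assignment, features, weights):
    """exp(sum_i w_i * f_i(x)) for one assignment x (not yet normalized)."""
    return math.exp(sum(w * f(assignment) for f, w in zip(features, weights)))

def partition_function(n_vars, features, weights):
    """Brute-force normalizer Z: sums over all 2^n assignments.
    Illustrates why exact inference blows up as n grows."""
    return sum(unnormalized_prob(x, features, weights)
               for x in itertools.product([0, 1], repeat=n_vars))

# Two toy features over three binary variables (hypothetical example).
features = [lambda x: x[0] == x[1],                 # x0 and x1 agree
            lambda x: x[1] == 1 and x[2] == 1]      # x1 and x2 both true
weights = [1.5, 0.8]

Z = partition_function(3, features, weights)
p = unnormalized_prob((1, 1, 1), features, weights) / Z  # a valid probability
```

With only three variables the sum over assignments is trivial, but for a network over hundreds of variables the same sum has astronomically many terms, which is why practical weight learning must approximate it.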

In this talk, I will review three classes of algorithms that address this task. The first approach is to treat this task as a global search problem. This is the traditional tactic for learning, but these algorithms are slow as they require running the expensive operation of weight learning many times.

The second, more recent, approach involves learning a set of local models and then combining them into a global model. These algorithms offer improved computational efficiency because they perform weight learning only once, but it can still be expensive to learn the local models for datasets that contain a large number of variables and/or examples.

Finally, the third and newest approach views Markov network structure learning as a feature generation problem. This style of algorithm combines some of the benefits of the two other approaches. First, it works in a data-driven, bottom-up fashion to quickly generate a large set of candidate features. Second, it only needs to perform weight learning once. I will present results from a large empirical study that details both the accuracy and run time characteristics of these approaches.
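The data-driven, bottom-up flavor can be sketched as follows. This is a hypothetical toy, not any specific algorithm from the talk: it proposes candidate features as conjunctions of variable-value pairs that actually co-occur in the training examples, keeping only those seen often enough. Real feature-generation learners are far more sophisticated, but the key idea of reading candidates off the data rather than searching top-down is the same.

```python
from collections import Counter
from itertools import combinations

def generate_candidates(examples, max_size=2, min_count=2):
    """Propose conjunctive features (tuples of (variable, value) pairs)
    that appear in at least min_count training examples."""
    counts = Counter()
    for ex in examples:
        items = tuple(sorted(ex.items()))  # canonical order for dedup
        for size in range(1, max_size + 1):
            for combo in combinations(items, size):
                counts[combo] += 1
    return [combo for combo, n in counts.items() if n >= min_count]

# Toy dataset: three examples over binary variables A, B, C.
examples = [{"A": 1, "B": 1, "C": 0},
            {"A": 1, "B": 1, "C": 1},
            {"A": 0, "B": 0, "C": 1}]
candidates = generate_candidates(examples)  # e.g. includes ((A,1),(B,1))
```

Because the candidates are enumerated directly from the data in one pass, the expensive weight-learning step can then be run a single time over the whole candidate set, which is the efficiency advantage the paragraph above describes.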

About Jesse Davis:

Jesse Davis joined K.U. Leuven in October 2010, where he is a lecturer and a member of the Machine Learning group. Jesse did a post-doc with Pedro Domingos at the University of Washington, where he worked on Markov logic networks, which are publicly available as the Alchemy system. Jesse was also involved with their recent MURI award entitled A Unified Approach to Abductive Inference. Jesse did his Ph.D. at the University of Wisconsin-Madison under the supervision of David Page.