## Seminar: NIPS preview talks

| Speaker | Alfredo Kalaitzis, Bernardino Romera-Paredes, Dino Sejdinovic |
|---|---|
| Affiliation | UCL |
| Date | Friday, 29 Nov 2013 |
| Time | 13:00 - 14:00 |
| Location | Malet Place Engineering 1.03 |
| Event series | DeepMind CSML Seminar Series |

Description |

An opportunity for those with papers accepted at NIPS to practice their talks, and for all those not going to get a preview! Talks will be in NIPS format: 17 minutes plus 3 minutes for questions.

**Talk 1:** Alfredo Kalaitzis (with Ricardo Silva), *Flexible sampling of discrete data correlations without the marginal distributions*

Learning the joint dependence of discrete variables is a fundamental problem in machine learning, with many applications including prediction, clustering and dimensionality reduction. More recently, the framework of copula modeling has gained popularity due to its modular parametrization of joint distributions. Among other properties, copulas provide a recipe for combining flexible models for univariate marginal distributions with parametric families suitable for potentially high-dimensional dependence structures. More radically, the extended rank likelihood approach of Hoff (2007) bypasses learning marginal models completely when such information is ancillary to the learning task at hand, as in, e.g., standard dimensionality reduction problems or copula parameter estimation. The main idea is to represent data by their observable rank statistics, ignoring any other information from the marginals. Inference is typically done in a Bayesian framework with Gaussian copulas, and it is complicated by the fact that this implies sampling within a space where the number of constraints increases quadratically with the number of data points. The result is slow mixing when using off-the-shelf Gibbs sampling. We present an efficient algorithm based on recent advances in constrained Hamiltonian Markov chain Monte Carlo that is simple to implement and does not require paying a quadratic cost in sample size.

Slides for the talk: PDF

**Talk 2:** Bernardino Romera-Paredes (with Massi Pontil), *A New Convex Relaxation for Tensor Completion*

Tensors can be successfully employed to model the relationships between more than two entities, such as users, products, aspects, and time. Because of this, tensor completion has recently received a lot of interest in several fields, such as computer vision, recommendation systems and natural language processing, as the natural extension of matrix completion. A prominent methodology for matrix completion is low-rank matrix learning by way of trace norm regularization. A generalization of this framework to tensor completion has been studied in several recent works. In this talk, I will highlight some limitations of this approach and propose an alternative convex relaxation on the Euclidean ball. I will then describe a technique to solve the associated regularization problem, which builds upon the alternating direction method of multipliers. Experiments on one synthetic dataset and two real datasets indicate that the proposed method improves significantly over tensor trace norm regularization in terms of estimation error, while remaining computationally tractable.

Slides for the talk: PDF

**Talk 3:** Dino Sejdinovic

We introduce kernel nonparametric tests for Lancaster three-variable interaction and for total independence, using embeddings of signed measures into a reproducing kernel Hilbert space. The resulting test statistics are straightforward to compute, and are used in powerful interaction tests, which are consistent against all alternatives for a large family of reproducing kernels. We show the Lancaster test to be sensitive to cases where two independent causes individually have weak influence on a third dependent variable, but their combined effect has a strong influence. This makes the Lancaster test especially suited to finding structure in directed graphical models, where it outperforms competing nonparametric tests in detecting such V-structures.

Slides for the talk: PDF
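To make the rank-statistics idea behind Talk 1 concrete, here is a minimal sketch of how discarding the marginals works in practice: each margin is reduced to its ranks and mapped to Gaussian "normal scores", from which a Gaussian-copula correlation can be estimated. This is only a simple plug-in point estimate for illustration, not the Bayesian constrained-HMC sampler the talk describes; the function names are hypothetical.

```python
import numpy as np
from statistics import NormalDist

def normal_scores(x):
    """Map a continuous sample to Gaussian 'normal scores' using only
    its ranks -- the marginal distribution of x is discarded, which is
    the idea behind rank-based copula inference (assumes no ties)."""
    n = len(x)
    ranks = x.argsort().argsort() + 1            # rank of each observation
    nd = NormalDist()
    return np.array([nd.inv_cdf(r / (n + 1)) for r in ranks])

def rank_copula_corr(X):
    """Plug-in estimate of a Gaussian-copula correlation matrix from
    rank statistics alone (a simple point estimate for illustration,
    not the talk's constrained-HMC Bayesian inference)."""
    Z = np.column_stack([normal_scores(col) for col in X.T])
    return np.corrcoef(Z, rowvar=False)

# Heavy-tailed, non-Gaussian margins hide a latent correlation of 0.8;
# because only ranks are used, the estimate is unaffected by the margins.
rng = np.random.default_rng(0)
latent = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=2000)
X = np.column_stack([np.exp(latent[:, 0]), latent[:, 1] ** 3])
R = rank_copula_corr(X)
```

Any monotone transformation of a margin leaves the ranks, and hence the estimate, unchanged.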
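For Talk 2, the matrix-case building block of trace-norm regularization may help set the scene: singular value thresholding is the proximal operator of the trace (nuclear) norm, applied repeatedly inside ADMM-style solvers for completion problems. This sketch covers only the standard matrix case, not the talk's new convex relaxation for tensors; the function name is hypothetical.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the matrix
    trace (nuclear) norm. ADMM-style solvers for trace-norm-regularised
    matrix completion apply this step repeatedly; the talk proposes a
    different relaxation for the tensor case."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# A rank-1 matrix has a single nonzero singular value, so thresholding
# simply shrinks the whole matrix towards zero.
M = np.outer([1.0, 2.0], [3.0, 4.0])
M_shrunk = svt(M, 1.0)
```

Thresholding with a large enough `tau` zeroes the matrix entirely, which is why the trace norm promotes low-rank solutions.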
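The Lancaster statistic in Talk 3 generalises two-variable kernel dependence measures built from RKHS embeddings. As a hedged illustration of that underlying idea, here is the classical biased HSIC statistic with Gaussian kernels; it is the pairwise measure, not the three-variable Lancaster interaction statistic itself, and the function names are hypothetical.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    """Gaussian (RBF) kernel Gram matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased HSIC statistic: the squared RKHS norm of an empirical
    embedding of P_xy - P_x P_y. A two-variable kernel dependence
    measure; the Lancaster statistic of the talk extends the idea
    to three-variable interactions."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    K, L = rbf_gram(x, sigma), rbf_gram(y, sigma)
    return np.trace(K @ H @ L @ H) / n ** 2

# A dependent pair should score well above an independent pair.
rng = np.random.default_rng(1)
x = rng.normal(size=300)
y_indep = rng.normal(size=300)
y_dep = x + 0.1 * rng.normal(size=300)
```

In a real test the statistic would be compared against a null distribution (e.g. via permutation), which is also how the consistency guarantees in the talk are used in practice.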

iCalendar | csml_id_156.ics |