Seminar: Learning with entropy-regularized optimal transport

Speaker: Aude Genevay
Affiliation: MIT
Date: Thursday, 27 Feb 2020
Time: 13:00 - 14:00
Location: Roberts 106
Event series: DeepMind CSML Seminar Series
Description

Abstract: Entropy-regularized optimal transport (EOT) was first introduced by Cuturi in 2013 as a way to ease the computational burden of OT in machine learning problems. In this talk, after studying the properties of EOT, we will introduce a new family of losses between probability measures called Sinkhorn Divergences. Built on EOT, this family of losses interpolates between OT (no regularization) and maximum mean discrepancy (MMD, infinite regularization). We will illustrate these theoretical claims on a set of learning problems formulated as minimizations over the space of measures.
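To make the interpolation concrete, the following is a minimal numerical sketch (not code from the talk) of the entropy-regularized OT cost and the debiased Sinkhorn Divergence S_eps(x, y) = OT_eps(x, y) - 1/2 OT_eps(x, x) - 1/2 OT_eps(y, y). The function names, the squared-Euclidean ground cost, the uniform-weight empirical measures, and the use of the transport cost <P, C> as the value of OT_eps are illustrative assumptions; for small eps the divergence behaves like unregularized OT, while for large eps it approaches an MMD-type loss.

```python
import numpy as np
from scipy.special import logsumexp


def sinkhorn_cost(x, y, eps, n_iters=200):
    """Entropy-regularized OT cost between two uniform empirical measures."""
    # Squared-Euclidean ground cost (an assumed choice of ground metric).
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    n, m = C.shape
    log_a = np.full(n, -np.log(n))  # log of uniform weights on x
    log_b = np.full(m, -np.log(m))  # log of uniform weights on y
    f, g = np.zeros(n), np.zeros(m)
    for _ in range(n_iters):
        # Log-domain Sinkhorn updates on the dual potentials f and g.
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # Entropic transport plan P, then the transport cost <P, C>.
    log_P = (f[:, None] + g[None, :] - C) / eps + log_a[:, None] + log_b[None, :]
    return np.sum(np.exp(log_P) * C)


def sinkhorn_divergence(x, y, eps):
    """Debiased loss: S_eps(x,y) = OT_eps(x,y) - 0.5*OT_eps(x,x) - 0.5*OT_eps(y,y)."""
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(50, 2))
    y = rng.normal(loc=1.0, size=(60, 2))
    # Small eps behaves like unregularized OT; large eps like an MMD-type loss.
    for eps in (0.05, 0.5, 5.0):
        print(f"eps={eps}: S_eps = {sinkhorn_divergence(x, y, eps):.4f}")
```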

Bio: Aude Genevay is a postdoctoral researcher in the Geometric Data Processing group at MIT, working with Justin Solomon. Prior to that, she obtained a PhD in Mathematics from Ecole Normale Supérieure under the supervision of Gabriel Peyré.
