Master Class: Lecture 2

Speaker: Martin Wainwright
Affiliation: UC Berkeley
Date: Tuesday, 26 Feb 2013
Time: 13:00 - 14:00
Location: Roberts G06 Sir Ambrose Fleming LT
Event series: Master Class: Martin Wainwright (25 Feb - 1 Mar 2013)

High-dimensional statistics deals with problems in which the number of samples n is of the same order as, or substantially smaller than, the ambient dimension p of the data. The study of such "large p, small n" problems dates back to the work of Kolmogorov and has been the subject of intensive study over the past several decades. Of course, these problems are ill-posed without further restrictions, and so a large body of research has focused on models that are endowed with some form of low-dimensional structure. Examples include vectors that are sparse (with relatively few non-zero entries), matrices that are sparse and/or low-rank, and regression functions that are defined on manifolds.

In these lectures, we survey certain aspects of this rapidly evolving field, beginning with sparse vector estimation and its applications to graphical model selection, before moving on to more general high-dimensional M-estimators, and concluding with a look at high-dimensional non-parametrics.

Lecture 2: A general framework for high-dimensional $M$-estimators: Many statistical methods are based on minimizing the combination of a loss function with a regularizer, and are known as regularized $M$-estimators. In this lecture, we build on the ideas from the first lecture to develop a more general framework for understanding such $M$-estimators in high dimensions. Two properties turn out to be key: restricted strong convexity of the loss, and decomposability of the regularizer. We show how this general theory recovers, as corollaries, minimax-optimal rates for estimating sparse matrices, structured matrices, and low-rank matrices.
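As a concrete illustration of a regularized $M$-estimator in the "large p, small n" regime, the sketch below (not from the lecture itself; a minimal assumed example) solves the Lasso, i.e. least-squares loss plus the decomposable $\ell_1$ regularizer, using proximal gradient descent (ISTA). The problem sizes and the function name `lasso_ista` are illustrative choices, not part of the lecture material.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize (1/2n)||y - X theta||^2 + lam * ||theta||_1 via ISTA."""
    n, p = X.shape
    # Step size 1/L, where L bounds the Lipschitz constant of the gradient
    step = n / np.linalg.norm(X, 2) ** 2
    theta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ theta - y) / n   # gradient of the smooth loss
        theta = soft_threshold(theta - step * grad, step * lam)
    return theta

# "large p, small n" setup: n = 50 samples, p = 200 dimensions, 5 nonzeros
rng = np.random.default_rng(0)
n, p, s = 50, 200, 5
X = rng.standard_normal((n, p))
theta_star = np.zeros(p)
theta_star[:s] = 1.0
y = X @ theta_star + 0.1 * rng.standard_normal(n)
theta_hat = lasso_ista(X, y, lam=0.1)
```

Despite p being four times larger than n, the sparsity of the true parameter lets the $\ell_1$-regularized estimate land close to it; this is the kind of behavior the restricted-strong-convexity/decomposability framework explains in general.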

Slides: Wainwright_Lecture2.pdf
