Seminar: Entropy-Relevant Dimensions in Kernel Feature Space

Speaker: Robert Jenssen
Affiliation: University of Tromsø, Norway
Date: Friday, 05 Apr 2013
Time: 12:30 - 14:00
Location: Malet Place Eng 1.03
Event series: DeepMind CSML Seminar Series

The non-linear mapping to feature space is a central concept in kernel-based machine learning for signal processing, within the framework of positive semi-definite (psd) kernels. Given labelled data, algorithms such as support vector machines, or projection methods such as Fisher discriminant analysis, may be executed in feature space. For unsupervised dimensionality reduction in feature space, the most common approach is to perform principal component analysis (PCA) in that space. This maximally captures the variability of the feature space data, but does not necessarily capture any cluster structure. In this talk, the theory behind the feature space mapping is considered, and recent advances are reviewed that broaden the understanding and interpretability of the mapping in terms of a key input space quantity, the quadratic Rényi entropy of the data, via the eigenvalues and eigenfunctions of a psd convolution operator. Focusing on the unsupervised case, the identification of entropy-relevant dimensions in feature space is described. Recent results showing that these dimensions capture cluster structure in the data are reviewed, and it is shown that they are in general different from the kernel PCA dimensions. Differences between the two approaches to dimensionality reduction for visualization and clustering are illustrated.
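As a rough illustration of the distinction drawn in the abstract, the following sketch contrasts kernel PCA's ranking of kernel-matrix eigenpairs (by eigenvalue alone) with an entropy-based ranking. It relies on the standard sample estimate of the quadratic Rényi entropy, H2 = -log((1/N²) 1ᵀK1), which decomposes over the eigenpairs of K as (1/N²) 1ᵀK1 = Σ_k λ_k (1ᵀu_k / N)², so each eigenpair's entropy contribution is λ_k (1ᵀu_k / N)². The function names and the RBF kernel choice are illustrative, not taken from the talk or the paper.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def entropy_relevant_projection(K, n_dims=2):
    """Project onto the feature-space axes that contribute most to the
    quadratic Renyi entropy estimate -log((1/N^2) 1^T K 1).

    Returns the N x n_dims projections and the indices of the chosen
    eigenpairs (in the eigenvalue-descending ordering).
    """
    N = K.shape[0]
    # eigh: K is symmetric psd; eigenvalues come back in ascending order
    lam, U = np.linalg.eigh(K)
    lam, U = lam[::-1], U[:, ::-1]          # descending, as in kernel PCA
    # Entropy contribution of eigenpair k: lam_k * (1^T u_k / N)^2.
    # Kernel PCA would instead rank by lam_k alone.
    contrib = lam * (U.sum(axis=0) / N) ** 2
    idx = np.argsort(contrib)[::-1][:n_dims]
    # Scale columns by sqrt(lam) to get feature-space projections
    Z = U[:, idx] * np.sqrt(np.maximum(lam[idx], 0.0))
    return Z, idx

# Two well-separated clusters: the entropy ranking need not coincide
# with the top-variance (kernel PCA) ranking.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(6, 1, (20, 2))])
K = rbf_kernel(X, sigma=1.0)
Z, idx = entropy_relevant_projection(K, n_dims=2)
```

Because the contributions λ_k (1ᵀu_k / N)² sum exactly to the mean of K, they give a complete additive decomposition of the entropy estimate; selecting the top-contributing axes therefore preserves as much of the entropy as possible for a given number of dimensions, whereas kernel PCA preserves variance.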

The talk is based on the paper R. Jenssen, "Entropy-Relevant Dimensions in Kernel Feature Space," to appear in the IEEE Signal Processing Magazine, special issue on advances in kernel-based learning for signal processing, July 2013.
