Good Margins Make Good Neighbors
Speaker: Aryeh Kontorovich, Ben-Gurion University of the Negev
Department: Electrical Engineering
Location: Engineering Quadrangle B205
Date/Time: Thursday, September 18, 2014, 4:30 p.m. - 5:30 p.m.
Although well known by practitioners to be an effective classification tool, nearest-neighbor methods have been somewhat neglected by learning theory of late. The goal of this talk is to revive interest in this time-tested technique by recasting it in a modern perspective. We will present a paradigm of margin-regularized 1-nearest neighbor classification which: (i) is Bayes-consistent; (ii) yields simple, usable finite-sample error bounds; (iii) provides for very efficient algorithms with a principled speed-accuracy tradeoff; (iv) allows for near-optimal sample compression. Further extensions include multiclass, regression, and metric dimensionality reduction. I will argue that the regularized 1-nearest neighbor is superior to k-nearest neighbors in several crucial statistical and computational aspects.
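For readers unfamiliar with the baseline method, the following is a minimal sketch of plain 1-nearest-neighbor classification under the Euclidean metric; the margin-regularized, sample-compressed variant presented in the talk is considerably more sophisticated, and all names below are illustrative.

```python
import numpy as np

def nn_predict(X_train, y_train, X_test):
    """Label each test point with the label of its nearest training
    point under the Euclidean metric (plain 1-NN, no regularization)."""
    preds = []
    for x in X_test:
        # Distances from the test point to every training point
        dists = np.linalg.norm(X_train - x, axis=1)
        # Take the label of the closest training point
        preds.append(y_train[np.argmin(dists)])
    return np.array(preds)

# Toy example: two well-separated clusters in the plane.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0.2, 0.1], [4.8, 5.1]])
print(nn_predict(X_train, y_train, X_test))  # -> [0 1]
```

The large margin between the two clusters is what a margin-regularized approach exploits: a few well-placed training points suffice to represent the classifier, which is the intuition behind the sample-compression bounds mentioned in the abstract.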
Based on a series of joint works with Lee-Ad Gottlieb, Robert Krauthgamer, and Roi Weiss.