Statistics Colloquium: Pragya Sur (Harvard University)

Date: 

Monday, September 14, 2020, 10:30am to 11:30am

Location: 

Zoom - please contact emilie_campanelli@fas.harvard.edu for more information

Title:

A precise high-dimensional theory for Boosting

Abstract:

This talk will introduce a precise high-dimensional asymptotic theory for boosting on separable data, taking both statistical and computational perspectives. We will consider the common modern setting where the number of features p and the sample size n are both large and comparable, and in particular, look at scenarios where the data is separable in an asymptotic sense. Under a class of statistical models, we will provide an (asymptotically) exact analysis of the generalization error of AdaBoost, when the algorithm interpolates the training data and maximizes an empirical L1 margin. The relation between the boosting test error and the optimal Bayes error can be explicitly pinned down using our theory. On the computational front, we provide a sharp analysis of the stopping time when boosting approximately maximizes the empirical L1 margin. Our theory provides several insights into properties of boosting; for instance, the larger the dimensionality ratio p/n, the faster the optimization reaches interpolation. At the heart of our theory lies an in-depth study of the maximum L1 margin, which can be accurately described by a new system of non-linear equations; we analyze this margin and the properties of this system, using Gaussian comparison techniques and a novel uniform deviation argument. Time permitting, I will present a new class of boosting algorithms that correspond to Lq geometry, for q > 1, together with results on their high-dimensional generalization and optimization behavior. This is based on joint work with Tengyuan Liang.
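For reference, the maximum L1 margin at the center of the analysis has a standard definition: writing \( (x_i, y_i) \) for the n training pairs, it is \( \kappa_{n,p} = \max_{\|\theta\|_1 \le 1} \; \min_{1 \le i \le n} \; y_i \langle x_i, \theta \rangle \), and the data are separable in this sense precisely when \( \kappa_{n,p} > 0 \). The sketch below is a minimal illustration of the algorithmic setting: epsilon-step AdaBoost, viewed as greedy coordinate descent on the exponential loss, run past interpolation on synthetic separable data with p > n. The Gaussian data model, step size, and iteration count are illustrative assumptions, not the setup analyzed in the talk.

import numpy as np

# Minimal sketch: epsilon-step AdaBoost as greedy coordinate descent on the
# exponential loss, on separable synthetic data with p > n. The data model
# and hyperparameters are illustrative assumptions only.
rng = np.random.default_rng(0)
n, p = 100, 300                           # overparametrized regime, p/n = 3
X = rng.standard_normal((n, p))
y = np.sign(X @ rng.standard_normal(p))   # noiseless linear labels; separable a.s. since p > n

def boost(X, y, step=0.1, T=10000):
    """Greedy coordinate descent on sum_i exp(-y_i <x_i, theta>)."""
    theta = np.zeros(X.shape[1])
    for _ in range(T):
        m = y * (X @ theta)               # unnormalized margins
        w = np.exp(-(m - m.min()))        # exp-loss weights, shifted for numerical stability
        w /= w.sum()
        corr = X.T @ (w * y)              # weighted correlation of each coordinate
        j = np.argmax(np.abs(corr))       # best "weak learner" = best coordinate
        theta[j] += step * np.sign(corr[j])
    return theta

theta = boost(X, y)
print("normalized L1 margin:", (y * (X @ theta)).min() / np.abs(theta).sum())
# A positive value means the training data are interpolated (zero training
# error); with a small step size, the normalized margin of this iterate
# approaches the maximum L1 margin as iterations continue.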