Probabilitas Seminar Series: Yuejie Chi

Date: 

Friday, March 29, 2024, 10:30am to 11:30am

Location: 

Science Center 316

The Probabilitas Seminar series focuses on high-dimensional problems that combine statistics, probability, information theory, computer science, and other related fields. The upcoming seminar takes place on Friday, March 29, from 10:30-11:30am EDT. This week's guest will be Yuejie Chi of the Department of Electrical and Computer Engineering at Carnegie Mellon University.
Title: Solving Inverse Problems with Generative Priors: From Low-rank to Diffusion Models

Abstract: Generative priors are effective countermeasures against the curse of dimensionality in data science, enabling efficient learning and inversion in problems that are otherwise ill-posed. This talk begins with the classical low-rank prior and introduces scaled gradient descent (ScaledGD), a simple iterative approach that directly recovers the low-rank factors for a wide range of matrix and tensor estimation tasks. ScaledGD provably converges linearly at a constant rate independent of the condition number, at near-optimal sample complexities, while maintaining the low per-iteration cost of vanilla gradient descent, even when the rank is overspecified and the initialization is random. Going beyond low rank, the talk discusses diffusion models as an expressive data prior for inverse problems, and introduces a plug-and-play method (Diffusion PnP) that alternately calls two samplers: a data-dependent denoising diffusion sampler based solely on the score functions of the data, and a data-independent sampler based solely on the forward model. Performance guarantees and numerical examples will be presented to illustrate the promise of these approaches.
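For readers unfamiliar with ScaledGD, the core idea can be sketched in a few lines for full-observation low-rank matrix factorization: each factor takes a gradient step preconditioned by the Gram matrix of the other factor, which is what makes the convergence rate independent of the condition number. The problem sizes, step size, and perturbed spectral-style initialization below are illustrative choices for this sketch, not parameters from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 30, 2

# Build an exactly rank-r matrix with condition number 100.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
sigma = np.array([100.0, 1.0])
M = U @ np.diag(sigma) @ V.T

# Initialize near the balanced factorization M = (U sqrt(S))(V sqrt(S))^T,
# mildly perturbed so the iterations have work to do (illustrative choice).
X = U @ np.diag(np.sqrt(sigma)) + 0.01 * rng.standard_normal((n, r))
Y = V @ np.diag(np.sqrt(sigma)) + 0.01 * rng.standard_normal((n, r))

eta = 0.5  # constant step size; no dependence on the condition number
for _ in range(200):
    E = X @ Y.T - M  # residual of the current factorization
    # Gradient steps, each preconditioned by the other factor's Gram matrix.
    X_new = X - eta * E @ Y @ np.linalg.inv(Y.T @ Y)
    Y_new = Y - eta * E.T @ X @ np.linalg.inv(X.T @ X)
    X, Y = X_new, Y_new

err = np.linalg.norm(X @ Y.T - M) / np.linalg.norm(M)
print(err)  # relative reconstruction error
```

Dropping the two `inv(...)` preconditioners recovers vanilla gradient descent, whose convergence degrades as the condition number (here 100) grows; the scaling restores a condition-number-free rate at essentially the same per-iteration cost.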
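The alternation structure behind the plug-and-play approach can be illustrated with a toy sparse-recovery loop. Here a soft-thresholding denoiser stands in for the data-dependent score-based diffusion sampler (the actual method uses learned score functions of the data), while the data-independent step is a plain gradient step toward consistency with the forward model. All sizes and parameters are illustrative assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 100, 60, 5

# Sparse ground truth and a random linear forward model y = A x.
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

def denoise(z, lam):
    # Stand-in prior step: soft-thresholding (a sparsity prior), playing
    # the role the score-based diffusion sampler plays in the talk.
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # stable gradient step size
for _ in range(1000):
    # Data-independent step: move toward consistency with the forward model.
    x = x - step * A.T @ (A @ x - y)
    # Data-dependent step: pull the iterate toward the prior.
    x = denoise(x, lam=0.01 * step)

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(err)  # relative recovery error
```

The appeal of the plug-and-play split is modularity: the prior step above could be swapped for any denoiser, including a diffusion sampler, without touching the forward-model step.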