CCSP Seminar: Yihong Wu, "Self-Regularizing Property of Nonparametric Maximum Likelihood Estimator"

Thursday, April 1, 2021
12:30 p.m.
Online Presentation
Zitan Chen
chenztan@umd.edu

Communication, Control and Signal Processing Seminar

Self-Regularizing Property of Nonparametric Maximum Likelihood Estimator in Mixture Models

Yihong Wu
Yale University

Zoom link: https://umd.zoom.us/s/94188812312

Abstract

Introduced by Kiefer and Wolfowitz (1956), the nonparametric maximum likelihood estimator (NPMLE) is a widely used methodology for learning mixture models and empirical Bayes estimation. Sidestepping the non-convexity of the mixture likelihood, the NPMLE estimates the mixing distribution by maximizing the total likelihood over the space of probability measures, which can be viewed as an extreme form of overparameterization.
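To make the idea concrete, here is a minimal sketch of one standard way to approximate the NPMLE in a Gaussian location mixture: restrict the mixing distribution to a fixed grid of candidate atoms and run EM over the atom weights (a common grid-based approximation for illustration only; the grid, the EM scheme, and all names below are assumptions, not the speakers' method).

```python
import numpy as np

def npmle_em(x, grid, sigma=1.0, n_iter=500):
    """Approximate the NPMLE mixing distribution for a Gaussian location
    mixture by EM over weights on a fixed grid of atoms (illustrative
    sketch; not the algorithm from the talk)."""
    # L[i, j] = Gaussian likelihood of sample x_i under atom grid_j
    L = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / sigma) ** 2)
    w = np.full(len(grid), 1.0 / len(grid))  # start from uniform weights
    for _ in range(n_iter):
        mix = L @ w                            # mixture density at each x_i
        w = w * (L.T @ (1.0 / mix)) / len(x)   # multiplicative EM update
    return w

rng = np.random.default_rng(0)
# Synthetic data: 2-component Gaussian mixture with unit noise
x = rng.choice([-2.0, 2.0], size=500) + rng.standard_normal(500)
grid = np.linspace(-4.0, 4.0, 81)
w = npmle_em(x, grid)
```

In line with the self-regularization phenomenon described in the abstract, the fitted weight vector `w` tends to concentrate its mass on a small number of grid atoms rather than spreading over all 81 candidates.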

In this work we discover a surprising property of the NPMLE solution. Consider, for example, a Gaussian mixture model on the real line with a subgaussian mixing distribution. Leveraging complex-analytic techniques, we show that with high probability the NPMLE based on a sample of size n has O(\log n) atoms (mass points), significantly improving the deterministic upper bound of n due to Lindsay (1983). Notably, any such Gaussian mixture is statistically indistinguishable from a finite one with O(\log n) components (and this is tight for certain mixtures). Thus, absent any explicit form of model selection, the NPMLE automatically chooses the right model complexity, a property we term self-regularization. Statistical applications and extensions to other exponential families will be given. Connections to rate-distortion functions will be briefly discussed.

This is based on joint work with Yury Polyanskiy (MIT): https://arxiv.org/abs/2008.08244

Audience: Graduate, Undergraduate, Faculty, Post-Docs

