IAI Colloquia Series: Piya Pal, "Sampling Beyond Nyquist"
Wednesday, October 1, 2014
4:00 p.m.
1146 AV Williams Bldg
http://www.isr.umd.edu/events/iai-colloquia
Intelligent Automation, Inc. Colloquia Series
Sampling Beyond Nyquist: Structure, Geometry and Statistical Information
Piya Pal
Assistant Professor
Electrical and Computer Engineering
Affiliate Faculty, Institute for Systems Research
Abstract
The Nyquist-Shannon sampling theorem has been the foundation of digital signal processing, dictating the rate at which an analog signal must be sampled so that it can be reconstructed from its samples without any loss of information. However, such sampling rates, based entirely on the frequency-domain characterization of signals, prove inadequate for modern signal processing tasks, which are mainly centered on processing high-dimensional data. Instead of looking merely at the frequency content, a more effective way to represent and process these signals is to exploit low-dimensional structure and prior information, which can lead to more efficient sampling strategies. In this talk, I will explore the role of sparsity, structure and statistical information in designing such sampling strategies.

In the first part of the talk, I will show how statistical information alone can lower sampling rates well below the Nyquist rate. We will consider sampling of wideband wide-sense stationary random processes, which arise in a wide range of applications such as spectrum sensing and source localization. New sampling strategies such as nested and coprime sampling will be shown to play pivotal roles in pushing sampling rates below Nyquist for these signals.

The second part of the talk will introduce additional low-dimensional structures on the signal of interest in the form of sparsity and low rank. In this case, it will be possible to reduce the sampling rate even further by incorporating statistical priors alongside low-dimensional representations of the data. Finally, we will consider quadratic sampling and compression of high-dimensional streaming data, and provide new guarantees on the compression and recovery of the structured covariance matrices characterizing such data streams.
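
The coprime sampling idea mentioned above can be illustrated with a small numerical sketch (assumed parameters, not material from the talk): two uniform samplers whose sample spacings are coprime multiples M and N of the Nyquist interval can, for a wide-sense stationary process, supply autocorrelation estimates at every lag up to MN, even though each sampler individually runs far below the Nyquist rate. The factors M = 4 and N = 5 below are arbitrary choices for the example.

import numpy as np

# Illustrative sketch (assumed parameters, not from the talk): coprime
# sampling with undersampling factors M and N. Two uniform samplers take
# samples at integer multiples of M and N Nyquist intervals. For a
# wide-sense stationary process, products x[M*n1] * conj(x[N*n2]) give
# autocorrelation estimates at lags M*n1 - N*n2, so second-order
# statistics can be recovered even though both samplers are sub-Nyquist.

M, N = 4, 5              # coprime undersampling factors (assumed)
L = M * N                # lag range of interest, in Nyquist intervals

# Sample indices (in Nyquist intervals) used by the two samplers
sampler_1 = np.arange(0, 2 * L, M)     # multiples of M
sampler_2 = np.arange(0, 2 * L, N)     # multiples of N

# Lags at which autocorrelation estimates are available
lags = np.unique(np.abs(sampler_1[:, None] - sampler_2[None, :]))
covered = lags[lags < L]

n_used = len(np.union1d(sampler_1, sampler_2))
print(f"{n_used} distinct sample indices out of {2 * L} Nyquist-rate slots")
print(f"autocorrelation lags covered in [0, {L}): {len(covered)} of {L}")
# With M and N coprime, the difference set covers every lag below M*N,
# which is why second-order (statistical) information survives the
# reduction in sampling rate.

With M = 4 and N = 5, the script reports 16 distinct sample indices out of 40 Nyquist-rate slots while still covering all 20 autocorrelation lags of interest, giving a flavor of the rate reduction discussed in the first part of the talk.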