Narayan receives NSF funding for shared information work
The research will develop shared information as a fundamental, quantifiable, and compact measure of interdependence among multiple correlated signals. It extends the spirit of Claude Shannon's celebrated and enormously consequential notion of mutual information, which measures the correlation between two random signals.
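For two discrete signals, Shannon's mutual information can be estimated directly from paired samples. The sketch below is a minimal plug-in estimator in Python (not code from the project itself); the function name and sample data are illustrative assumptions.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in bits from paired samples.

    Uses empirical marginal and joint frequencies:
    I(X; Y) = sum over (x, y) of p(x, y) * log2(p(x, y) / (p(x) * p(y))).
    """
    n = len(xs)
    px = Counter(xs)           # empirical marginal of X
    py = Counter(ys)           # empirical marginal of Y
    pxy = Counter(zip(xs, ys)) # empirical joint of (X, Y)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly correlated bits share one full bit of information:
print(mutual_information([0, 1, 0, 1], [0, 1, 0, 1]))  # → 1.0
# Independent bits share none:
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # → 0.0
```

Shared information generalizes this two-signal quantity to interdependence among many correlated signals at once.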
Narayan will investigate shared information both for its operational meaning in network information theory, with implications for related communication applications, and as a self-contained, compact, and computable figure of merit that can be optimized in learning applications where statistical correlation is of central interest.
His goal is to establish central theoretical and practical roles for shared information in network data compression, distributed function computation, reliable and secure information transmission in networks, signal cluster detection, and a new category of statistical estimation and learning algorithms. Engineering applications include communication and signal processing in a smart home, satellite image reconstruction, and messaging protocols in automated guided vehicles and drone swarms.
Rooted in information theory, the research will connect to algorithms in combinatorial graph theory (in theoretical computer science) and correlated multiarmed bandits (in learning). It aims to advance network information theory through new models and methods that highlight interactive communication among terminals, with the concept of shared information serving as a linchpin. Connections to important problems in combinatorial algorithms, made by way of shared information, offer interpretations that promise new understanding and solutions.
The estimation of shared information using correlated multiarmed bandits will introduce models, concepts, and algorithms in a fledgling but essential area of machine learning. The research will draw on methods from information theory, Markov random fields, combinatorial graph theory, and statistical inference. Outcomes will include interactive techniques for multiuser data compression and channel transmission, algorithms for combinatorial tree packing, methods for detecting clusters of correlated signals, and bandit algorithms for parameter estimation in correlated signals.
Published May 21, 2023