Information Theory and Distributed Computation

We are interested in analyzing the computational power of networks of nonlinear nodes in both the classical and quantum regimes. Interesting questions include how rates of computation scale with overall power consumption and with the size of a given network.

Applications of Information Theory

Optical circuits involving few photons provide a platform for power-efficient but unavoidably noisy computing. We work on the analysis and simulation of quantum optical circuits for information-processing tasks such as error correction for a communication channel. We have developed examples, like an optical low-density parity-check (LDPC) decoder, and descriptions of how to systematically turn anything, say a pizza, into a communication channel.
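As a concrete, purely classical illustration of parity-check decoding, here is a minimal sketch using the small Hamming(7,4) code as a stand-in; it is not the optical LDPC decoder mentioned above, which would use a large sparse parity-check matrix and iterative message passing.

```python
import numpy as np

# Parity-check matrix of the Hamming(7,4) code: column j is the binary
# representation of j + 1, so the syndrome of a single-bit error directly
# encodes the error position.  (Chosen only as a small, checkable example;
# a real LDPC code would be large, sparse, and decoded iteratively.)
H = np.array([
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
], dtype=np.uint8)


def syndrome_decode(received: np.ndarray) -> np.ndarray:
    """Correct at most one flipped bit using the syndrome of H."""
    syndrome = H.dot(received) % 2
    if syndrome.any():
        # Interpret the syndrome as a binary number to locate the error bit.
        position = int(syndrome[0]) * 4 + int(syndrome[1]) * 2 + int(syndrome[2]) - 1
        received = received.copy()
        received[position] ^= 1
    return received


# The all-zeros word is a codeword; flip one bit to emulate a noisy channel.
sent = np.zeros(7, dtype=np.uint8)
noisy = sent.copy()
noisy[4] ^= 1
print(syndrome_decode(noisy))   # recovers the all-zeros codeword
```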

Other work explores connections between information theory, thermodynamics, and population genetics. We have focused on finding the nearest time-reversible dynamical system to a given non-reversible one, analyzing large-deviation events under time reversal, and extending information-theoretic quantities such as mutual information and the Kullback-Leibler divergence to study the fluctuations of genetic drift in a population.
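To make the link between irreversibility and information-theoretic quantities concrete, here is a minimal sketch (an illustration using standard definitions, not a description of our method) that computes the entropy production rate of a finite Markov chain as the Kullback-Leibler divergence rate between forward and time-reversed transition flows; the rate vanishes exactly when the chain is time-reversible.

```python
import numpy as np

# Entropy production rate of a finite Markov chain, written as the
# Kullback-Leibler divergence rate between the statistics of forward
# transitions and their time-reversed counterparts.  The chain satisfies
# detailed balance (is time-reversible) exactly when this rate is zero.
# The 3-state transition matrix below is an arbitrary illustrative example.
P = np.array([
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
    [0.5, 0.3, 0.2],
])


def stationary_distribution(P: np.ndarray) -> np.ndarray:
    """Left eigenvector of P with eigenvalue 1, normalized to sum to one."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    return pi / pi.sum()


def entropy_production_rate(P: np.ndarray) -> float:
    """KL divergence rate between forward and time-reversed transition flows."""
    pi = stationary_distribution(P)
    forward = pi[:, None] * P          # probability flow i -> j
    backward = forward.T               # flow of the time-reversed transitions
    mask = forward > 0
    return float(np.sum(forward[mask] * np.log(forward[mask] / backward[mask])))


print(entropy_production_rate(P))      # > 0: this chain is not time-reversible
```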

We also work on data compression. Past and ongoing projects include human genome compression and compression of tabular text data, like online activity logs and genomic variants across a population.
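As a toy illustration of why column-aware layouts often help when compressing tabular text (the synthetic activity-log table below is invented for the comparison and is not one of our datasets or compressors), the following compares a general-purpose compressor on row-ordered versus column-ordered serializations of the same table.

```python
import random
import zlib

# Toy illustration: tabular text (e.g. activity logs or genomic variants)
# often compresses better column-by-column, since each column is locally
# homogeneous.  The synthetic table below is made up for the comparison.
random.seed(0)
rows = [
    (f"user{random.randint(0, 50):03d}",          # low-cardinality IDs
     f"2024-01-{random.randint(1, 28):02d}",      # repetitive dates
     random.choice(["click", "view", "purchase"]))
    for _ in range(10_000)
]

row_major = "\n".join(",".join(r) for r in rows).encode()
col_major = "\n".join(",".join(col) for col in zip(*rows)).encode()

print("row-major   :", len(zlib.compress(row_major)))
print("column-major:", len(zlib.compress(col_major)))
```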

Nonlinear Information Processing Capacity of Classical and Quantum Networks

We are interested in studying how given network topologies enable distributed computation at very low power. Networks of dissipative nonlinear quantum systems are an interesting model for asking whether the overall information-processing capacity per node differs between noisy classical and noisy quantum systems.
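One simple classical proxy for such a per-network capacity is the linear memory capacity of a noisy nonlinear recurrent network. The sketch below, an echo-state-style toy model with arbitrary sizes, gains, and noise levels rather than our model of dissipative quantum networks, estimates it by linearly reconstructing delayed inputs from the node states.

```python
import numpy as np

# Minimal classical sketch: estimate the linear memory capacity of a small
# noisy nonlinear recurrent network (an echo-state-style reservoir), as one
# crude proxy for per-network information processing capacity.  All sizes,
# gains, and noise levels are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n_nodes, T, washout = 50, 5000, 200

W_in = rng.normal(0, 0.5, size=n_nodes)
W = rng.normal(0, 1.0, size=(n_nodes, n_nodes))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep the dynamics stable

u = rng.uniform(-1, 1, size=T)                    # random input signal
x = np.zeros(n_nodes)
states = np.zeros((T, n_nodes))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t] + rng.normal(0, 0.01, size=n_nodes))
    states[t] = x                                  # noisy nonlinear node states

# Memory capacity: summed squared correlation between delayed inputs and
# their best linear reconstruction from the node states.
capacity = 0.0
for delay in range(1, 30):
    X, y = states[washout:], u[washout - delay:T - delay]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    capacity += np.corrcoef(X @ coef, y)[0, 1] ** 2

print(f"total memory capacity ~ {capacity:.2f} over {n_nodes} nodes")
```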