Entropy Vectors and Network Optimization

Friday, October 22, Giedt Hall 1003, 12:00 pm–1:00 pm

Prof. Babak Hassibi
California Institute of Technology
Host: Professor Anna Scaglione

Abstract:

We study network information theory problems through the notion of entropic vectors. Given n random variables, the (2^n - 1)-dimensional vector whose entries are the joint entropies of all non-empty subsets of the variables is called an entropic vector. The space of entropic vectors is a convex cone, and a large class of network information theory problems can be cast as linear optimization over this convex cone. While this formulation circumvents the "non-convex" and "infinite-letter" characterizations of earlier formulations, the challenge is that the convex cone of entropic vectors is not known for n > 4 random variables. In this talk, we develop inner and outer bounds on this space and describe connections to finite group theory, quasi-uniform distributions, non-Shannon inequalities, matroids, and Cayley's hyperdeterminant. As concrete examples, we show how determining optimal linear codes over GF(2) for arbitrary networks reduces to linear programming, and we develop distributed Markov chain Monte Carlo (MCMC) algorithms for optimal network code design over small alphabet sizes.
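
As an illustration of the definition only (not material from the talk), the following Python sketch computes the entropic vector of n random variables with a known joint distribution by evaluating the joint entropy of every non-empty subset of the variables. The function name and the XOR example distribution are illustrative assumptions.

    # Minimal sketch: entropic vector of n random variables with a known joint pmf.
    # Computes H(X_S) for every non-empty subset S, i.e. a (2^n - 1)-dimensional vector.
    import itertools
    import math

    def entropic_vector(joint_pmf, n):
        """Return {subset: H(X_subset)} for all non-empty subsets of {0, ..., n-1}.

        joint_pmf maps outcomes (tuples of length n) to probabilities.
        """
        vector = {}
        for k in range(1, n + 1):
            for subset in itertools.combinations(range(n), k):
                # Marginalize the joint pmf onto the variables in `subset`.
                marginal = {}
                for outcome, p in joint_pmf.items():
                    key = tuple(outcome[i] for i in subset)
                    marginal[key] = marginal.get(key, 0.0) + p
                # Shannon entropy (in bits) of the marginal distribution.
                vector[subset] = -sum(p * math.log2(p) for p in marginal.values() if p > 0)
        return vector

    if __name__ == "__main__":
        # Example (assumed): X0, X1 i.i.d. fair bits, X2 = X0 XOR X1.
        pmf = {}
        for x0, x1 in itertools.product([0, 1], repeat=2):
            pmf[(x0, x1, x0 ^ x1)] = 0.25
        for subset, h in entropic_vector(pmf, 3).items():
            print(subset, round(h, 3))

For this example, each single variable has entropy 1 bit while every pair and the full triple have entropy 2 bits, giving one concrete point in the cone of entropic vectors for n = 3.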

Biography:

Babak Hassibi is a professor and the executive officer of electrical engineering at the California Institute of Technology, where he has been since 2001. From 1998 to 2001 he was a member of the technical staff at the Mathematical Sciences Research Center at Bell Laboratories, Murray Hill, NJ; before that, he received his PhD in electrical engineering from Stanford University. His research interests span various aspects of communications, signal processing, and control. Among other awards, he is a recipient of the David and Lucile Packard Foundation Fellowship and the Presidential Early Career Award for Scientists and Engineers (PECASE).