TH2.R4.1

On the Relation Between the Common Information Dimension and Wyner Common Information

Osama Hanna, Xinlin Li, Suhas Diggavi, Christina Fragouli, University of California, Los Angeles, United States

Session:
Information Measures and Randomness

Track:
9: Information Measures

Location:
Omikron II

Presentation Time:
Thu, 11 Jul, 11:30 - 11:50

Session Chair:
Suhas Diggavi, UCLA

Abstract
In this paper, we are interested in the regime where the common information between two Gaussian random vectors (X, Y) can be (or can approach) infinity. We ask two main questions: (i) at what rate does the common information grow from a finite to an infinite number of bits as the dependency between the variables increases? and (ii) how well can we “approximately” simulate a pair of random variables (X, Y) with infinite common information using a finite number of shared bits? We analytically prove that the answer to both questions depends on the common information dimension d(X, Y) between X and Y, which we introduced in our recent work [1]. Our work characterizes the asymptotic behavior in closed form by building a connection to the singular values associated with the covariance matrix Σ of (X, Y). We conclude the paper with numerical evaluation results that indicate fast convergence to the asymptotic regime.
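To illustrate the regime the abstract describes, a minimal sketch (not from the paper itself) uses the known closed form for the Wyner common information of a bivariate Gaussian pair with correlation ρ, C(X; Y) = ½ ln((1 + ρ)/(1 − ρ)) nats, which diverges as ρ → 1, i.e., as the dependency between the variables increases:

```python
import math

def wyner_common_information(rho: float) -> float:
    """Wyner common information (in nats) of a bivariate Gaussian
    pair with correlation coefficient rho, |rho| < 1.

    Closed form: C = 0.5 * ln((1 + rho) / (1 - rho)).
    Diverges as rho -> 1, the infinite-common-information regime.
    """
    if not -1.0 < rho < 1.0:
        raise ValueError("correlation must satisfy |rho| < 1")
    return 0.5 * math.log((1.0 + rho) / (1.0 - rho))

# As rho approaches 1, the common information grows without bound.
for rho in (0.5, 0.9, 0.99, 0.999):
    print(f"rho = {rho:6.3f}  ->  C = {wyner_common_information(rho):.4f} nats")
```

The growth rate of C as ρ → 1 for scalar Gaussians is logarithmic in 1/(1 − ρ); the paper's contribution is to characterize the analogous asymptotics for Gaussian vectors via the common information dimension and the singular values of Σ.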