TH3.R4.1

Several Interpretations of Max-Sliced Mutual Information

Dor Tsur, Haim Permuter, Ben-Gurion University of the Negev, Israel; Ziv Goldfeld, Cornell University, United States

Session:
Information Measures II

Track:
9: Shannon Theory

Location:
Omikron II

Presentation Time:
Thu, 11 Jul, 14:35 - 14:55

Session Chair:
Marco Dalai, University of Brescia

Abstract
Max-sliced mutual information (mSMI) was recently proposed as a data-efficient measure of dependence. This measure extends popular correlation-based methods and has proven useful in various machine learning tasks. In this paper, we extend the notion of mSMI to discrete variables and investigate its role in canonical problems of information theory and statistics. We use mSMI to propose a soft version of the Gács-Körner common information, which, owing to the mSMI structure, naturally extends to continuous domains and multivariate settings. We then characterize the optimal growth rate in a horse race with constrained side information. Additionally, we examine the error of independence testing under communication constraints. Finally, we study mSMI in communications: we characterize the capacity of discrete memoryless channels with constrained encoders and decoders, and propose an mSMI-based scheme for decoding information obtained through remote sensing. These connections motivate the use of max-slicing in information theory and allow these classical problems to benefit from its merits.
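For orientation, here is a sketch of the quantity the abstract builds on, following the formulation in the authors' earlier mSMI work; the projection dimension k and the Stiefel-manifold notation below are carried over from that formulation and are illustrative, not taken from this abstract. For random vectors X \in \mathbb{R}^{d_x} and Y \in \mathbb{R}^{d_y},

mSMI_k(X;Y) := \sup_{A \in \mathrm{St}(k,d_x),\; B \in \mathrm{St}(k,d_y)} I(A^\top X; B^\top Y),

where \mathrm{St}(k,d) denotes the Stiefel manifold of d \times k matrices with orthonormal columns and I(\cdot\,;\cdot) is Shannon mutual information. Max-slicing thus scans over all k-dimensional projections of each variable and reports the most informative pair, which is what underlies the data efficiency noted above. For reference, the Gács-Körner common information that the paper softens is the classical C_{GK}(X;Y) = \sup \{ H(f(X)) : f(X) = g(Y) \text{ a.s.} \}, the supremum taken over deterministic functions f and g.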