FR3.R4.2

Lower Bounds on Mutual Information for Linear Codes Transmitted over Binary Input Channels, and for Information Combining

Uri Erez, Tel Aviv University, Israel; Or Ordentlich, Hebrew University of Jerusalem, Israel; Shlomo Shamai (Shitz), Technion, Israel

Session:
Information Inequalities 2

Track:
9: Shannon Theory

Location:
Omikron II

Presentation Time:
Fri, 12 Jul, 14:55 - 15:15

Session Chair:
Venkat Anantharam, University of California, Berkeley

Abstract
It has long been known that the mutual information between the input sequence and the output of a binary symmetric channel (BSC) is upper bounded by the mutual information between the same input sequence and the output of a binary erasure channel (BEC) with the same capacity. Recently, Samorodnitsky discovered that one may also lower bound the BSC mutual information in terms of the mutual information between the same input sequence and the output of a more capable BEC. In this paper, we strengthen Samorodnitsky's bound for the special case where the input to the channel is distributed uniformly over a linear code. Furthermore, for a general (not necessarily binary) input distribution $P_X$ and channel $W_{Y|X}$, we derive a new lower bound on the mutual information $I(X;Y^n)$ for $n$ transmissions of $X\sim P_X$ through the channel $W_{Y|X}$.
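The classical BSC-vs-BEC ordering mentioned in the abstract can be checked numerically by brute-force enumeration for a small code. The sketch below is illustrative only and is not the paper's method; the helper names (`h2`, `mi_code_bsc`, `mi_code_bec`) and the choice of a length-2 repetition code are assumptions made for the example.

```python
import math
from itertools import product

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mi_code_bsc(code, p):
    """I(X;Y^n) in bits for X uniform over `code`, sent over n uses of BSC(p)."""
    n = len(code[0])
    # Marginal P(y) over all binary output sequences y.
    py = {}
    for y in product('01', repeat=n):
        total = 0.0
        for x in code:
            flips = sum(a != b for a, b in zip(x, y))
            total += (p ** flips) * ((1 - p) ** (n - flips)) / len(code)
        py[y] = total
    hy = -sum(q * math.log2(q) for q in py.values() if q > 0)
    return hy - n * h2(p)  # I = H(Y^n) - H(Y^n | X^n)

def mi_code_bec(code, eps):
    """I(X;Y^n) in bits for X uniform over `code`, sent over n uses of BEC(eps)."""
    n = len(code[0])
    hx_given_y = 0.0
    # Condition on the erasure pattern; unerased coordinates reveal x there exactly,
    # so H(X | Y) averages log2(#codewords consistent with the observed symbols).
    for pattern in product([False, True], repeat=n):  # True = erased
        prob = 1.0
        for e in pattern:
            prob *= eps if e else 1 - eps
        groups = {}
        for x in code:
            key = tuple(c for c, e in zip(x, pattern) if not e)
            groups[key] = groups.get(key, 0) + 1
        hx_given_y += prob * sum(m / len(code) * math.log2(m)
                                 for m in groups.values())
    return math.log2(len(code)) - hx_given_y  # I = H(X) - H(X | Y^n)

# Example: X uniform over the length-2 repetition code {00, 11}.
p = 0.11
eps = h2(p)  # erasure rate matched so BSC(p) and BEC(eps) have the same capacity
code = ['00', '11']
# With matched capacity, mi_code_bec(code, eps) >= mi_code_bsc(code, p),
# illustrating that the BEC mutual information upper bounds that of the BSC.
```

For a single uniform bit the two quantities coincide (both equal the common capacity $1 - h_2(p)$); the gap appears only for structured multi-letter inputs such as the repetition code above.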