FR2.R9.1

Information Exchange is Harder with Noise at Source

Manuj Mukherjee, Indraprastha Institute of Information Technology Delhi, India; Ran Gelles, Bar Ilan University, Israel

Session:
Complexity and Computation Theory 2

Track:
21: Other topics

Location:
Lamda

Presentation Time:
Fri, 12 Jul, 11:30 - 11:50

Session Chair:
Manuj Mukherjee

Abstract
We revisit the fundamental question of information exchange between $n$ parties connected by a noisy binary broadcast channel, where the noise affects the \emph{transmitter} (El-Gamal, 1987). That is, a bit transmitted by a party is flipped with some fixed probability, and all parties receive the same (possibly flipped) bit. We provide matching upper and lower bounds for the omniscience task where each party starts with a single bit and wants to learn the input bit of all other parties. We show that $\Theta(\log n)$ rounds of communication are necessary and sufficient for solving this task with $o(1)$ error probability. This proves an exponential gap between our case, where the noise affects the transmitter, and the case previously studied in the literature, where the noise affects each receiver independently. In that case, $\Theta(\log\log n)$ rounds are necessary and sufficient to achieve omniscience (Gallager, 1988; Goyal, Kindler, Saks, 2008). We complement our results by proving that computing the parity of all input bits also requires $\Omega(\log n)$ rounds of communication, implying again an exponential gap between the two settings. We further extend our positive result to computing any interactive protocol $\pi$ that assumes a (noiseless) broadcast channel. Via a simple coding technique we show that a multiplicative overhead of $O(\log n)$ rounds with respect to the noiseless case is sufficient to reliably compute $\pi$ with $o(1)$ error probability over a noisy broadcast channel, with noise at the transmitter.
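The channel model in the abstract can be made concrete with a short simulation. The Python sketch below is purely illustrative and is not the paper's protocol: the function names, the flip probability eps, and the round count are assumptions introduced here. It contrasts noise at the transmitter (one coin flip per broadcast, seen identically by all parties) with independent noise at each receiver, and runs the naive omniscience strategy in which every party broadcasts its input bit once per round for Theta(log n) rounds and everyone decodes by majority; this matches the stated O(log n)-round upper bound in spirit, though the paper's actual construction and analysis may differ.

import random

def broadcast_transmitter_noise(bit, n, eps, rng):
    # Noise at the source: the broadcast bit is flipped once with probability
    # eps, and all n parties receive the same (possibly flipped) bit.
    noisy = bit ^ int(rng.random() < eps)
    return [noisy] * n

def broadcast_receiver_noise(bit, n, eps, rng):
    # Noise at the receivers: each party's copy is flipped independently with
    # probability eps (the Gallager / Goyal-Kindler-Saks setting).
    return [bit ^ int(rng.random() < eps) for _ in range(n)]

def omniscience_by_repetition(inputs, eps, rounds, rng):
    # Naive omniscience over the transmitter-noise channel: in each round every
    # party broadcasts its own input bit once; afterwards each party estimates
    # party j's bit by majority over the received copies. With
    # rounds = Theta(log n), a Chernoff bound plus a union bound over the n
    # bits makes every estimate correct with probability 1 - o(1).
    n = len(inputs)
    tallies = [0] * n
    for _ in range(rounds):
        for j, b in enumerate(inputs):
            received = broadcast_transmitter_noise(b, n, eps, rng)
            tallies[j] += received[0]  # all parties see the same bit here
    return [int(t > rounds / 2) for t in tallies]

rng = random.Random(0)
inputs = [rng.randint(0, 1) for _ in range(16)]
print(inputs)
print(omniscience_by_repetition(inputs, eps=0.1, rounds=25, rng=rng))

Since the noise hits the transmitter, every party observes the identical noisy bit and therefore computes the same tally; swapping in broadcast_receiver_noise would give each party an independent tally, which is the receiver-noise setting where Theta(log log n) rounds suffice.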