TH2.R4.3

Shared Information under Simple Markov Independencies

Madhura Pathegama, Sagnik Bhattacharya, University of Maryland, College Park, United States

Session:
Information Measures and Randomness

Track:
9: Information Measures

Location:
Omikron II

Presentation Time:
Thu, 11 Jul, 12:10 - 12:30

Session Chair:
Suhas Diggavi, UCLA

Abstract
Shared information is a measure of mutual dependence among $m \geq 2$ jointly distributed discrete random variables. We show that the shared information of a Markov random field whose underlying graph has at least one cut vertex equals the minimum shared information over its blocks (also called biconnected components). This generalizes prior results on the shared information of Markov random fields to a much wider class of nontree graphs.
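The main result stated in the abstract can be sketched as a formula; the notation $SI(\cdot)$ for shared information and the block labels $B_1, \dots, B_k$ are assumptions for illustration, not taken from this listing:

```latex
% Let X_V be a Markov random field on a graph G = (V, E) that has at
% least one cut vertex, and let B_1, ..., B_k denote the blocks
% (biconnected components) of G. The claimed identity is
\[
  SI(X_V) \;=\; \min_{1 \le i \le k} SI\!\left(X_{B_i}\right),
\]
% i.e., the shared information of the whole field is attained by the
% block with the smallest shared information.
```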