FR4.R4.3

Gaussian mixtures: convexity properties and CLT rates for the entropy and Fisher information

Alexandros Eskenazis, CNRS, Sorbonne Université, France; Lampros Gavalakis, Gustave Eiffel University, France

Session:
Entropy Power Inequalities

Track:
9: Shannon Theory

Location:
Omikron II

Presentation Time:
Fri, 12 Jul, 17:05 - 17:25

Session Chair:
Olivier Rioul, Institut Polytechnique de Paris

Abstract
We study the entropy and Fisher information of mixtures of centered Gaussian random variables (mixed with respect to the variance). First, we prove that if $X_1, X_2$ are independent scalar Gaussian mixtures, then the entropy of $\sqrt{t}X_1 + \sqrt{1-t}X_2$ is concave in $t \in [0,1]$, thus confirming a conjecture of Ball, Nayar and Tkocz (2016) for this class of random variables. In fact, we prove a generalisation of this statement, which also strengthens a result of Eskenazis, Nayar and Tkocz (2018). Second, we establish rates of convergence for the Fisher information matrix of the sum of weighted i.i.d.~Gaussian mixtures in the operator norm along the central limit theorem under mild moment assumptions. These are obtained by showing that the Fisher information matrix is operator convex as a matrix-valued function acting on densities of mixtures in $\mathbb{R}^d$, extending a result of Bobkov (2022).
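
As a reading aid, here is a hedged display-form sketch of the two statements above. The factorisation $X = \sqrt{V}\,G$ is the usual way to write a centered Gaussian mixture and is assumed here only for illustration; the precise hypotheses, constants and rates are those of the paper.

% A centered scalar Gaussian mixture is taken to be $X = \sqrt{V}\,G$, where
% $V \ge 0$ is a random variance independent of $G \sim N(0,1)$.
%
% (1) Concavity of entropy along the interpolation: for independent scalar
%     Gaussian mixtures $X_1, X_2$ and differential entropy $h$,
\[
t \;\longmapsto\; h\!\left(\sqrt{t}\,X_1 + \sqrt{1-t}\,X_2\right)
\quad \text{is concave on } [0,1].
\]
% (2) CLT rate for the Fisher information matrix: for i.i.d. Gaussian mixtures
%     $X_1, \dots, X_n$ in $\mathbb{R}^d$ with mild moment assumptions and weights
%     $a_1, \dots, a_n$ normalised (in this sketch) by $\sum_i a_i^2 = 1$,
%     the quantity
\[
\Bigl\| \mathcal{I}\Bigl(\sum_{i=1}^{n} a_i X_i\Bigr) - \mathcal{I}(Z) \Bigr\|_{\mathrm{op}}
\]
% is bounded with an explicit rate, where $\mathcal{I}$ denotes the Fisher
% information matrix, $Z$ is the limiting Gaussian and $\|\cdot\|_{\mathrm{op}}$
% the operator norm. The key ingredient is operator convexity of the map sending
% the density of a Gaussian mixture in $\mathbb{R}^d$ to its Fisher information matrix.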