# Gaussian distribution maximizes entropy for given covariance

###### Theorem 1

Let $f:\mathbb{R}^{n}\to\mathbb{R}$ be a continuous probability density function, and let $X_{1},\ldots,X_{n}$ be random variables with joint density $f$ and covariance matrix $\mathbf{K}$, $K_{ij}=\mathrm{cov}(X_{i},X_{j})$. Let $\phi$ be the density of the multidimensional Gaussian (http://planetmath.org/JointNormalDistribution) with mean $\mathbf{0}$ and covariance matrix $\mathbf{K}$. Then the Gaussian density maximizes the differential entropy among all densities with covariance matrix $\mathbf{K}$; that is, $h(\phi)\geq h(f)$.
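One common proof, sketched here under the added assumptions that $\mathbf{K}$ is nonsingular and $h(f)$ is finite, uses the nonnegativity of relative entropy. Since differential entropy is translation invariant, we may take the $X_{i}$ to have mean $\mathbf{0}$. Then $\ln\phi(\mathbf{x})$ is a quadratic polynomial in $\mathbf{x}$ whose expectation depends only on the second moments, which $f$ and $\phi$ share, so $\int f\ln\phi=\int\phi\ln\phi=-h(\phi)$. Hence

$$0\leq D(f\,\|\,\phi)=\int f\ln\frac{f}{\phi}=-h(f)-\int f\ln\phi=h(\phi)-h(f).$$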

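As a numerical illustration (not part of the original entry), the standard closed form $h(\phi)=\tfrac{1}{2}\ln\bigl((2\pi e)^{n}\det\mathbf{K}\bigr)$ can be compared against the entropy of a non-Gaussian density with the same covariance. The helper `gaussian_entropy` below is a hypothetical name introduced for this sketch; in one dimension, a uniform density on $[-a,a]$ has variance $a^{2}/3$ and differential entropy $\ln(2a)$, and matching its variance to the Gaussian's shows the Gaussian entropy is larger, as the theorem guarantees.

```python
import numpy as np

def gaussian_entropy(K):
    # Differential entropy of N(0, K): 0.5 * ln((2*pi*e)^n * det(K)),
    # computed here as 0.5 * ln(det(2*pi*e*K)).
    return 0.5 * np.log(np.linalg.det(2 * np.pi * np.e * K))

# 1-D comparison: a uniform density on [-a, a] has variance a**2 / 3
# and differential entropy ln(2a).  Match its variance to sigma2.
sigma2 = 2.0
a = np.sqrt(3 * sigma2)
h_gauss = gaussian_entropy(np.array([[sigma2]]))
h_unif = np.log(2 * a)
print(h_gauss, h_unif)  # the Gaussian entropy is the larger of the two
```

The same `gaussian_entropy` function applies unchanged to any $n\times n$ positive-definite covariance matrix.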
Title: Gaussian distribution maximizes entropy for given covariance
Canonical name: GaussianDistributionMaximizesEntropyForGivenCovariance
Date of creation: 2013-03-22 12:19:26
Last modified on: 2013-03-22 12:19:26
Owner: Mathprof (13753)
Last modified by: Mathprof (13753)
Numerical id: 10
Author: Mathprof (13753)
Entry type: Theorem
Classification: msc 94A17