proof of Gaussian maximizes entropy for given covariance

Let $f,K,\phi$ be as in the parent entry (http://planetmath.org/GaussianMaximizesEntropyForGivenCovariance).

The proof uses the nonnegativity of relative entropy $D(f||\phi)$ together with a useful property of quadratic forms. If $A$ is a quadratic form and $p,q$ are probability densities on $\mathbb{R}^{n}$, each with mean $\mathbf{0}$ and covariance matrix $\mathbf{K}$, then

 $\int p(\mathbf{x})\,x_{i}x_{j}\,d\mathbf{x}=K_{ij}=\int q(\mathbf{x})\,x_{i}x_{j}\,d\mathbf{x}$ (1)

and thus, since a quadratic form $A$ is a linear combination of the products $x_{i}x_{j}$,

 $\int Ap=\int Aq.$ (2)
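A quick numerical sketch of property (2), not part of the original entry: the expectation of a quadratic form $\mathbf{x}^{\mathrm{T}}A\mathbf{x}$ depends on a zero-mean distribution only through its covariance, since $E[\mathbf{x}^{\mathrm{T}}A\mathbf{x}]=\sum_{ij}A_{ij}E[x_{i}x_{j}]=\operatorname{tr}(A\mathbf{K})$. The specific matrices below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

K = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # common covariance matrix (illustrative choice)
A = np.array([[1.0, -0.3],
              [-0.3, 0.7]])         # an arbitrary quadratic form
L = np.linalg.cholesky(K)
N = 500_000

# p: zero-mean Gaussian with covariance K
xp = rng.standard_normal((N, 2)) @ L.T

# q: zero-mean non-Gaussian; iid uniform components scaled to unit variance,
# pushed through the same Cholesky factor, so Cov = L L^T = K as well
z = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(N, 2))
xq = z @ L.T

# Monte Carlo estimates of E[x^T A x] under p and q, versus the exact value
Ep = np.einsum('ni,ij,nj->n', xp, A, xp).mean()
Eq = np.einsum('ni,ij,nj->n', xq, A, xq).mean()
exact = np.trace(A @ K)

print(Ep, Eq, exact)   # all three agree up to Monte Carlo error
```

Both estimates converge to $\operatorname{tr}(A\mathbf{K})$ even though $p$ and $q$ are very different distributions.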

Now note that since

 $\phi(\mathbf{x})=\left((2\pi)^{n}|\mathbf{K}|\right)^{-\frac{1}{2}}\exp\left(-\frac{1}{2}\mathbf{x}^{\mathrm{T}}\mathbf{K}^{-1}\mathbf{x}\right),$ (3)

we see that $\log\phi$ is a quadratic form plus a constant.
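To spell out the key consequence (an expanded computation, not part of the original entry): for any density $f$ with mean $\mathbf{0}$ and covariance $\mathbf{K}$, expanding $\log\phi$ and using (1) gives

 $\int f\log\phi=-\tfrac{1}{2}\log\left((2\pi)^{n}|\mathbf{K}|\right)-\tfrac{1}{2}\int f\,\mathbf{x}^{\mathrm{T}}\mathbf{K}^{-1}\mathbf{x}\,d\mathbf{x}=-\tfrac{1}{2}\log\left((2\pi)^{n}|\mathbf{K}|\right)-\tfrac{1}{2}\operatorname{tr}(\mathbf{K}^{-1}\mathbf{K})=-\tfrac{1}{2}\left(\log\left((2\pi)^{n}|\mathbf{K}|\right)+n\right),$

which involves $f$ only through its covariance; in particular the same value results with $\phi$ in place of $f$.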

 $\begin{aligned}0&\leq D(f||\phi)=\int f\log\frac{f}{\phi}\\ &=\int f\log f-\int f\log\phi\\ &=-h(f)-\int f\log\phi\\ &=-h(f)-\int\phi\log\phi\qquad\text{by the quadratic form property above}\\ &=-h(f)+h(\phi),\end{aligned}$

and thus $h(\phi)\geq h(f)$.

Title: proof of Gaussian maximizes entropy for given covariance
Canonical name: ProofOfGaussianMaximizesEntropyForGivenCovariance
Date of creation: 2013-03-22 12:19:35
Last modified on: 2013-03-22 12:19:35
Owner: Mathprof (13753)
Last modified by: Mathprof (13753)
Numerical id: 10
Author: Mathprof (13753)
Entry type: Proof
Classification: msc 94A17
Related topics: QuadraticForm, RelativeEntropy, MultidimensionalGaussianIntegral