differential entropy

Let $(X,\mathfrak{B},\mu)$ be a probability space, and let $f\in L^p(X,\mathfrak{B},\mu)$ be a function with $\|f\|_p=1$. The differential entropy $h(f)$ is defined as

$$h(f) = -\int_X |f|^p \log |f|^p \, d\mu \qquad (1)$$

Differential entropy is the continuous analogue of the Shannon entropy $H[\mathbf{p}] = -\sum_i p_i \log p_i$. Consider first $u_a$, the uniform one-dimensional distribution on $(0,a)$. Its differential entropy is

$$h(u_a) = -\int_0^a \frac{1}{a}\log\frac{1}{a}\,d\mu = \log a. \qquad (2)$$
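Equation (2) can be checked numerically. The sketch below (a plain Riemann-sum approximation in pure Python; the function name is mine, not part of this entry) also illustrates that for $a<1$ the result $\log a$ is negative, so differential entropy, unlike Shannon entropy, need not be nonnegative.

```python
import math

def uniform_entropy_numeric(a, n=100000):
    # Riemann-sum approximation of -∫_0^a (1/a) log(1/a) dx.
    # The integrand is constant, so the sum equals log(a) up to float error.
    dx = a / n
    p = 1.0 / a  # density of the uniform distribution on (0, a)
    return -sum(p * math.log(p) * dx for _ in range(n))

h5 = uniform_entropy_numeric(5.0)    # ≈ log 5
h_half = uniform_entropy_numeric(0.5)  # ≈ log 0.5 < 0
```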

Next consider probability densities such as the function

$$g(t) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(t-\mu)^2}{2\sigma^2}}, \qquad (3)$$

the one-dimensional Gaussian density. This pdf has differential entropy

$$h(g) = -\int g \log g \, dt = \frac{1}{2}\log 2\pi e \sigma^2. \qquad (4)$$
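Equation (4) can likewise be verified by quadrature. The following sketch (midpoint rule over a truncated domain in pure Python; the function name and the $10\sigma$ cutoff are my choices, not part of this entry) compares $-\int g\log g\,dt$ against the closed form $\tfrac{1}{2}\log 2\pi e\sigma^2$.

```python
import math

def gaussian_entropy_numeric(mu, sigma, half_width=10.0, n=200000):
    # Midpoint-rule approximation of -∫ g log g dt over [mu - 10σ, mu + 10σ];
    # the Gaussian tails beyond 10σ contribute negligibly.
    lo, hi = mu - half_width * sigma, mu + half_width * sigma
    dt = (hi - lo) / n
    total = 0.0
    for i in range(n):
        t = lo + (i + 0.5) * dt
        g = math.exp(-(t - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)
        total -= g * math.log(g) * dt
    return total

# Closed form of eq. (4) for sigma = 2: (1/2) log(2 π e σ²)
exact = 0.5 * math.log(2 * math.pi * math.e * 4.0)
approx = gaussian_entropy_numeric(0.0, 2.0)
```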

For a general $n$-dimensional Gaussian (http://planetmath.org/JointNormalDistribution) $\mathcal{N}_n(\mu,\mathbf{K})$ with mean vector $\mu$ and covariance matrix $\mathbf{K}$, $K_{ij} = \operatorname{cov}(x_i, x_j)$, we have

$$h(\mathcal{N}_n(\mu,\mathbf{K})) = \frac{1}{2}\log\,(2\pi e)^n |\mathbf{K}| \qquad (5)$$

where $|\mathbf{K}| = \det \mathbf{K}$.
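Formula (5) is straightforward to evaluate once $\det\mathbf{K}$ is available. A minimal pure-Python sketch (the helper names are mine; a real implementation would use a linear-algebra library): note that for a diagonal $\mathbf{K}$ the coordinates are independent and (5) reduces to a sum of one-dimensional entropies $\tfrac{1}{2}\log 2\pi e\sigma_i^2$.

```python
import math

def determinant(K):
    # Determinant by Gaussian elimination with partial pivoting;
    # illustrative only, adequate for a small covariance matrix.
    A = [row[:] for row in K]
    n = len(A)
    det = 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        if A[p][i] == 0.0:
            return 0.0
        if p != i:
            A[i], A[p] = A[p], A[i]
            det = -det
        det *= A[i][i]
        for r in range(i + 1, n):
            factor = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= factor * A[i][c]
    return det

def gaussian_entropy_nd(K):
    # Eq. (5): h = (1/2) log((2πe)^n |K|) for covariance matrix K.
    n = len(K)
    return 0.5 * math.log((2 * math.pi * math.e) ** n * determinant(K))

# Diagonal covariance: entropy equals the sum of the 1-D entropies.
h = gaussian_entropy_nd([[4.0, 0.0], [0.0, 1.0]])
```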

Title differential entropy
Canonical name DifferentialEntropy
Date of creation 2013-03-22 12:18:48
Last modified on 2013-03-22 12:18:48
Owner Mathprof (13753)
Last modified by Mathprof (13753)
Numerical id 16
Author Mathprof (13753)
Entry type Definition
Classification msc 54C70
Related topic ShannonsTheoremEntropy
Related topic ConditionalEntropy