conditional entropy
Definition (Discrete)
Let (Ω,ℱ,μ) be a discrete probability space, and let X and Y be discrete random variables on Ω.
The conditional entropy H[X|Y], read as “the conditional entropy of X given Y,” is defined as
H[X|Y] = -∑_{x∈X} ∑_{y∈Y} μ(X=x, Y=y) log μ(X=x|Y=y)   (1)
where μ(X=x|Y=y) denotes the conditional probability; in the discrete case this is defined whenever μ(Y=y) is nonzero, and the sum is taken over those pairs (x,y) for which μ(X=x, Y=y) > 0.
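For example, suppose X and Y each take values in {0,1}, with joint distribution μ(X=0,Y=0)=1/2, μ(X=0,Y=1)=1/4, μ(X=1,Y=1)=1/4, and μ(X=1,Y=0)=0, so that μ(Y=0)=μ(Y=1)=1/2. Taking log to base 2, formula (1) gives

H[X|Y] = -(1/2) log(1/2 ÷ 1/2) - (1/4) log(1/4 ÷ 1/2) - (1/4) log(1/4 ÷ 1/2) = 0 + 1/4 + 1/4 = 1/2 bit,

the pair (x,y)=(1,0) contributing nothing since μ(X=1,Y=0)=0.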
Discussion
The results for discrete conditional entropy will be assumed to hold for the continuous case unless we indicate otherwise.
With H[X,Y] the joint entropy and f a function, we have the following results:
H[X|Y] + H[Y] = H[X,Y]   (2)
H[X|Y] ≤ H[X]   (3)
H[X|X] = 0   (4)
H[f(X)|X] = 0   (5)
H[f(X)|Y] ≤ H[X|Y]   (6)
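Continuing the example from the definition (joint distribution μ(0,0)=1/2, μ(0,1)=1/4, μ(1,1)=1/4), we can check (2) and (3) directly. Since Y is uniform on {0,1}, H[Y] = 1 bit, and H[X,Y] = -(1/2) log(1/2) - 2·(1/4) log(1/4) = 1/2 + 1 = 3/2 bits, so H[X|Y] + H[Y] = 1/2 + 1 = 3/2 = H[X,Y], as (2) requires. Also H[X] = -(3/4) log(3/4) - (1/4) log(1/4) ≈ 0.811 bits, so H[X|Y] = 1/2 ≤ H[X], with strict inequality because X and Y are not independent.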
The conditional entropy may be interpreted as the uncertainty remaining in X given knowledge of Y. (Try reading the above equalities and inequalities with this interpretation in mind.)