# conditional entropy

# Definition (Discrete)

Let $(\Omega,\mathcal{F},\mu)$ be a discrete probability space, and let $X$ and $Y$ be discrete random variables on $\Omega$.

The conditional entropy $H[X|Y]$, read as “the conditional entropy of $X$ given $Y$,” is defined as

$H[X|Y]=-\sum_{x}\sum_{y}\mu(X=x,Y=y)\log\mu(X=x|Y=y) \qquad (1)$

where the sums run over the values taken by $X$ and $Y$, and $\mu(X=x|Y=y)$ denotes the conditional probability. This is well defined in the discrete case whenever $\mu(Y=y)>0$; terms with $\mu(X=x,Y=y)=0$ are taken to be $0$, by the usual convention $0\log 0=0$.
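Definition (1) can be computed directly from a joint distribution. The following is a minimal sketch (not part of the original entry); the representation of the joint pmf as a dictionary keyed by $(x,y)$ pairs is an assumption for illustration.

```python
import math

def conditional_entropy(joint):
    """Compute H[X|Y] in bits from a joint pmf.

    `joint` maps (x, y) pairs to probabilities mu(X=x, Y=y).
    Zero-probability terms contribute nothing, matching the
    convention 0 * log 0 = 0.
    """
    # Marginal mu(Y=y), needed for the conditional probability.
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p

    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            # mu(X=x | Y=y) = mu(X=x, Y=y) / mu(Y=y)
            h -= p * math.log2(p / p_y[y])
    return h

# Y a perfect copy of a fair coin X: knowing Y determines X, so H[X|Y] = 0.
joint_copy = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(joint_copy))   # → 0.0

# X, Y independent fair coins: H[X|Y] = H[X] = 1 bit.
joint_indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(conditional_entropy(joint_indep))  # → 1.0
```

The two checks at the end illustrate the extreme cases: full dependence drives the conditional entropy to zero, while independence leaves it equal to $H[X]$.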

# Discussion

The results for discrete conditional entropy will be assumed to hold for the continuous case unless we indicate otherwise.

With $H[X,Y]$ the joint entropy and $f$ a function, we have the following results:

$H[X|Y]+H[Y]=H[X,Y] \qquad (2)$

$H[X|Y]\leq H[X] \quad \text{(conditioning reduces entropy)} \qquad (3)$

$H[X,Y]\leq H[X]+H[Y] \quad \text{(equality iff } X,Y \text{ independent)} \qquad (4)$

$H[X|Y]\leq H[X|f(Y)] \qquad (5)$

$H[X|Y]=0\iff X=f(Y) \text{ for some } f \quad \text{(special case: } H[X|X]=0\text{)} \qquad (6)$
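The chain rule (2) and the inequalities (3) and (4) can be verified numerically. The sketch below (not from the original entry) uses a small hypothetical joint pmf chosen for illustration, and computes $H[X|Y]$ via the chain rule as $H[X,Y]-H[Y]$.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# A hypothetical joint pmf for illustration (not from the entry):
# X and Y are correlated but not identical.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and Y.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

h_xy = entropy(joint)              # joint entropy H[X,Y]
h_x, h_y = entropy(p_x), entropy(p_y)
h_x_given_y = h_xy - h_y           # chain rule (2): H[X|Y] = H[X,Y] - H[Y]

assert h_x_given_y <= h_x + 1e-12  # (3): conditioning reduces entropy
assert h_xy <= h_x + h_y + 1e-12   # (4): subadditivity of joint entropy
```

Here $H[X]=H[Y]=1$ bit while $H[X|Y]\approx 0.72$ bits: observing the correlated variable $Y$ strictly reduces the uncertainty in $X$, as (3) predicts.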

The conditional entropy $H[X|Y]$ may be interpreted as the uncertainty in $X$ given knowledge of $Y$. (Try reading the above equalities and inequalities with this interpretation in mind.)

Related: Entropy, Relative Entropy, Conditional Probability, Differential Entropy, Shannon's Theorem

Type of Math Object: Definition

Major Section: Reference

## Mathematics Subject Classification

94A17
