Notes by Saeran Vasanthakumar --- 24/04/09 - Entropy, Cross-entropy, and Relative entropy definitions
Given $p = p(x)$, $q = q(x)$:

$$
\begin{aligned}
H(p) &= \text{Entropy} \\
H(p,q) &= \text{Cross-entropy} \\
D(p\|q) &= \text{Relative entropy / KL divergence}
\end{aligned}
$$

$$
\begin{aligned}
H(p) &= \sum_x p \ln(1/p) = -E_p[\ln p] \\
H(p,q) &= \sum_x p \ln(1/q) = -E_p[\ln q] \\
D(p\|q) &= H(p,q) - H(p) = E_p[\ln p] - E_p[\ln q] = E_p[\ln(p/q)]
\end{aligned}
$$

So,
- $H(p)$ is the average of $\ln(1/p)$, weighted by $p$.
- $H(p,q)$ is the average of $\ln(1/q)$, weighted by $p$.
- $D(p\|q)$ is the average of the log ratio $\ln(p/q)$, weighted by $p$; this is a log odds ratio when $q = 1 - p$.
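A minimal numerical sketch of these definitions for discrete distributions, checking the identity $D(p\|q) = H(p,q) - H(p)$; the function names and the example distributions `p`, `q` are illustrative, not from any particular library. Natural log, so values are in nats.

```python
import math

def entropy(p):
    """H(p) = -sum_x p(x) ln p(x); terms with p(x) = 0 contribute 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """H(p,q) = -sum_x p(x) ln q(x); needs q(x) > 0 wherever p(x) > 0."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """D(p||q) = sum_x p(x) ln(p(x)/q(x))."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example distributions over 3 outcomes
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]

print(entropy(p))           # H(p)
print(cross_entropy(p, q))  # H(p,q), >= H(p) since D(p||q) >= 0
print(kl_divergence(p, q))  # D(p||q)

# The identity D(p||q) = H(p,q) - H(p), up to float rounding
assert math.isclose(kl_divergence(p, q), cross_entropy(p, q) - entropy(p))
```

Note that all three averages are weighted by $p$, which is why cross-entropy and KL divergence are asymmetric in their arguments.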
--- email: saeranv @ gmail dot com git: github.com/saeranv