Basic Notions
- Entropy
- Differential entropy
- Graph entropy
- Conditional entropy
- Mutual information
Entropy
Definitions
Let \(X\) be a discrete random variable defined on a finite alphabet \(\mathcal{X}\) with probability mass function \(p_X\). The entropy of \(X\) is the random variable \(H(X)\) defined by
\[ H(X) = \log\frac{1}{p_X(X)}.\]
The base of the logarithm defines the unit of entropy. If the logarithm is to the base 2, the unit of entropy is the bit. If the logarithm is to the base \(e\), the unit of entropy is the nat.
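The definition above can be made concrete with a small numeric sketch. The pmf below is a made-up example (not from this database); it evaluates \(H(X)=\log(1/p_X(X))\) at each outcome, once in base 2 (bits) and once in base \(e\) (nats).

```python
import math

# Hypothetical pmf of a discrete random variable X on the alphabet {a, b, c}.
p_X = {"a": 0.5, "b": 0.25, "c": 0.25}

# H(X) = log(1/p_X(X)) is itself a random variable: one value per outcome.
# Base-2 logarithm gives bits; natural logarithm gives nats.
h_bits = {x: math.log2(1.0 / p) for x, p in p_X.items()}
h_nats = {x: math.log(1.0 / p) for x, p in p_X.items()}

print(h_bits)  # outcome "a" carries log2(1/0.5) = 1 bit
print(h_nats)  # each nat value is the bit value times ln(2)
```

Note that changing the base only rescales the values: a quantity of \(b\) bits equals \(b\ln 2\) nats.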
Mutual Information
Definitions
Let \(X\) and \(Y\) be discrete random variables defined on finite alphabets \(\mathcal{X}\) and \(\mathcal{Y}\), respectively, and with joint probability mass function \(p_{X,Y}\). The mutual information of \(X\) and \(Y\) is the random variable \(I(X,Y)\) defined by
\[ I(X,Y) = \log\frac{p_{X,Y}(X,Y)}{p_X(X)p_Y(Y)}.\]
As with entropy, the base of the logarithm defines the unit of mutual information. If the logarithm is to the base \(e\), the unit of mutual information is the nat.
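The definition can be illustrated with a small sketch. The joint pmf below is a made-up example (not from this database); the marginals are obtained by summing the joint pmf, and \(I(X,Y)\) is then evaluated in nats at each outcome pair.

```python
import math

# Hypothetical joint pmf p_{X,Y} on the alphabet {0,1} x {0,1}.
p_XY = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal pmfs p_X and p_Y, obtained by summing the joint pmf.
p_X, p_Y = {}, {}
for (x, y), p in p_XY.items():
    p_X[x] = p_X.get(x, 0.0) + p
    p_Y[y] = p_Y.get(y, 0.0) + p

# I(X,Y) = log( p_{X,Y}(X,Y) / (p_X(X) p_Y(Y)) ) is a random variable:
# one value (here in nats, base-e logarithm) per outcome pair.
i_nats = {
    (x, y): math.log(p / (p_X[x] * p_Y[y]))
    for (x, y), p in p_XY.items()
}

print(i_nats)  # positive where the pair is likelier than under independence
```

Under this joint pmf the pair \((0,0)\) is more probable than the product of its marginals, so \(I\) is positive there; the rarer mixed pairs give negative values.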
Information Theory Knowledge Database
- Basic notions
- Source coding
- Channel coding
Online Committee Report, ITA 2013
Summary
The website ran smoothly and consistently until February 5th, when it began experiencing significant slowdowns and errors (HTTP 504). The problem was fixed on February 8th and the website is running again, but the root cause has not yet been precisely identified. The developers have scheduled additional time next week to address the issue, and the Online Committee will provide more information then.
The main topics covered in this report are the following.