Learning Outcomes:
- Understand the general relevance of Shannon's Information Theory in the Information Age.
- Review essential probability theory.
- Become acquainted with fundamental information-theoretic concepts: entropy, mutual information, relative entropy, Jensen's inequality, the log-sum inequality, the data-processing inequality, sufficient statistics, and Fano's inequality.
- Understand the centrality of the asymptotic equipartition property (AEP) in Information Theory and the notion of the typical set.
- Understand the fundamentals of data compression: the Kraft inequality, optimal codes, Huffman codes, and Shannon-Fano-Elias coding.
- Understand the concept of channel capacity: symmetric channels, the channel coding theorem, and elementary channel coding techniques (repetition codes, Hamming codes).
Indicative Module Content:
(see above)
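As a small taste of two of the topics above, the sketch below (not part of the module materials; symbol names are illustrative) computes the Shannon entropy of a discrete distribution and builds a Huffman prefix code, whose expected codeword length approaches the entropy for dyadic probabilities:

```python
import heapq
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum p*log(p), in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def huffman(freqs):
    """Return a prefix-free code {symbol: bitstring} for a dict of
    symbol frequencies/probabilities (assumes at least two symbols)."""
    # Heap entries: [weight, tiebreaker, partial code table].
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # two least-probable subtrees
        hi = heapq.heappop(heap)
        # Prepend 0/1 to every codeword in the merged subtrees.
        code = {s: "0" + b for s, b in lo[2].items()}
        code.update({s: "1" + b for s, b in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], tick, code])
        tick += 1
    return heap[0][2]

# A fair coin carries exactly 1 bit per toss.
print(entropy([0.5, 0.5]))                      # → 1.0
# Dyadic source: Huffman achieves the entropy bound exactly.
code = huffman({"a": 0.5, "b": 0.25, "c": 0.25})
print(code)
```

For the dyadic source above, the expected codeword length (0.5·1 + 0.25·2 + 0.25·2 = 1.5 bits) equals H(X), illustrating the optimality result covered in the compression part of the module.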