Curricular information is subject to change
- Understand the general relevance of Shannon's Information Theory in the Information Age.
- Review essential probability theory.
- Become acquainted with fundamental information-theoretic concepts such as entropy, mutual information, and relative entropy, together with Jensen's inequality, the log-sum inequality, the data-processing inequality, sufficient statistics, and Fano's inequality.
- Understand the centrality of the asymptotic equipartition property (AEP) in Information Theory: the typical set.
- Understand the fundamentals of data compression: the Kraft inequality, optimal codes, Huffman codes, and Shannon-Fano-Elias coding.
- Understand the concept of channel capacity: symmetric channels, the channel coding theorem, and elementary channel coding techniques (repetition codes, Hamming codes).
- Acquire basic insight into the connection between Information Theory and statistics.
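Several of the outcomes above (entropy, Huffman codes, the Kraft inequality, and the bound on optimal code length) can be illustrated concretely. The following is a minimal sketch, not part of the module materials: it computes the Shannon entropy of a small distribution, builds a binary Huffman code with Python's `heapq`, and checks that the Kraft sum is at most 1 and that the expected code length satisfies H(X) <= L < H(X) + 1. The distribution `p` and helper names are illustrative choices, not taken from the course.

```python
# Illustrative sketch: entropy, Huffman code lengths, Kraft inequality.
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) in bits of a probability distribution p."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    """Return {symbol: codeword length} for a binary Huffman code."""
    # Heap of (probability, tie-breaker index, symbols in this subtree).
    heap = [(q, i, [s]) for i, (s, q) in enumerate(p.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in p}
    while len(heap) > 1:
        q1, _, syms1 = heapq.heappop(heap)   # two least-probable subtrees
        q2, i, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:              # each merge adds one bit
            lengths[s] += 1
        heapq.heappush(heap, (q1 + q2, i, syms1 + syms2))
    return lengths

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # dyadic example
lens = huffman_lengths(p)
H = entropy(p)                                # 1.75 bits for this p
L = sum(p[s] * lens[s] for s in p)            # expected code length
kraft = sum(2 ** -l for l in lens.values())   # Kraft sum, must be <= 1
print(H, L, kraft)                            # prints 1.75 1.75 1.0
```

Because this example distribution is dyadic (all probabilities are powers of 1/2), the Huffman code is exactly optimal and L equals H; for a general distribution L falls strictly inside [H, H + 1).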
|Student Effort Type||Hours|
|Autonomous Student Learning||
Working knowledge of basic calculus and algebra.

Learning Recommendations:
Knowledge of probability theory would be helpful, although the course is self-contained in this respect.
|Description||Timing||Component Scale||% of Final Grade||Must Pass|
|Assignment: Three sets of exercises and questions to be completed individually by the students.||Throughout the Trimester||Alternative linear conversion grade scale||40%||No|
|Multiple Choice Questionnaire: Midterm online examination.||Unspecified||Alternative linear conversion grade scale||40%||No|
|Resit In||Terminal Exam|
|Spring||Yes - 2 Hour|
• Feedback individually to students, post-assessment
Not yet recorded.
|Lecture||Offering 1||Week(s) - Autumn: All Weeks||Thurs 12:00 - 12:50|
|Practical||Offering 1||Week(s) - Autumn: All Weeks||Thurs 13:00 - 13:50|
|Practical||Offering 1||Week(s) - Autumn: All Weeks||Tues 14:00 - 15:50|