EE2340/5847 Course Information
Welcome to EE 2340 (Information Sciences) and EE 5847 (Information Theory), 2021 edition. This course will cover the basics of information theory. You are also encouraged to take the follow-up courses EE5390 (Source coding) and EE6317 (Channel coding), as well as the related courses EE5342 (Detection theory) and EE5357 (Estimation theory).
Generally speaking, information theory has its roots in this paper by Claude Shannon, which addressed the following problems:
- Data compression
- Communication over noisy channels
Since its inception, information theory has proved to be a valuable tool in several fields including cryptography, physics, machine learning and computational biology.
In this course, you will be introduced to the basic information quantities and their mathematical properties. Along the way, you will see probabilistic models of basic data compression and communication problems, how to measure the performance of algorithms for these two tasks, and the fundamental limits associated with them.
Classes will be online. To facilitate discussions, we will use Piazza. An invite will be sent to all registered students in the first class.
1 Assessment (tentative):
Each student will be expected to
- attend online classes regularly and watch assigned videos
- solve homework assignments, roughly one per week (collaboration is fine as long as it is explicitly declared, but you have to write the solutions in your own words), and participate in class (during live lectures and on Piazza)
- solve in-class quizzes (roughly 2-3)
- take a final exam/viva (tentative)
The following assessment scheme is tentative:
Component | Weight |
---|---|
Homeworks | 50% |
Quizzes | 10% |
Class participation | 10% |
Final exam | 30% |
2 Instructor:
Name | Dr. Shashank Vatedka |
---|---|
Office | 214/D, Block-C |
Email | shashankvatedka@ee.iith.ac.in |
3 Class timings:
- Slot B: Mondays 10-11, Wednesdays 9-10 and Thursdays 11-12
- Online classes: Microsoft teams/Google meet
- Quiz schedule: TBD
4 Tentative syllabus:
What is information theory?; Probabilistic modeling of sources and channels; The source compression problem; Entropy and the source coding theorem; Proof of achievability for binary memoryless sources; Communication across a noisy channel; Mutual information and the channel coding theorem; Proof of achievability for binary symmetric channel; Convexity, Jensen's inequality; Entropy, joint entropy and conditional entropy, properties; Kullback-Leibler divergence, properties; Mutual information, properties; Chain rules; Log-sum inequality; Data processing inequality; Sufficient statistics; Fano's inequality
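As a quick preview (this is standard material, following Cover and Thomas, and not a substitute for the lectures), the central quantities in the list above have the following definitions for discrete random variables:

```latex
% Standard definitions for discrete random variables; p denotes the relevant
% (joint or marginal) pmf, and q a second pmf on the same alphabet.
\begin{align*}
  H(X)    &= -\sum_{x} p(x)\log p(x)                          && \text{(entropy)} \\
  D(p\|q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)}               && \text{(Kullback-Leibler divergence)} \\
  I(X;Y)  &= \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)}   && \text{(mutual information)}
\end{align*}
```

Roughly, the source coding theorem says that H(X) is the best achievable compression rate for a memoryless source, and the channel coding theorem says that the maximum of I(X;Y) over input distributions is the capacity of a discrete memoryless channel; both statements are made precise in the lectures.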
5 Textbook and references:
Primarily, the material will be drawn from
- Lecture notes, which will be linked and also made available on Piazza.
- Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, second edition, Wiley (Amazon link). A limited number of copies are available in the university library. While the first edition of the book has all the material we need for the course, the second edition has a much expanded problem set.
Other references:
- "A mathematical theory of communication", Claude Shannon, Bell systems Tech Journal, 1948. The classic paper that started this field.
- Information theory, inference, and learning algorithms, David MacKay (soft copy available for free on the author's website). A very nice book, but gives a very different flavour than what will be covered in the course. Suggested for additional reading.
- A student's guide to coding and information theory, Stefan Moser (Amazon link).
- Information theory: coding theorems for discrete memoryless systems, Imre Csiszar and Janos Korner. An advanced textbook. Recommended reading after covering this course as well as EE5390 and EE6317.
- Similar courses offered at IISc, Stanford, and MIT. Also one on Coursera.
Light reading:
- The Information: A History, a Theory, a Flood, James Gleick.
- A Mind at Play: How Claude Shannon Invented the Information Age, Jimmy Soni and Rob Goodman. A biography of Claude Shannon.
6 Tentative schedule
Topic | Date | Notes/links/material | Homeworks |
---|---|---|---|
Introduction: logistics; motivation for studying information theory; a refresher in probability; probabilistic modeling of sources and channels (memoryless systems) | Week 1 | | |
Data compression for discrete sources: fixed-length compression (rate, probability of error); entropy and the source coding theorem; proof of achievability for the Bernoulli source | Week 2 | | |
Discrete memoryless channels: model and modular structure of a digital communication system; channel coding, rate, capacity, probability of error; mutual information and the channel coding theorem; coding as a packing problem and capacity of the binary symmetric channel; information measures for continuous alphabets | Week 2 | | |
Properties of information measures: convexity, Jensen's inequality; joint and conditional entropies; positivity; chain rules for entropy and mutual information; conditioning reduces entropy | Week 3 | | |
More properties: log-sum inequality; positivity, convexity and concavity; chain rule for mutual information; data processing inequality; Fano's inequality | Weeks 4-5 | | |
Applications: a proof of the converse of the source coding theorem; a proof of the converse of the channel coding theorem; entropy maximizing distributions | Week 5 | | |
Final exam | TBD | | |