EE2340/5847 Course Information
Welcome to EE 2340 (Information Sciences) and EE 5847 (Information Theory). This course will cover the basics of information theory. You are also encouraged to take the follow-up courses EE5390 (Source Coding) and EE6317 (Channel Coding), as well as the related courses EE5342 (Detection Theory) and EE5357 (Estimation Theory).
Generally speaking, information theory has its roots in this paper by Claude Shannon, which dealt with the following problems:
- Data compression
- Communication over noisy channels
Since its inception, information theory has proved to be a valuable tool in several fields including cryptography, physics, machine learning and computational biology.
In this course, you will be introduced to basic information quantities and their mathematical properties. In the process, you will see probabilistic models of basic data compression and communication problems, how to evaluate the performance of algorithms for these two problems, and the fundamental limits associated with them.
1 Assessment (tentative):
Each student will be expected to
- attend classes regularly
- solve homework assignments, roughly one per week (collaboration is fine as long as it is explicitly declared, but you have to write the solutions in your own words), and participate in class (online and/or offline)
- solve roughly 2-3 in-class quizzes
- sit for a mid-term (tentative) and a final exam
Component | Weightage |
---|---|
Homeworks | 20% |
Quizzes | 20% |
Mid-term exam | 25% |
Final exam | 35% |
2 Instructor:
Name | Dr. Shashank Vatedka |
Office | 214/D, Block-C |
Email | shashankvatedka@iith.ac.in |
3 Class timings:
- Slot R: Tuesdays 14:30-15:55 and Fridays 16:00-17:25
- Class venue: A-221
- Quiz schedule: Fridays, in class
4 Tentative syllabus:
What is information theory?; Probabilistic modeling of sources and channels; The source compression problem; Entropy and the source coding theorem; Proof of achievability for binary memoryless sources; Communication across a noisy channel; Mutual information and the channel coding theorem; Proof of achievability for the binary symmetric channel; Convexity, Jensen’s inequality; Entropy, joint entropy and conditional entropy, properties; Kullback-Leibler divergence, properties; Mutual information, properties; Chain rules; Log-sum inequality; Data processing inequality; Sufficient statistics; Fano’s inequality
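As a quick preview, the central quantities in the syllabus are the entropy, the Kullback-Leibler divergence, and the mutual information of discrete random variables. The standard definitions (a sketch in standard notation; these will be stated precisely and studied in lecture) are:

$$H(X) = -\sum_{x} p(x)\log p(x), \qquad D(p\,\|\,q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)},$$

$$I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X\mid Y).$$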
5 Textbook and references:
Primarily, the material will be drawn from
- Lecture notes, which will be linked here and also posted on Google Classroom.
- Elements of Information Theory, Thomas M Cover and Joy A Thomas, second edition, Wiley (Amazon link). A limited number of copies are available in the university library. While the first edition of the book has all the material we need for the course, the second edition has a much expanded problem set.
Other references:
- “A mathematical theory of communication”, Claude Shannon, Bell System Technical Journal, 1948. The classic paper that started this field.
- Information Theory, Inference, and Learning Algorithms, David MacKay (soft copy available for free on the author’s website). A very nice book, but with a very different flavour than what will be covered in the course. Suggested for additional reading.
- A Student’s Guide to Coding and Information Theory, Stefan Moser (Amazon link).
- Information Theory: Coding Theorems for Discrete Memoryless Systems, Imre Csiszar and Janos Korner. An advanced textbook. Recommended reading after completing this course as well as EE5390 and EE6317.
- Similar courses offered at IISc, Stanford, and MIT. Also one on Coursera.
Light reading:
- The Information: A History, a Theory, a Flood, James Gleick.
- A Mind at Play: How Claude Shannon Invented the Information Age, Jimmy Soni and Rob Goodman. A biography of Claude Shannon.
6 Tentative schedule:
Topic | Date | Notes/links/material | Homeworks |
---|---|---|---|
Introduction: logistics; motivation for studying information theory; a refresher in probability; probabilistic modeling of sources and channels (memoryless systems) | Week 1 | Handout 1, Slides | Homework 1 |
Data compression for discrete sources: fixed-length compression (rate, probability of error); entropy and the source coding theorem; proof of achievability for the Bernoulli source | Week 2 | Handout 2, Slides | |
Discrete memoryless channels: model and modular structure of a digital communication system; channel coding, rate, capacity, probability of error; mutual information and the channel coding theorem; coding as a packing problem and capacity of the binary symmetric channel; information measures for continuous alphabets | Week 2 | Handout 3, Slides | Homework 2 |
Properties of information measures: convexity, Jensen’s inequality; joint and conditional entropies; positivity; chain rule of entropy and mutual information; conditioning reduces entropy | Week 3 | Handout 4 | |
Midterm exam | 24th Jan 2020 | | |
More properties: log-sum inequality; positivity, convexity and concavity; chain rule for mutual information; data processing inequality; Fano’s inequality | Weeks 4-5 | Handout 5 | Homework 3 |
Applications: a proof of the converse of the source coding theorem; a proof of the converse of the channel coding theorem; entropy maximizing distributions | Week 5 | Handout 6 | |
Final exam | 7th Feb 2020 | | |