
CODING AND INFORMATION THEORY

IC633 - Coding and Information Theory

Networks Communications Research Group


This course offers an introduction to the quantitative theory of information and its application to reliable, efficient communication systems. The contents are as follows:


1. Introduction, entropy
2. Jensen's inequality, data processing theorem, Fano's inequality
3. Different types of convergence, asymptotic equipartition property (AEP), typical set, joint typicality
4. Entropies of stochastic processes
5. Data compression, Kraft inequality, optimal codes
6. Huffman codes
7. Shannon-Fano-Elias codes, Slepian-Wolf coding
8. Channel capacity
9. Maximizing capacity
10. The channel coding theorem
11. Strong coding theorem
12. Feedback capacity
13. Joint source channel coding
14. Differential entropy
15. Additive Gaussian noise channel
16. Gaussian channels: parallel, colored noise, inter-symbol interference
17. Gaussian channels with feedback
18. Multiple access channels
19. Broadcast channels
20. Network information theory


Lectures: Tue 10:30-12:00 and Thu 10:30-12:00, IT Convergence Eng. B/D (E3), Room 112

Textbook: Cover and Thomas, Elements of Information Theory, 2nd Edition, Wiley, 2005
