Information Theory
This class treats Shannon's mathematical theory of communication and the tools used to derive and understand it. The class is organized around fundamental questions and their solutions, leading to central results such as Shannon's source coding, channel coding, and rate-distortion theorems. Quantities that arise en route to these solutions include entropy, relative entropy, and mutual information for discrete and continuous random variables. The course explores the calculation of fundamental communication limits such as entropy rate, capacity, and rate-distortion functions under a variety of source and channel models (e.g., memoryless, Markov, ergodic, and Gaussian). The course begins with a foundational discussion of the simplest communication scenarios and then expands to topics such as universal source coding, the role of side information in source coding and communication, and the generalization of earlier results to network systems. Network information theory topics include multiuser data compression and communication over multiple-access channels, broadcast channels, and multiterminal networks. Philosophical and practical implications of the theory are also explored. This course, when combined with EE 112, EE/Ma/CS/IDS 127, EE/CS 161, and EE/CS/IDS 167, should prepare the student for research in information theory, coding theory, wireless communications, and/or data compression. Part b not offered 2024-25.
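For orientation, the quantities named above have standard textbook definitions; the following is a brief sketch in common notation (well-known results, not drawn from the course materials) for discrete random variables X and Y with distributions p and q:

\begin{align*}
H(X) &= -\sum_{x} p(x)\log p(x) && \text{(entropy)} \\
D(p\,\|\,q) &= \sum_{x} p(x)\log\frac{p(x)}{q(x)} && \text{(relative entropy)} \\
I(X;Y) &= \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} && \text{(mutual information)}
\end{align*}

Channel capacity is then C = \max_{p(x)} I(X;Y); for example, the memoryless Gaussian channel with power constraint P and noise power N has capacity C = \tfrac{1}{2}\log_2(1 + P/N) bits per channel use, the kind of closed-form limit derived in the course.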