Preface
Recurring themes
Insightful proof techniques
1 Entropy
Basics
Combinatorial properties
2 Entropy method in combinatorics
Binary vectors of average weight
Counting subgraphs
3 Kullback-Leibler Divergence
Definition
Differential entropy
Channel, conditional divergence
Chain Rule, DPI
4 Mutual Information
Definition, properties
Conditional MI
Probability of error, Fano’s inequality
5 Variational Characterizations
Geometric interpretation of MI
Lower variational bounds
Continuity
6 Extremization
Convexity
Minimax and saddle-point
Capacity, saddle point of MI
Capacity as information radius
7 Tensorization
MI, Gaussian saddle point
Information rates
8 f-Divergence
Definition
Information properties, MI
TV and Hellinger, hypothesis testing
Joint range
Rényi divergence
Variational characterizations
Fisher information, location family
Local χ² behavior
9 Statistical Decision Applications
Minimax and Bayes risks
A duality perspective
Sample complexity, tensor products
HCR lower bound
Bayesian perspective
Miscellany: MLE
10 Lossless compression
Variable-length source coding
Uniquely decodable codes
Fixed-length source coding
11 Hypothesis Testing, Large Deviations
Neyman-Pearson
Large deviations theory
12 Lecture notes
Oct 2: Fisher information, classical minimax estimation
Main content
One-parameter families; minimax rates
HCR inequality; Fisher information
Oct 7: Data compression
Oct 9: Data compression II
Review
Arithmetic encoder
Lempel-Ziv
Oct 28: Binary hypothesis testing
More on universal compression
Binary hypothesis testing
Nov 4: Channel coding
BHT, large deviations theory
I-projections
Channel coding
Nov 6: Channel coding II
Nov 18: Channel coding III
Decoding with constraint
Nov 20: Quantization
Water-filling
Metric entropy
Nov 25: Rate-distortion theorem
Scalar quantization
Vector quantization
Dec 4: Density estimation
Review of metric entropy
Dec 9: Strong DPI
Minimax stats
Combinatorial statistics
Dec 11: SDPI, distributed estimation
Spiked Wigner
Correlation estimation
Bibliography
6.7480 Notes
Bibliography
Polyanskiy, Yury, and Yihong Wu. 2025. Information Theory: From Coding to Learning. Cambridge University Press.