Lecture Videos (MP4 downloads)

1. What is information?
2. How to model uncertainty?
3. Basic concepts of probability
4. Estimates of random variables
5. Limit theorems
6. Review
7. Source model
8. Motivating examples
9. A compression problem
10. Shannon entropy
11. Random hash
12. Review 2
13. Uncertainty and randomness
14. Total variation distance
15. Generating almost random bits
16. Generating samples from a distribution using uniform randomness
17. Typical sets and entropy
18. Review 3
19. Hypothesis testing and estimation
20. Examples
21. The log-likelihood ratio test
22. Kullback-Leibler divergence and Stein's lemma
23. Properties of KL divergence
24. Review 4
25. Information per coin-toss
26. Multiple hypothesis testing
27. Error analysis of multiple hypothesis testing
28. Mutual information
29. Fano's inequality
30. Measures of information
31. Chain rules
32. Shape of measures of information
33. Data processing inequality
34. Midyear Review
35. Proof of Fano's inequality
36. Variational formulae
37. Capacity as information radius
38. Proof of Pinsker's inequality
39. Continuity of entropy
40. Lower bound for compression
41. Lower bound for hypothesis testing

Transcripts (English)

PDF transcripts are unavailable for all 41 chapters.


Books

No book links are available in any of the listed languages (English, Bengali, Gujarati, Hindi, Kannada, Malayalam, Marathi, Tamil, Telugu).