…  ix
Preface  xi

1 …  1
1.1 Information theory versus coding theory  1
1.2 Model and basic operations of information processing systems  2
1.3 …  4
1.4 Encoding a source alphabet  5
1.5 Octal and hexadecimal codes  8
…  9
…  11

2 Error-detecting codes  13
2.1 Review of modular arithmetic  13
2.2 Independent errors -- white noise  15
2.3 Single parity-check code  17
2.4 …  19
2.5 Simple burst error-detecting code  21
2.6 Alphabet plus number codes -- weighted codes  22
2.7 Trade-off between redundancy and error-detecting capability  27
…  30
…  30

3 Repetition and Hamming codes  31
3.1 Arithmetics in the binary field  33
3.2 Three-times repetition code  34
3.3 Hamming code  40
3.3.1 Some historical background  40
3.3.2 Encoding and error correction of the (7,4) Hamming code  42
3.3.3 Hamming bound: sphere packing  48
…  52
…  53

4 Data compression: efficient coding of a random message  55
4.1 …  55
4.2 Prefix-free or instantaneous codes  57
4.3 …  58
4.4 …  62
4.5 Trees with probabilities  65
4.6 Optimal codes: Huffman code  66
4.7 …  73
4.8 Some historical background  78
…  78
…  79

5 Entropy and Shannon's Source Coding Theorem  81
5.1 …  81
5.2 Uncertainty or entropy  86
5.2.1 …  86
5.2.2 Binary entropy function  88
5.2.3 The Information Theory Inequality  89
5.2.4 Bounds on the entropy  90
5.3 …  92
5.4 Bounds on the efficiency of codes  95
5.4.1 What we cannot do: fundamental limitations of source coding  95
5.4.2 What we can do: analysis of the best codes  97
5.4.3 Coding Theorem for a Single Random Message  101
5.5 Coding of an information source  103
5.6 Some historical background  108
5.7 …  110
5.8 Appendix: Uniqueness of the definition of entropy  111
…  112

6 Mutual information and channel capacity  115
6.1 …  115
6.2 …  116
6.3 The channel relationships  118
6.4 The binary symmetric channel  119
6.5 …  122
6.6 …  126
6.7 Definition of channel capacity  130
6.8 Capacity of the binary symmetric channel  131
6.9 Uniformly dispersive channel  134
6.10 Characterization of the capacity-achieving input distribution  136
6.11 Shannon's Channel Coding Theorem  138
6.12 Some historical background  140
…  141
…  141

7 Approaching the Shannon limit by turbo coding  143
7.1 Information Transmission Theorem  143
7.2 …  145
7.3 Transmission at a rate below capacity  146
7.4 Transmission at a rate above capacity  147
7.5 Turbo coding: an introduction  155
7.6 …  159
7.7 Appendix: Why we assume uniform and independent data at the encoder  160
7.8 Appendix: Definition of concavity  164
…  165

8 Other aspects of coding theory  167
8.1 Hamming code and projective geometry  167
8.2 Coding and game theory  175
…  180
…  182

References  183
Index  187