
E-book: Student's Guide to Coding and Information Theory

Stefan M. Moser (National Chiao Tung University, Taiwan), Po-Ning Chen (National Chiao Tung University, Taiwan)
  • Format: EPUB+DRM
  • Publication date: 26-Jan-2012
  • Publisher: Cambridge University Press
  • Language: eng
  • ISBN-13: 9781107086807

DRM restrictions

  • Copying:

    not allowed

  • Printing:

    not allowed

  • E-book usage:

    Digital rights management (DRM)
    The publisher has supplied this book in encrypted form, which means that free software must be installed in order to unlock and read it. To read this e-book you need to create an Adobe ID. More information here. The e-book can be downloaded to 6 devices (a single user with the same Adobe ID).

    Required software
    To read this e-book on a mobile device (phone or tablet), you need to install this free app: PocketBook Reader (iOS / Android)

    To read this e-book on a PC or Mac, you need Adobe Digital Editions (a free application designed specifically for e-books; it is not the same as Adobe Reader, which you probably already have on your computer).

    You cannot read this e-book on an Amazon Kindle.

This easy-to-read guide provides a concise introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. Background mathematics and specific engineering techniques are kept to a minimum so that only a basic knowledge of high-school mathematics is needed to understand the material covered. The authors begin with many practical applications in coding, including the repetition code, the Hamming code and the Huffman code. They then explain the corresponding information theory, from entropy and mutual information to channel capacity and the information transmission theorem. Finally, they provide insights into the connections between coding theory and other fields. Many worked examples are given throughout the book, using practical applications to illustrate theoretical definitions. Exercises are also included, enabling readers to double-check what they have learned and gain glimpses into more advanced topics, making this perfect for anyone who needs a quick introduction to the subject.
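As a taste of the book's approach, here is a minimal Python sketch (not taken from the book) of the three-times repetition code covered in Chapter 3: each bit is sent three times, and the decoder recovers it by majority vote, so any single error within a block of three is corrected.

```python
def encode(bits):
    """Repeat each bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Recover each bit by majority vote over blocks of three."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

codeword = encode([1, 0, 1])           # [1, 1, 1, 0, 0, 0, 1, 1, 1]
corrupted = codeword.copy()
corrupted[1] = 0                       # flip one bit in the first block
assert decode(corrupted) == [1, 0, 1]  # single error per block is corrected
```

The price of this reliability is redundancy: three channel bits per message bit, a trade-off the book develops into the Hamming code and, ultimately, Shannon's channel capacity.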

Reviews

'The book is nicely written, and is recommended as a textbook for a one-semester introductory course on coding and information theory.' Pushpa N. Rathie, Zentralblatt MATH

More information

Concise, easy-to-read guide, introducing beginners to the engineering background of modern communication systems, from mobile phones to data storage.
List of contributors
Preface
1 Introduction
1.1 Information theory versus coding theory
1.2 Model and basic operations of information processing systems
1.3 Information source
1.4 Encoding a source alphabet
1.5 Octal and hexadecimal codes
1.6 Outline of the book
References
2 Error-detecting codes
2.1 Review of modular arithmetic
2.2 Independent errors -- white noise
2.3 Single parity-check code
2.4 The ASCII code
2.5 Simple burst error-detecting code
2.6 Alphabet plus number codes -- weighted codes
2.7 Trade-off between redundancy and error-detecting capability
2.8 Further reading
References
3 Repetition and Hamming codes
3.1 Arithmetics in the binary field
3.2 Three-times repetition code
3.3 Hamming code
3.3.1 Some historical background
3.3.2 Encoding and error correction of the (7,4) Hamming code
3.3.3 Hamming bound: sphere packing
3.4 Further reading
References
4 Data compression: efficient coding of a random message
4.1 A motivating example
4.2 Prefix-free or instantaneous codes
4.3 Trees and codes
4.4 The Kraft Inequality
4.5 Trees with probabilities
4.6 Optimal codes: Huffman code
4.7 Types of codes
4.8 Some historical background
4.9 Further reading
References
5 Entropy and Shannon's Source Coding Theorem
5.1 Motivation
5.2 Uncertainty or entropy
5.2.1 Definition
5.2.2 Binary entropy function
5.2.3 The Information Theory Inequality
5.2.4 Bounds on the entropy
5.3 Trees revisited
5.4 Bounds on the efficiency of codes
5.4.1 What we cannot do: fundamental limitations of source coding
5.4.2 What we can do: analysis of the best codes
5.4.3 Coding Theorem for a Single Random Message
5.5 Coding of an information source
5.6 Some historical background
5.7 Further reading
5.8 Appendix: Uniqueness of the definition of entropy
References
6 Mutual information and channel capacity
6.1 Introduction
6.2 The channel
6.3 The channel relationships
6.4 The binary symmetric channel
6.5 System entropies
6.6 Mutual information
6.7 Definition of channel capacity
6.8 Capacity of the binary symmetric channel
6.9 Uniformly dispersive channel
6.10 Characterization of the capacity-achieving input distribution
6.11 Shannon's Channel Coding Theorem
6.12 Some historical background
6.13 Further reading
References
7 Approaching the Shannon limit by turbo coding
7.1 Information Transmission Theorem
7.2 The Gaussian channel
7.3 Transmission at a rate below capacity
7.4 Transmission at a rate above capacity
7.5 Turbo coding: an introduction
7.6 Further reading
7.7 Appendix: Why we assume uniform and independent data at the encoder
7.8 Appendix: Definition of concavity
References
8 Other aspects of coding theory
8.1 Hamming code and projective geometry
8.2 Coding and game theory
8.3 Further reading
References
References
Index
Stefan M. Moser received the diploma (MSc) in electrical engineering in 1999, the MSc degree in industrial management (MBA) in 2003, and the PhD (Dr. sc. techn.) in the field of information theory in 2004, all from ETH Zurich, Switzerland. From 1999 to 2003 he was a Research and Teaching Assistant, and from 2004 to 2005 a Senior Research Assistant, with the Signal and Information Processing Laboratory at ETH Zurich. From 2005 to 2013, he was a Professor with the Department of Electrical and Computer Engineering at National Chiao Tung University (NCTU), Hsinchu, Taiwan. Currently he is a Senior Researcher with the Signal and Information Processing Laboratory at ETH Zurich and an Adjunct Professor at National Chiao Tung University. His research interests are in information theory and digital communications. Dr Moser was the recipient of the Wu Ta-You Memorial Award by the National Science Council of Taiwan in 2012, and the Best Paper Award for Young Scholars by the IEEE Communications Society Taipei and Tainan Chapters and the IEEE Information Theory Society Taipei Chapter in 2009. Further, he has received various awards from the National Chiao Tung University, including two awards for outstanding teaching (in 2007 and 2012), and was presented with the Willi Studer Award of ETH and the ETH Medal (both in 1999), and the Sandoz (Novartis) Basler Maturandenpreis (1993).

Po-Ning Chen is a Professor in the Department of Electrical Engineering at the National Chiao Tung University (NCTU). Amongst his awards, he has received the 2000 Young Scholar Paper Award from Academia Sinica. He was also selected as the Outstanding Tutor Teacher of NCTU in 2002 and he received the Distinguished Teaching Award from the College of Electrical and Computer Engineering in 2003.