A Student's Guide to Coding and Information Theory
This easy-to-read guide provides a concise introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. Background mathematics and specific engineering techniques are kept to a minimum so that only a basic knowledge of high-school mathematics is needed to understand the material covered.
| Main Authors | Moser, Stefan M.; Chen, Po-Ning |
|---|---|
| Format | eBook; Book |
| Language | English |
| Published | Cambridge: Cambridge University Press, 26.01.2012 |
| Edition | 1 |
| Subjects | |
| Online Access | Get full text |
| ISBN | 1107601967, 1107015839, 9781107015838, 9781107601963 |
| DOI | 10.1017/CBO9781139059534 |
Table of Contents:
- Cover
- A Student's Guide to Coding and Information Theory
- Title
- Copyright
- Contents
- Contributors
- Preface
- 1 Introduction
  - 1.1 Information theory versus coding theory
  - 1.2 Model and basic operations of information processing systems
  - 1.3 Information source
  - 1.4 Encoding a source alphabet
  - 1.5 Octal and hexadecimal codes
  - 1.6 Outline of the book
  - References
- 2 Error-detecting codes
  - 2.1 Review of modular arithmetic
  - 2.2 Independent errors - white noise
  - 2.3 Single parity-check code
  - 2.4 The ASCII code
  - 2.5 Simple burst error-detecting code
  - 2.6 Alphabet plus number codes - weighted codes
  - 2.7 Trade-off between redundancy and error-detecting capability
  - 2.8 Further reading
  - References
- 3 Repetition and Hamming codes
  - 3.1 Arithmetics in the binary field
  - 3.2 Three-times repetition code
  - 3.3 Hamming code
    - 3.3.1 Some historical background
    - 3.3.2 Encoding and error correction of the (7,4) Hamming code
    - 3.3.3 Hamming bound: sphere packing
  - 3.4 Further reading
  - References
- 4 Data compression: efficient coding of a random message
  - 4.1 A motivating example
  - 4.2 Prefix-free or instantaneous codes
  - 4.3 Trees and codes
  - 4.4 The Kraft Inequality
  - 4.5 Trees with probabilities
  - 4.6 Optimal codes: Huffman code
  - 4.7 Types of codes
  - 4.8 Some historical background
  - 4.9 Further reading
  - References
- 5 Entropy and Shannon's Source Coding Theorem
  - 5.1 Motivation
  - 5.2 Uncertainty or entropy
    - 5.2.1 Definition
    - 5.2.2 Binary entropy function
    - 5.2.3 The Information Theory Inequality
    - 5.2.4 Bounds on the entropy
  - 5.3 Trees revisited
  - 5.4 Bounds on the efficiency of codes
    - 5.4.1 What we cannot do: fundamental limitations of source coding
    - 5.4.2 What we can do: analysis of the best codes
    - 5.4.3 Coding Theorem for a Single Random Message
  - 5.5 Coding of an information source
  - 5.6 Some historical background
  - 5.7 Further reading
  - 5.8 Appendix: Uniqueness of the definition of entropy
  - References
- 6 Mutual information and channel capacity
  - 6.1 Introduction
  - 6.2 The channel
  - 6.3 The channel relationships
  - 6.4 The binary symmetric channel
  - 6.5 System entropies
  - 6.6 Mutual information
  - 6.7 Definition of channel capacity
  - 6.8 Capacity of the binary symmetric channel
  - 6.9 Uniformly dispersive channel
  - 6.10 Characterization of the capacity-achieving input distribution
  - 6.11 Shannon's Channel Coding Theorem
  - 6.12 Some historical background
  - 6.13 Further reading
  - References
- 7 Approaching the Shannon limit by turbo coding
  - 7.1 Information Transmission Theorem
  - 7.2 The Gaussian channel
  - 7.3 Transmission at a rate below capacity
  - 7.4 Transmission at a rate above capacity
  - 7.5 Turbo coding: an introduction
  - 7.6 Further reading
  - 7.7 Appendix: Why we assume uniform and independent data at the encoder
  - 7.8 Appendix: Definition of concavity
  - References
- 8 Other aspects of coding theory
  - 8.1 Hamming code and projective geometry
  - 8.2 Coding and game theory
  - 8.3 Further reading
  - References
- References
- Index