A student's guide to coding and information theory / Stefan M. Moser, Po-Ning Chen [electronic resource]
Author | Moser, Stefan M.
Publisher | Cambridge : Cambridge University Press, 2012
Physical description | 1 online resource (xiii, 191 pages) : digital, PDF file(s)
Discipline | 003.54
Topical subjects |
Coding theory
Information theory
ISBN |
1-107-23030-6
1-107-08680-9
1-280-77481-9
1-139-22305-4
9786613685209
1-139-22134-5
1-139-05953-X
1-139-21825-5
1-139-21516-7
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Cover; A Student's Guide to Coding and Information Theory; Title; Copyright; Contents; Contributors; Preface; 1 Introduction; 1.1 Information theory versus coding theory; 1.2 Model and basic operations of information processing systems; 1.3 Information source; 1.4 Encoding a source alphabet; 1.5 Octal and hexadecimal codes; 1.6 Outline of the book; References; 2 Error-detecting codes; 2.1 Review of modular arithmetic; 2.2 Independent errors - white noise; 2.3 Single parity-check code; 2.4 The ASCII code; 2.5 Simple burst error-detecting code; 2.6 Alphabet plus number codes - weighted codes; 2.7 Trade-off between redundancy and error-detecting capability; 2.8 Further reading; References; 3 Repetition and Hamming codes; 3.1 Arithmetics in the binary field; 3.2 Three-times repetition code; 3.3 Hamming code; 3.3.1 Some historical background; 3.3.2 Encoding and error correction of the (7,4) Hamming code; 3.3.3 Hamming bound: sphere packing; 3.4 Further reading; References; 4 Data compression: efficient coding of a random message; 4.1 A motivating example; 4.2 Prefix-free or instantaneous codes; 4.3 Trees and codes; 4.4 The Kraft Inequality; 4.5 Trees with probabilities; 4.6 Optimal codes: Huffman code; 4.7 Types of codes; 4.8 Some historical background; 4.9 Further reading; References; 5 Entropy and Shannon's Source Coding Theorem; 5.1 Motivation; 5.2 Uncertainty or entropy; 5.2.1 Definition; 5.2.2 Binary entropy function; 5.2.3 The Information Theory Inequality; 5.2.4 Bounds on the entropy; 5.3 Trees revisited; 5.4 Bounds on the efficiency of codes; 5.4.1 What we cannot do: fundamental limitations of source coding; 5.4.2 What we can do: analysis of the best codes; 5.4.3 Coding Theorem for a Single Random Message; 5.5 Coding of an information source; 5.6 Some historical background; 5.7 Further reading; 5.8 Appendix: Uniqueness of the definition of entropy; References; 6 Mutual information and channel capacity; 6.1 Introduction; 6.2 The channel; 6.3 The channel relationships; 6.4 The binary symmetric channel; 6.5 System entropies; 6.6 Mutual information; 6.7 Definition of channel capacity; 6.8 Capacity of the binary symmetric channel; 6.9 Uniformly dispersive channel; 6.10 Characterization of the capacity-achieving input distribution; 6.11 Shannon's Channel Coding Theorem; 6.12 Some historical background; 6.13 Further reading; References; 7 Approaching the Shannon limit by turbo coding; 7.1 Information Transmission Theorem; 7.2 The Gaussian channel; 7.3 Transmission at a rate below capacity; 7.4 Transmission at a rate above capacity; 7.5 Turbo coding: an introduction; 7.6 Further reading; 7.7 Appendix: Why we assume uniform and independent data at the encoder; 7.8 Appendix: Definition of concavity; References; 8 Other aspects of coding theory; 8.1 Hamming code and projective geometry; 8.2 Coding and game theory; 8.3 Further reading; References; References; Index
Variant titles | A Student's Guide to Coding & Information Theory
Record no. | UNINA-9910452988603321
Held at: Univ. Federico II
A student's guide to coding and information theory / Stefan M. Moser, Po-Ning Chen [electronic resource]
Author | Moser, Stefan M.
Publisher | Cambridge : Cambridge University Press, 2012
Physical description | 1 online resource (xiii, 191 pages) : digital, PDF file(s)
Discipline | 003.54
Topical subjects |
Coding theory
Information theory
ISBN |
1-107-23030-6
1-107-08680-9
1-280-77481-9
1-139-22305-4
9786613685209
1-139-22134-5
1-139-05953-X
1-139-21825-5
1-139-21516-7
Classification | SK 170
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Variant titles | A Student's Guide to Coding & Information Theory
Record no. | UNINA-9910779101803321