
A student's guide to coding and information theory / Stefan M. Moser, Po-Ning Chen [electronic resource]




Author: Moser, Stefan M.
Title: A student's guide to coding and information theory / Stefan M. Moser, Po-Ning Chen [electronic resource]
Publication: Cambridge : Cambridge University Press, 2012
Physical description: 1 online resource (xiii, 191 pages) : digital, PDF file(s)
Dewey classification: 003.54
Topical subjects: Coding theory
Information theory
Classification: SK 170
Secondary author: Chen, Po-Ning
General note: Title from publisher's bibliographic system (viewed on 05 Oct 2015).
Bibliography note: Includes bibliographical references and index.
Contents note: Cover; A Student's Guide to Coding and Information Theory; Title; Copyright; Contents; Contributors; Preface; 1 Introduction; 1.1 Information theory versus coding theory; 1.2 Model and basic operations of information processing systems; 1.3 Information source; 1.4 Encoding a source alphabet; 1.5 Octal and hexadecimal codes; 1.6 Outline of the book; References; 2 Error-detecting codes; 2.1 Review of modular arithmetic; 2.2 Independent errors - white noise; 2.3 Single parity-check code; 2.4 The ASCII code; 2.5 Simple burst error-detecting code; 2.6 Alphabet plus number codes - weighted codes; 2.7 Trade-off between redundancy and error-detecting capability; 2.8 Further reading; References; 3 Repetition and Hamming codes; 3.1 Arithmetics in the binary field; 3.2 Three-times repetition code; 3.3 Hamming code; 3.3.1 Some historical background; 3.3.2 Encoding and error correction of the (7,4) Hamming code; 3.3.3 Hamming bound: sphere packing; 3.4 Further reading; References; 4 Data compression: efficient coding of a random message; 4.1 A motivating example; 4.2 Prefix-free or instantaneous codes; 4.3 Trees and codes; 4.4 The Kraft Inequality; 4.5 Trees with probabilities; 4.6 Optimal codes: Huffman code; 4.7 Types of codes; 4.8 Some historical background; 4.9 Further reading; References; 5 Entropy and Shannon's Source Coding Theorem; 5.1 Motivation; 5.2 Uncertainty or entropy; 5.2.1 Definition; 5.2.2 Binary entropy function; 5.2.3 The Information Theory Inequality; 5.2.4 Bounds on the entropy; 5.3 Trees revisited; 5.4 Bounds on the efficiency of codes; 5.4.1 What we cannot do: fundamental limitations of source coding; 5.4.2 What we can do: analysis of the best codes; 5.4.3 Coding Theorem for a Single Random Message; 5.5 Coding of an information source; 5.6 Some historical background; 5.7 Further reading; 5.8 Appendix: Uniqueness of the definition of entropy; References; 6 Mutual information and channel capacity; 6.1 Introduction; 6.2 The channel; 6.3 The channel relationships; 6.4 The binary symmetric channel; 6.5 System entropies; 6.6 Mutual information; 6.7 Definition of channel capacity; 6.8 Capacity of the binary symmetric channel; 6.9 Uniformly dispersive channel; 6.10 Characterization of the capacity-achieving input distribution; 6.11 Shannon's Channel Coding Theorem; 6.12 Some historical background; 6.13 Further reading; References; 7 Approaching the Shannon limit by turbo coding; 7.1 Information Transmission Theorem; 7.2 The Gaussian channel; 7.3 Transmission at a rate below capacity; 7.4 Transmission at a rate above capacity; 7.5 Turbo coding: an introduction; 7.6 Further reading; 7.7 Appendix: Why we assume uniform and independent data at the encoder; 7.8 Appendix: Definition of concavity; References; 8 Other aspects of coding theory; 8.1 Hamming code and projective geometry; 8.2 Coding and game theory; 8.3 Further reading; References; References; Index
Summary/abstract: This easy-to-read guide provides a concise introduction to the engineering background of modern communication systems, from mobile phones to data compression and storage. Background mathematics and specific engineering techniques are kept to a minimum so that only a basic knowledge of high-school mathematics is needed to understand the material covered. The authors begin with many practical applications in coding, including the repetition code, the Hamming code and the Huffman code. They then explain the corresponding information theory, from entropy and mutual information to channel capacity and the information transmission theorem. Finally, they provide insights into the connections between coding theory and other fields. Many worked examples are given throughout the book, using practical applications to illustrate theoretical definitions. Exercises are also included, enabling readers to double-check what they have learned and gain glimpses into more advanced topics, making this perfect for anyone who needs a quick introduction to the subject.
Variant titles: A Student's Guide to Coding & Information Theory
Authorized title: A student's guide to coding and information theory
ISBN: 1-107-23030-6
1-107-08680-9
1-280-77481-9
1-139-22305-4
9786613685209
1-139-22134-5
1-139-05953-X
1-139-21825-5
1-139-21516-7
Format: Printed material
Bibliographic level: Monograph
Language of publication: English
Record no.: 9910821690803321
Held at: Univ. Federico II
OPAC: Check availability here