LEADER 00919nam0 22002531i 450
001 UON00104275
005 20231205102607.616
100 $a20020107d1960 |0itac50 ba
101 $aeng
102 $aGB
105 $a|||| 1||||
200 1 $aPersonal column$fCharles Belgrave
210 $aLondon$cHutchinson and Co.$d1960
215 $a248 p.$d21 cm
620 $aGB$dLondon$3UONL003044
686 $aEMA I$cEMIRATI ARABI UNITI - OPERE INTERDISCIPLINARI, GUIDE$2A
700 1$aBELGRAVE$bCharles$3UONV030786$0650388
712 $aHutchinson & Co.$3UONV257511$4650
801 $aIT$bSOL$c20240220$gRICA
899 $aSIBA - SISTEMA BIBLIOTECARIO DI ATENEO$2UONSI
912 $aUON00104275
950 $aSIBA - SISTEMA BIBLIOTECARIO DI ATENEO$dSI EMA I 003 $eSI MR 71604 5 003
996 $aPersonal column$91311439
997 $aUNIOR

LEADER 05520nam 2200697 a 450
001 9910830822403321
005 20230721025459.0
010 $a1-280-84766-2
010 $a9786610847662
010 $a0-470-61242-8
010 $a0-470-39455-2
010 $a1-84704-574-X
035 $a(CKB)1000000000335558
035 $a(EBL)700734
035 $a(OCoLC)769341523
035 $a(SSID)ssj0000119980
035 $a(PQKBManifestationID)11130236
035 $a(PQKBTitleCode)TC0000119980
035 $a(PQKBWorkID)10080077
035 $a(PQKB)10370881
035 $a(MiAaPQ)EBC700734
035 $a(MiAaPQ)EBC261984
035 $a(Au-PeEL)EBL261984
035 $a(OCoLC)936813928
035 $a(EXLCZ)991000000000335558
100 $a20061002d2007 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aChannel coding in communication networks$b[electronic resource]$efrom theory to turbocodes /$fedited by Alain Glavieux
210 $aLondon ;$aNewport Beach, CA$cISTE$d2007
215 $a1 online resource (438 p.)
225 1 $aDigital signal and image processing series
300 $aDescription based upon print version of record.
311 $a1-905209-24-X
320 $aIncludes bibliographical references and index.
327 $aChannel Coding in Communication Networks; Table of Contents; Homage to Alain Glavieux; Chapter 1. Information Theory; 1.1. Introduction: the Shannon paradigm; 1.2. Principal coding functions; 1.2.1. Source coding; 1.2.2. Channel coding; 1.2.3. Cryptography; 1.2.4. Standardization of the Shannon diagram blocks; 1.2.5. Fundamental theorems; 1.3. Quantitative measurement of information; 1.3.1. Principle; 1.3.2. Measurement of self-information; 1.3.3. Entropy of a source; 1.3.4. Mutual information measure; 1.3.5. Channel capacity; 1.3.6. Comments on the measurement of information
327 $a1.4. Source coding; 1.4.1. Introduction; 1.4.2. Decodability, Kraft-McMillan inequality; 1.4.3. Demonstration of the fundamental theorem; 1.4.4. Outline of optimal algorithms of source coding; 1.5. Channel coding; 1.5.1. Introduction and statement of the fundamental theorem; 1.5.2. General comments; 1.5.3. Need for redundancy; 1.5.4. Example of the binary symmetric channel; 1.5.4.1. Hamming's metric; 1.5.4.2. Decoding with minimal Hamming distance; 1.5.4.3. Random coding; 1.5.4.4. Gilbert-Varshamov bound; 1.5.5. A geometrical interpretation; 1.5.6. Fundamental theorem: Gallager's proof
327 $a1.5.6.1. Upper bound of the probability of error; 1.5.6.2. Use of random coding; 1.5.6.3. Form of exponential limits; 1.6. Channels with continuous noise; 1.6.1. Introduction; 1.6.2. A reference model in physical reality: the channel with Gaussian additive noise; 1.6.3. Communication via a channel with additive white Gaussian noise; 1.6.3.1. Use of a finite alphabet, modulation; 1.6.3.2. Demodulation, decision margin; 1.6.4. Channel with fadings; 1.7. Information theory and channel coding; 1.8. Bibliography; Chapter 2. Block Codes; 2.1. Unstructured codes
327 $a2.1.1. The fundamental question of message redundancy; 2.1.2. Unstructured codes; 2.1.2.1. Code parameters; 2.1.2.2. Code, coding and decoding; 2.1.2.3. Bounds of code parameters; 2.2. Linear codes; 2.2.1. Introduction; 2.2.2. Properties of linear codes; 2.2.2.1. Minimum distance and minimum weight of a code; 2.2.2.2. Linear code base, coding; 2.2.2.3. Singleton bound; 2.2.3. Dual code; 2.2.3.1. Reminders of the Gaussian method; 2.2.3.2. Lateral classes of a linear code C; 2.2.3.3. Syndromes; 2.2.3.4. Decoding and syndromes; 2.2.3.5. Lateral classes, syndromes and decoding
327 $a2.2.3.6. Parity check matrix and minimum code weight; 2.2.3.7. Minimum distance of C and matrix H; 2.2.4. Some linear codes; 2.2.5. Decoding of linear codes; 2.3. Finite fields; 2.3.1. Basic concepts; 2.3.2. Polynomial modulo calculations: quotient ring; 2.3.3. Irreducible polynomial modulo calculations: finite field; 2.3.4. Order and the opposite of an element of F2[X]/(p(X)); 2.3.4.1. Order; 2.3.4.2. Properties of the order; 2.3.4.3. Primitive elements; 2.3.4.4. Use of the primitives; 2.3.4.5. How to find a primitive; 2.3.4.6. Exponentiation; 2.3.5. Minimum polynomials
327 $a2.3.6. The field of nth roots of unity
330 $aThis book provides a comprehensive overview of the subject of channel coding. It starts with a description of information theory, focusing on the quantitative measurement of information and introducing two fundamental theorems on source and channel coding. The basics of channel coding are then discussed in two chapters, block codes and convolutional codes, for which the authors introduce weighted input and output decoding algorithms and recursive systematic convolutional codes, which are used in the rest of the book. Trellis coded modulations, which have their primary applications in hi
410 0$aDigital signal and image processing series.
606 $aCoding theory
606 $aError-correcting codes (Information theory)
615 0$aCoding theory.
615 0$aError-correcting codes (Information theory)
676 $a003.54
676 $a003/.54
676 $a621.3821
701 $aGlavieux$bAlain$0912232
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910830822403321
996 $aChannel coding in communication networks$92042580
997 $aUNINA