LEADER 03321nam 22005415 450
001 996518463103316
005 20230526105708.0
010 $a3-031-21561-3
024 7 $a10.1007/978-3-031-21561-2
035 $a(CKB)5580000000524213
035 $a(DE-He213)978-3-031-21561-2
035 $a(EXLCZ)995580000000524213
100 $a20230315d2022 u| 0
101 0 $aeng
135 $aurnn#008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aInformation Theory$b[electronic resource] $eThree Theorems by Claude Shannon /$fby Antoine Chambert-Loir
205 $a1st ed. 2022.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2022.
215 $a1 online resource (XII, 209 p. 1 illus.)
225 1 $aLa Matematica per il 3+2,$x2038-5757 ;$v144
311 $a3-031-21560-5
327 $aElements of Theory of Probability -- Entropy and Mutual Information -- Coding -- Sampling -- Solutions to Exercises -- Bibliography -- Notation -- Index.
330 $aThis book provides an introduction to information theory, focussing on Shannon's three foundational theorems of 1948–1949. Shannon's first two theorems, based on the notion of entropy in probability theory, specify the extent to which a message can be compressed for fast transmission and how to correct errors associated with poor transmission. The third theorem, using Fourier theory, ensures that a signal can be reconstructed from a sufficiently fine sampling of it. These three theorems constitute the roadmap of the book. The first chapter studies the entropy of a discrete random variable and related notions. The second chapter, on compression and error correcting, introduces the concept of coding, proves the existence of optimal codes and good codes (Shannon's first theorem), and shows how information can be transmitted in the presence of noise (Shannon's second theorem). The third chapter proves the sampling theorem (Shannon's third theorem) and looks at its connections with other results, such as the Poisson summation formula. Finally, there is a discussion of the uncertainty principle in information theory. Featuring a good supply of exercises (with solutions), and an introductory chapter covering the prerequisites, this text stems from lectures given to mathematics and computer science students at the beginning graduate level.
410 0$aLa Matematica per il 3+2,$x2038-5757 ;$v144
606 $aComputer science--Mathematics
606 $aCoding theory
606 $aInformation theory
606 $aMathematics of Computing
606 $aCoding and Information Theory
606 $aTeoria de la informació$2thub
606 $aTeoria de la codificació$2thub
608 $aLlibres electrònics$2thub
615 0$aComputer science--Mathematics.
615 0$aCoding theory.
615 0$aInformation theory.
615 14$aMathematics of Computing.
615 24$aCoding and Information Theory.
615 7$aTeoria de la informació
615 7$aTeoria de la codificació
676 $a004.0151
700 $aChambert-Loir$bAntoine$4aut$4http://id.loc.gov/vocabulary/relators/aut$0285206
906 $aBOOK
912 $a996518463103316
996 $aInformation Theory$93091250
997 $aUNISA

LEADER 00785nam a2200253 i 4500
001 991004376937207536
005 20250415091711.0
008 250415s1988 it er 000 0 ita d
020 $a8804312963
040 $aBibl. Dip.le Aggr. Studi Umanistici - Sez. Filosofia$bita$cSocioculturale Scs
041 0 $aita
082 04$a823.914$223
100 1 $aAmis, Martin$0163740
240 10$aEinstein's monsters$91367798
245 12$aI mostri di Einstein /$cMartin Amis ; traduzione di Andrea Kerbaker e Sarah Thorne
260 $aMilano :$bA. Mondadori,$c1988
300 $a127 p. ;$c23 cm
490 1 $aOmnibus
700 1 $aKerbaker, Andrea
700 1 $aThorne, Sarah
830 0$aOmnibus
912 $a991004376937207536
996 $aEinstein's monsters$91367798
997 $aUNISALENTO