LEADER 04240nam 22007335 450
001 996508569903316
005 20230522105133.0
010 $a3-662-65875-5
024 7 $a10.1007/978-3-662-65875-8
035 $a(CKB)5580000000496805
035 $a(DE-He213)978-3-662-65875-8
035 $a(PPN)267807546
035 $a(MiAaPQ)EBC31005839
035 $a(Au-PeEL)EBL31005839
035 $a(EXLCZ)995580000000496805
100 $a20230102d2022 u| 0
101 0 $aeng
135 $aurnn#008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aNovelty, Information and Surprise$b[electronic resource] /$fby Günther Palm
205 $a2nd ed. 2022.
210 1$aBerlin, Heidelberg :$cSpringer Berlin Heidelberg :$cImprint: Springer,$d2022.
215 $a1 online resource (XX, 293 p. 1 illus.)
225 1 $aInformation Science and Statistics,$x2197-4128
311 $a3-662-65874-7
327 $aSurprise and Information of Descriptions: Prerequisites -- Improbability and Novelty of Descriptions -- Conditional Novelty and Information -- Coding and Information Transmission: On Guessing and Coding -- Information Transmission -- Information Rate and Channel Capacity: Stationary Processes and Information Rate -- Channel Capacity -- Shannon's Theorem -- Repertoires and Covers: Repertoires and Descriptions -- Novelty, Information and Surprise of Repertoires -- Conditioning, Mutual Information and Information Gain -- Information, Novelty and Surprise in Science: Information, Novelty and Surprise in Brain Theory -- Surprise from Repetitions and Combination of Surprises -- Entropy in Physics -- Generalized Information Theory: Order- and Lattice-Structures -- Three Orderings on Repertoires -- Information Theory on Lattices of Covers -- Bibliography -- Index.
330 $aThis revised edition offers an approach to information theory that is more general than the classical approach of Shannon. Classically, information is defined for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) with corresponding probabilities adding up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two concepts derived from it, designated as information and surprise, which describe "opposite" versions of novelty: information is related more to classical information theory, while surprise is related more to the classical concept of statistical significance. In the discussion of these three concepts and their interrelations, several properties or classes of covers are defined, which turn out to be lattices. The book also presents applications of these concepts, mostly in statistics and in neuroscience.
410 0$aInformation Science and Statistics,$x2197-4128
606 $aStatistics
606 $aBiomathematics
606 $aBiometry
606 $aPattern recognition systems
606 $aStatistical Theory and Methods
606 $aMathematical and Computational Biology
606 $aBiostatistics
606 $aAutomated Pattern Recognition
606 $aTeoria de la informació$2thub
606 $aBiomatemàtica$2thub
606 $aBiometria$2thub
606 $aReconeixement de formes (Informàtica)$2thub
608 $aLlibres electrònics$2thub
615 0$aStatistics.
615 0$aBiomathematics.
615 0$aBiometry.
615 0$aPattern recognition systems.
615 14$aStatistical Theory and Methods.
615 24$aMathematical and Computational Biology.
615 24$aBiostatistics.
615 24$aAutomated Pattern Recognition.
615 7$aTeoria de la informació
615 7$aBiomatemàtica
615 7$aBiometria
615 7$aReconeixement de formes (Informàtica)
676 $a519.5
700 $aPalm$b Günther$4aut$4http://id.loc.gov/vocabulary/relators/aut$0347025
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a996508569903316
996 $aNovelty, Information and Surprise$93091296
997 $aUNISA
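The abstract's central idea — defining information for a cover of possibly overlapping propositions rather than a partition — can be illustrated numerically. The sketch below assumes one common reading of the book's definition (novelty of an outcome as the largest improbability −log₂ P(A) over cover elements A containing it); the toy probability space and function names are hypothetical, not taken from the book.

```python
import math

# Hypothetical toy probability space: four equally likely outcomes.
P = {"a": 0.25, "b": 0.25, "c": 0.25, "d": 0.25}

def prob(event):
    """Probability of a set of outcomes."""
    return sum(P[w] for w in event)

def novelty(omega, cover):
    """Novelty of outcome omega under a cover of possibly overlapping
    propositions: the largest -log2 P(A) over cover elements A that
    contain omega (an assumed reading of the book's definition)."""
    return max(-math.log2(prob(A)) for A in cover if omega in A)

# For a partition (mutually exclusive propositions), novelty reduces
# to classical Shannon information -log2 P(cell containing omega).
partition = [{"a", "b"}, {"c", "d"}]
print(novelty("a", partition))  # -log2(0.5) = 1.0 bit

# For an overlapping cover, an outcome may lie in several propositions;
# the most improbable one determines its novelty.
cover = [{"a", "b", "c"}, {"a", "d"}]
print(novelty("a", cover))  # max(-log2(0.75), -log2(0.5)) = 1.0 bit
print(novelty("b", cover))  # only {"a","b","c"} applies: -log2(0.75)
```

The partition case matching Shannon's −log₂ P is the sense in which the cover-based definition generalizes the classical one.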