Compressed sensing in information processing / Gitta Kutyniok, Holger Rauhut, Robert J. Kunsch, editors
Publication/distribution | Cham, Switzerland : Birkhäuser, [2022]
Physical description | 1 online resource (549 pages)
Discipline | 621.3678
Series | Applied and numerical harmonic analysis
Topical subjects |
Compressed sensing (Telecommunication)
Information theory
Neural networks (Computer science) - Design and construction
Telecomunicació
Teoria de la informació
Xarxes neuronals (Informàtica)
Genre/form | Llibres electrònics
ISBN | 3-031-09745-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record no. | UNINA-9910619276503321
Held at: Univ. Federico II
Compressed sensing in information processing / Gitta Kutyniok, Holger Rauhut, Robert J. Kunsch, editors
Publication/distribution | Cham, Switzerland : Birkhäuser, [2022]
Physical description | 1 online resource (549 pages)
Discipline | 621.3678
Series | Applied and numerical harmonic analysis
Topical subjects |
Compressed sensing (Telecommunication)
Information theory
Neural networks (Computer science) - Design and construction
Telecomunicació
Teoria de la informació
Xarxes neuronals (Informàtica)
Genre/form | Llibres electrònics
ISBN | 3-031-09745-9
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Record no. | UNISA-996495169003316
Held at: Univ. di Salerno
Identification and other probabilistic models : Rudolf Ahlswede's lectures on information theory 6 / Rudolf Ahlswede ; editors, Alexander Ahlswede [and three others]
Author | Ahlswede Rudolf <1938->
Publication/distribution | Cham, Switzerland : Springer, [2021]
Physical description | 1 online resource (720 pages)
Discipline | 003.54
Series | Foundations in Signal Processing, Communications and Networking
Topical subjects |
Information theory
Teoria de la informació
Genre/form | Llibres electrònics
ISBN | 3-030-65072-3
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Intro -- Words and Introduction of the Editors -- Preface -- Preamble -- Contents -- Notation and Abbreviations -- Part I Identification via Channels -- Identification via Channels -- 1 Results and Preliminaries -- 1.1 Notation and Known Facts -- 1.1.1 Entropy and Information Quantities -- 1.1.2 Channels, Empirical Distributions, Generated Sequences -- 1.1.3 Elementary Properties of Typical Sequences and Generated Sequences -- 1.1.4 Formulation of the Classical Transmission Problem -- 1.2 Formulation of the Identification Problem -- 1.2.1 The Double Exponent Coding Theorem -- 2 The Direct Parts of the Coding Theorems -- 3 The Strong Converses -- 3.1 Analytic Proof of the Strong Converse -- 3.1.1 Proof of Lemma 25 -- 3.2 Combinatorial Proof of the Strong Converse -- 4 Discussion -- References -- Identification in the Presence of Feedback: A Discovery of New Capacity Formulas -- 1 The Results -- 2 Notation and Known Facts -- 3 New Proof of the Direct Part in Theorem 12 -- 4 Proof of the Direct Part of Theorem 40 -- 5 Proof of the Direct Part of Theorem 41 -- 6 Proof of the Converse Part of Theorem 40 -- 7 Proof of the Converse Part of Theorem 41 -- References -- On Identification via Multi-Way Channels with Feedback: Mystery Numbers -- 1 Introduction -- 2 Review of Known Concepts and Results -- 3 A General Model for Communication Systems -- 4 Classes of Feedback Strategies, Common Random Experiments and Their Mystery Numbers -- 5 Main Theorem and Consequences -- 6 A Method for Proving Converses in Case of Feedback -- 7 A 3-Step ID Scheme for the Noiseless BSC -- 8 Extension of the 3-Step ID Scheme to the DMC With and Without Feedback -- 9 Proof of Theorems 53 and 54 -- 10 Proof of Theorem 61, Optimality of Our Coding Scheme -- References -- Identification Without Randomization -- 1 Introduction and Results -- 2 Proof of Theorem 67.
3 Proof of Theorem 69 -- 4 Proof of Theorem 70 -- 5 Proof of Theorem 71 -- 6 Proof of Lemma 73 -- 7 Proof of Theorem 74 -- References -- Identification via Channels with Noisy Feedback -- 1 Introduction -- 2 Proof of Theorem 75 -- References -- Identification via Discrete Memoryless Wiretap Channels -- 1 Introduction -- 2 Proof of Theorem 87 -- References -- Part II A General Theory of Information Transfer -- Introduction -- References -- One Sender Answering Several Questions of Receivers -- 1 A General Communication Model for One Sender -- 2 Analysis of a Specific Model: K-Identification -- 3 Models with Capacity Equal to the Ordinary Capacity -- References -- Models with Prior Knowledge of the Receiver -- 1 Zero-error Decodable Hypergraphs -- 2 K-Separating Codes -- 3 Analysis of a Model with Specific Constraints: 2-Separation and Rényi's Entropy H2 -- 4 Binning via Channels -- 5 K-Identifiability, K-Separability and Related Notions -- References -- Models with Prior Knowledge at the Sender -- 1 Identification via Group Testing and a Stronger Form of the Rate-Distortion Theorem -- References -- Identification and Transmission with Multi-way Channels -- 1 Simultaneous Transfer: Transmission and Identification -- 2 A Proof of the Weak Converse to the Identification Coding Theorem for the DMC -- 3 Two Promised Results: Characterisation of the Capacity Regions for the MAC and the BC for Identification -- 4 The Proof for the MAC -- 5 The Proof for the BC -- References -- Data Compression -- 1 Noiseless Coding for Identification -- 2 Noiseless Coding for Multiple Purposes -- References -- Perspectives -- 1 Comparison of Identification Rate and Common Randomness Capacity: Identification Rate can Exceed Common Randomness Capacity and Vice Versa -- 2 Robustness, Common Randomness and Identification. 
3 Beyond Information Theory: Identification as a New Concept of Solution for Probabilistic Algorithms -- References -- Part III Identification, Mystery Numbers, or Common Randomness -- The Role of Common Randomness in Information Theory and Cryptography: Secrecy Constraints -- 1 Introduction -- 2 Generating a Shared Secret Key When the Third Party Has No Side Information -- 3 Secret Sharing When the Third Party Has Side Information -- 4 Proofs -- 5 Conclusions -- References -- Common Randomness in Information Theory and Cryptography CR Capacity -- 1 Introduction -- 2 Preliminaries -- 2.1 Model (i): Two-Source with One-Way Communication -- 2.2 Model (ii): DMC with Active Feedback -- 2.3 Model (iii): Two-Source with Two-Way Noiseless Communication -- 2.4 Models with Robust CR -- 3 Some General Results -- 4 Common Randomness in Models (i), (ii), and (iii) -- 5 Common Randomness, Identification, and Transmission for Arbitrarily Varying Channels -- 5.1 Model (A): AVC Without Feedback and Any Other Side Information -- 5.2 Model (B): AVC with Noiseless (Passive) Feedback -- 5.3 Model (C): Strongly Arbitrarily Varying Channel (SAVC) -- References -- Watermarking Identification Codes with Related Topics on Common Randomness -- 1 Introduction -- 2 The Notation -- 3 The Models -- 3.1 Watermarking Identification Codes -- 3.2 The Common Randomness -- 3.3 The Models for Compound Channels -- 4 The Results -- 4.1 The Results on Common Randomness -- 4.2 The Results on Watermarking Identification Codes -- 4.3 A Result on Watermarking Transmission Code with a Common Experiment Introduced by Steinberg-Merhav -- 5 The Direct Theorems for Common Randomness -- 6 The Converse Theorems for Common Randomness -- 7 Construction of Watermarking Identification Codes from Common Randomness -- 8 A Converse Theorem of a Watermarking Coding Theorem Due to Steinberg-Merhav. 
References -- Transmission, Identification and Common Randomness Capacities for Wire-Tap Channels with Secure Feedback from the Decoder -- 1 Introduction -- 2 Notation and Definitions -- 3 Previous and Auxiliary Results -- 4 The Coding Theorem for Transmission and Its Proof -- 5 Capacity of Two Special Families of Wire-Tap Channels -- 6 Discussion: Transmission, Building Common Randomness and Identification -- 7 The Secure Common Randomness Capacity in the Presence of Secure Feedback -- 8 The Secure Identification Capacity in the Presence of Secure Feedback -- References -- Secrecy Systems for Identification Via Channels with Additive-Like Instantaneous Block Encipherer -- 1 Introduction -- 2 Background -- 3 Model -- 4 Main Result -- References -- Part IV Identification for Sources, Identification Entropy, and Hypothesis Testing -- Identification for Sources -- 1 Introduction -- 1.1 Pioneering Model -- 1.1.1 Further Models and Definitions -- 2 A Probabilistic Tool for Generalized Identification -- 3 The Uniform Distribution -- 4 Bounds on L(P) for General P=(P1,…,PN) -- 4.1 An Upper Bound -- 5 An Average Identification Length -- 5.1 Q is the Uniform Distribution on V=U -- 5.2 The Example Above in Model GID with Average Identification Length for a Uniform Q* -- References -- Identification Entropy -- 1 Introduction -- 2 Noiseless Identification for Sources and Basic Concept of Performance -- 3 Examples for Huffman Codes -- 4 An Identification Code Universally Good for all P on U={1,2,…,N} -- 5 Identification Entropy HI(P) and Its Role as Lower Bound -- 6 On Properties of (PN) -- 6.1 A First Idea -- 6.2 A Rearrangement -- 7 Upper Bounds on (PN) -- 8 The Skeleton -- 9 Directions for Research -- References -- An Interpretation of Identification Entropy -- 1 Introduction -- 1.1 Terminology -- 1.2 A New Terminology Involving Proper Common Prefices.
1.3 Matrix Notation -- 2 An Operational Justification of ID-Entropy as Lower Bound for LC(P,P) -- 3 An Alternative Proof of the ID-Entropy Lower Bound for LC(P,P) -- 4 Sufficient and Necessary Conditions for a Prefix Code C to Achieve the ID-Entropy Lower Bound of LC(P,P) -- 5 A Global Balance Principle to Find Good Codes -- 6 Comments on Generalized Entropies -- References -- L-Identification for Sources -- 1 Introduction -- 2 Definitions and Notation -- 2.1 Source Coding and Code Trees -- 2.2 L-Identification -- 3 Two New Results for (1-)Identification -- 3.1 (1-)Identification for Block Codes -- 3.2 An Improved Upper Bound for Binary Codes -- 4 L-Identification for the Uniform Distribution -- 4.1 Colexicographic Balanced Huffman Trees -- 4.2 An Asymptotic Theorem -- 5 Two-Identification for General Distributions -- 5.1 An Asymptotic Approach -- 5.2 The q-ary Identification Entropy of Second Degree -- 5.3 An Upper Bound for Binary Codes -- 6 L-Identification for General Distributions -- 7 L-Identification for Sets -- 8 Open Problems -- 8.1 Induction Base for the Proof of Proposition 243 -- 8.2 L-Identification for Block Codes -- 8.3 L-Identification for Sets for General Distributions -- Appendix -- References -- Testing of Hypotheses and Identification -- 1 Preliminaries: Testing of Hypotheses and L1-Distance -- 2 Measures Separated in L1-Metrics -- 3 Identification Codes or "How Large is the Set of all Output Measures for Noisy Channel?" -- Appendix -- References -- On Logarithmically Asymptotically Optimal Testing of Hypotheses and Identification -- 1 Problem Statement -- 2 Background -- 3 Identification Problem for Model with Independent Objects -- 4 Identification Problem for Models with Different Objects -- 5 Identification of the Probability Distribution of an Object -- 6 r-Identification and Ranking Problems -- 7 Conclusion and Extensions of Problems.
Record no. | UNINA-9910488706203321
Held at: Univ. Federico II
Identification and other probabilistic models : Rudolf Ahlswede's lectures on information theory 6 / Rudolf Ahlswede ; editors, Alexander Ahlswede [and three others]
Author | Ahlswede Rudolf <1938->
Publication/distribution | Cham, Switzerland : Springer, [2021]
Physical description | 1 online resource (720 pages)
Discipline | 003.54
Series | Foundations in Signal Processing, Communications and Networking
Topical subjects |
Information theory
Teoria de la informació
Genre/form | Llibres electrònics
ISBN | 3-030-65072-3
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Intro -- Words and Introduction of the Editors -- Preface -- Preamble -- Contents -- Notation and Abbreviations -- Part I Identification via Channels -- Identification via Channels -- 1 Results and Preliminaries -- 1.1 Notation and Known Facts -- 1.1.1 Entropy and Information Quantities -- 1.1.2 Channels, Empirical Distributions, Generated Sequences -- 1.1.3 Elementary Properties of Typical Sequences and Generated Sequences -- 1.1.4 Formulation of the Classical Transmission Problem -- 1.2 Formulation of the Identification Problem -- 1.2.1 The Double Exponent Coding Theorem -- 2 The Direct Parts of the Coding Theorems -- 3 The Strong Converses -- 3.1 Analytic Proof of the Strong Converse -- 3.1.1 Proof of Lemma 25 -- 3.2 Combinatorial Proof of the Strong Converse -- 4 Discussion -- References -- Identification in the Presence of Feedback: A Discovery of New Capacity Formulas -- 1 The Results -- 2 Notation and Known Facts -- 3 New Proof of the Direct Part in Theorem 12 -- 4 Proof of the Direct Part of Theorem 40 -- 5 Proof of the Direct Part of Theorem 41 -- 6 Proof of the Converse Part of Theorem 40 -- 7 Proof of the Converse Part of Theorem 41 -- References -- On Identification via Multi-Way Channels with Feedback: Mystery Numbers -- 1 Introduction -- 2 Review of Known Concepts and Results -- 3 A General Model for Communication Systems -- 4 Classes of Feedback Strategies, Common Random Experiments and Their Mystery Numbers -- 5 Main Theorem and Consequences -- 6 A Method for Proving Converses in Case of Feedback -- 7 A 3-Step ID Scheme for the Noiseless BSC -- 8 Extension of the 3-Step ID Scheme to the DMC With and Without Feedback -- 9 Proof of Theorems 53 and 54 -- 10 Proof of Theorem 61, Optimality of Our Coding Scheme -- References -- Identification Without Randomization -- 1 Introduction and Results -- 2 Proof of Theorem 67.
3 Proof of Theorem 69 -- 4 Proof of Theorem 70 -- 5 Proof of Theorem 71 -- 6 Proof of Lemma 73 -- 7 Proof of Theorem 74 -- References -- Identification via Channels with Noisy Feedback -- 1 Introduction -- 2 Proof of Theorem 75 -- References -- Identification via Discrete Memoryless Wiretap Channels -- 1 Introduction -- 2 Proof of Theorem 87 -- References -- Part II A General Theory of Information Transfer -- Introduction -- References -- One Sender Answering Several Questions of Receivers -- 1 A General Communication Model for One Sender -- 2 Analysis of a Specific Model: K-Identification -- 3 Models with Capacity Equal to the Ordinary Capacity -- References -- Models with Prior Knowledge of the Receiver -- 1 Zero-error Decodable Hypergraphs -- 2 K-Separating Codes -- 3 Analysis of a Model with Specific Constraints: 2-Separation and Rényi's Entropy H2 -- 4 Binning via Channels -- 5 K-Identifiability, K-Separability and Related Notions -- References -- Models with Prior Knowledge at the Sender -- 1 Identification via Group Testing and a Stronger Form of the Rate-Distortion Theorem -- References -- Identification and Transmission with Multi-way Channels -- 1 Simultaneous Transfer: Transmission and Identification -- 2 A Proof of the Weak Converse to the Identification Coding Theorem for the DMC -- 3 Two Promised Results: Characterisation of the Capacity Regions for the MAC and the BC for Identification -- 4 The Proof for the MAC -- 5 The Proof for the BC -- References -- Data Compression -- 1 Noiseless Coding for Identification -- 2 Noiseless Coding for Multiple Purposes -- References -- Perspectives -- 1 Comparison of Identification Rate and Common Randomness Capacity: Identification Rate can Exceed Common Randomness Capacity and Vice Versa -- 2 Robustness, Common Randomness and Identification. 
3 Beyond Information Theory: Identification as a New Concept of Solution for Probabilistic Algorithms -- References -- Part III Identification, Mystery Numbers, or Common Randomness -- The Role of Common Randomness in Information Theory and Cryptography: Secrecy Constraints -- 1 Introduction -- 2 Generating a Shared Secret Key When the Third Party Has No Side Information -- 3 Secret Sharing When the Third Party Has Side Information -- 4 Proofs -- 5 Conclusions -- References -- Common Randomness in Information Theory and Cryptography CR Capacity -- 1 Introduction -- 2 Preliminaries -- 2.1 Model (i): Two-Source with One-Way Communication -- 2.2 Model (ii): DMC with Active Feedback -- 2.3 Model (iii): Two-Source with Two-Way Noiseless Communication -- 2.4 Models with Robust CR -- 3 Some General Results -- 4 Common Randomness in Models (i), (ii), and (iii) -- 5 Common Randomness, Identification, and Transmission for Arbitrarily Varying Channels -- 5.1 Model (A): AVC Without Feedback and Any Other Side Information -- 5.2 Model (B): AVC with Noiseless (Passive) Feedback -- 5.3 Model (C): Strongly Arbitrarily Varying Channel (SAVC) -- References -- Watermarking Identification Codes with Related Topics on Common Randomness -- 1 Introduction -- 2 The Notation -- 3 The Models -- 3.1 Watermarking Identification Codes -- 3.2 The Common Randomness -- 3.3 The Models for Compound Channels -- 4 The Results -- 4.1 The Results on Common Randomness -- 4.2 The Results on Watermarking Identification Codes -- 4.3 A Result on Watermarking Transmission Code with a Common Experiment Introduced by Steinberg-Merhav -- 5 The Direct Theorems for Common Randomness -- 6 The Converse Theorems for Common Randomness -- 7 Construction of Watermarking Identification Codes from Common Randomness -- 8 A Converse Theorem of a Watermarking Coding Theorem Due to Steinberg-Merhav. 
References -- Transmission, Identification and Common Randomness Capacities for Wire-Tap Channels with Secure Feedback from the Decoder -- 1 Introduction -- 2 Notation and Definitions -- 3 Previous and Auxiliary Results -- 4 The Coding Theorem for Transmission and Its Proof -- 5 Capacity of Two Special Families of Wire-Tap Channels -- 6 Discussion: Transmission, Building Common Randomness and Identification -- 7 The Secure Common Randomness Capacity in the Presence of Secure Feedback -- 8 The Secure Identification Capacity in the Presence of Secure Feedback -- References -- Secrecy Systems for Identification Via Channels with Additive-Like Instantaneous Block Encipherer -- 1 Introduction -- 2 Background -- 3 Model -- 4 Main Result -- References -- Part IV Identification for Sources, Identification Entropy, and Hypothesis Testing -- Identification for Sources -- 1 Introduction -- 1.1 Pioneering Model -- 1.1.1 Further Models and Definitions -- 2 A Probabilistic Tool for Generalized Identification -- 3 The Uniform Distribution -- 4 Bounds on L(P) for General P=(P1,…,PN) -- 4.1 An Upper Bound -- 5 An Average Identification Length -- 5.1 Q is the Uniform Distribution on V=U -- 5.2 The Example Above in Model GID with Average Identification Length for a Uniform Q* -- References -- Identification Entropy -- 1 Introduction -- 2 Noiseless Identification for Sources and Basic Concept of Performance -- 3 Examples for Huffman Codes -- 4 An Identification Code Universally Good for all P on U={1,2,…,N} -- 5 Identification Entropy HI(P) and Its Role as Lower Bound -- 6 On Properties of (PN) -- 6.1 A First Idea -- 6.2 A Rearrangement -- 7 Upper Bounds on (PN) -- 8 The Skeleton -- 9 Directions for Research -- References -- An Interpretation of Identification Entropy -- 1 Introduction -- 1.1 Terminology -- 1.2 A New Terminology Involving Proper Common Prefices.
1.3 Matrix Notation -- 2 An Operational Justification of ID-Entropy as Lower Bound for LC(P,P) -- 3 An Alternative Proof of the ID-Entropy Lower Bound for LC(P,P) -- 4 Sufficient and Necessary Conditions for a Prefix Code C to Achieve the ID-Entropy Lower Bound of LC(P,P) -- 5 A Global Balance Principle to Find Good Codes -- 6 Comments on Generalized Entropies -- References -- L-Identification for Sources -- 1 Introduction -- 2 Definitions and Notation -- 2.1 Source Coding and Code Trees -- 2.2 L-Identification -- 3 Two New Results for (1-)Identification -- 3.1 (1-)Identification for Block Codes -- 3.2 An Improved Upper Bound for Binary Codes -- 4 L-Identification for the Uniform Distribution -- 4.1 Colexicographic Balanced Huffman Trees -- 4.2 An Asymptotic Theorem -- 5 Two-Identification for General Distributions -- 5.1 An Asymptotic Approach -- 5.2 The q-ary Identification Entropy of Second Degree -- 5.3 An Upper Bound for Binary Codes -- 6 L-Identification for General Distributions -- 7 L-Identification for Sets -- 8 Open Problems -- 8.1 Induction Base for the Proof of Proposition 243 -- 8.2 L-Identification for Block Codes -- 8.3 L-Identification for Sets for General Distributions -- Appendix -- References -- Testing of Hypotheses and Identification -- 1 Preliminaries: Testing of Hypotheses and L1-Distance -- 2 Measures Separated in L1-Metrics -- 3 Identification Codes or "How Large is the Set of all Output Measures for Noisy Channel?" -- Appendix -- References -- On Logarithmically Asymptotically Optimal Testing of Hypotheses and Identification -- 1 Problem Statement -- 2 Background -- 3 Identification Problem for Model with Independent Objects -- 4 Identification Problem for Models with Different Objects -- 5 Identification of the Probability Distribution of an Object -- 6 r-Identification and Ranking Problems -- 7 Conclusion and Extensions of Problems.
Record no. | UNISA-996466396403316
Held at: Univ. di Salerno
Information Theory [electronic resource] : Three Theorems by Claude Shannon / by Antoine Chambert-Loir
Author | Chambert-Loir Antoine
Edition | [1st ed. 2022.]
Publication/distribution | Cham : Springer International Publishing : Imprint: Springer, 2022
Physical description | 1 online resource (XII, 209 p. 1 illus.)
Discipline | 004.0151
Series | La Matematica per il 3+2
Topical subjects |
Computer science - Mathematics
Coding theory
Information theory
Mathematics of Computing
Coding and Information Theory
Teoria de la informació
Teoria de la codificació
Genre/form | Llibres electrònics
ISBN | 3-031-21561-3
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Elements of Theory of Probability -- Entropy and Mutual Information -- Coding -- Sampling -- Solutions to Exercises -- Bibliography -- Notation -- Index.
Record no. | UNISA-996518463103316
Held at: Univ. di Salerno
Information Theory [electronic resource] : Three Theorems by Claude Shannon / by Antoine Chambert-Loir
Author | Chambert-Loir Antoine
Edition | [1st ed. 2022.]
Publication/distribution | Cham : Springer International Publishing : Imprint: Springer, 2022
Physical description | 1 online resource (XII, 209 p. 1 illus.)
Discipline | 004.0151
Series | La Matematica per il 3+2
Topical subjects |
Computer science - Mathematics
Coding theory
Information theory
Mathematics of Computing
Coding and Information Theory
Teoria de la informació
Teoria de la codificació
Genre/form | Llibres electrònics
ISBN | 3-031-21561-3
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Elements of Theory of Probability -- Entropy and Mutual Information -- Coding -- Sampling -- Solutions to Exercises -- Bibliography -- Notation -- Index.
Record no. | UNINA-9910682556703321
Held at: Univ. Federico II
Novelty, Information and Surprise [electronic resource] / by Günther Palm
Author | Palm Günther
Edition | [2nd ed. 2022.]
Publication/distribution | Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2022
Physical description | 1 online resource (XX, 293 p. 1 illus.)
Discipline | 519.5
Series | Information Science and Statistics
Topical subjects |
Statistics
Biomathematics
Biometry
Pattern recognition systems
Statistical Theory and Methods
Mathematical and Computational Biology
Biostatistics
Automated Pattern Recognition
Teoria de la informació
Biomatemàtica
Biometria
Reconeixement de formes (Informàtica)
Genre/form | Llibres electrònics
ISBN | 3-662-65875-5
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Surprise and Information of Descriptions: Prerequisites -- Improbability and Novelty of Descriptions -- Conditional Novelty and Information -- Coding and Information Transmission: On Guessing and Coding -- Information Transmission -- Information Rate and Channel Capacity: Stationary Processes and Information Rate -- Channel Capacity -- Shannon's Theorem -- Repertoires and Covers: Repertoires and Descriptions -- Novelty, Information and Surprise of Repertoires -- Conditioning, Mutual Information and Information Gain -- Information, Novelty and Surprise in Science: Information, Novelty and Surprise in Brain Theory -- Surprise from Repetitions and Combination of Surprises -- Entropy in Physics -- Generalized Information Theory: Order- and Lattice-Structures -- Three Orderings on Repertoires -- Information Theory on Lattices of Covers -- Bibliography -- Index.
Record no. | UNISA-996508569903316
Held at: Univ. di Salerno
Novelty, Information and Surprise / by Günther Palm
Author | Palm Günther
Edition | [2nd ed. 2022.]
Publication/distribution | Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2022
Physical description | 1 online resource (XX, 293 p. 1 illus.)
Discipline | 519.5
Series | Information Science and Statistics
Topical subjects |
Statistics
Biomathematics
Biometry
Pattern recognition systems
Statistical Theory and Methods
Mathematical and Computational Biology
Biostatistics
Automated Pattern Recognition
Teoria de la informació
Biomatemàtica
Biometria
Reconeixement de formes (Informàtica)
Genre/form | Llibres electrònics
ISBN | 3-662-65875-5
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Surprise and Information of Descriptions: Prerequisites -- Improbability and Novelty of Descriptions -- Conditional Novelty and Information -- Coding and Information Transmission: On Guessing and Coding -- Information Transmission -- Information Rate and Channel Capacity: Stationary Processes and Information Rate -- Channel Capacity -- Shannon's Theorem -- Repertoires and Covers: Repertoires and Descriptions -- Novelty, Information and Surprise of Repertoires -- Conditioning, Mutual Information and Information Gain -- Information, Novelty and Surprise in Science: Information, Novelty and Surprise in Brain Theory -- Surprise from Repetitions and Combination of Surprises -- Entropy in Physics -- Generalized Information Theory: Order- and Lattice-Structures -- Three Orderings on Repertoires -- Information Theory on Lattices of Covers -- Bibliography -- Index.
Record no. | UNINA-9910637729603321
Held at: Univ. Federico II