Einführung in die Kombinatorik / Konrad Jacobs, Dieter Jungnickel |
Author | Jacobs Konrad |
Edition | [2nd and extended ed.] |
Publication/distribution | Berlin ; New York : W. de Gruyter, 2004 |
Physical description | 1 online resource (420 p.) |
Discipline | 511 |
Other authors (persons) | Jungnickel Dieter |
Series | De Gruyter Lehrbuch |
Topical subjects |
Combinatorial analysis
Mathematical analysis |
ISBN |
1-282-19427-5
9786612194276
3-11-019799-5 |
Classification | SK 170 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | ger |
Contents note | Frontmatter -- Inhaltsverzeichnis -- I. Das kleine Einmaleins der Kombinatorik -- II. Der Heiratssatz und seine Verwandten -- III. Orthogonale lateinische Quadrate -- IV. Der Satz vom Diktator -- V. Fastperiodische 0-1-Folgen -- VI. Der Satz von Ramsey -- VII. Der Satz von van der Waerden -- VIII. Codes -- IX. Endliche projektive Ebenen und Räume -- X. Blockpläne -- XI. Symmetrische Blockpläne und Differenzmengen -- XII. Partitionen -- XIII. Die Abzähltheorie von Pólya -- XIV. Kombinatorische Betrachtungen topologischen Ursprungs -- XV. Spiele auf Graphen -- XVI. Spezielle Folgen von ganzen Zahlen -- Backmatter |
Record no. | UNINA-9910811797603321 |
Available at: Univ. Federico II |
|
Einführung in die Kombinatorik [electronic resource] / Konrad Jacobs, Dieter Jungnickel |
Author | Jacobs Konrad <1928-2015> |
Edition | [2nd and extended ed.] |
Publication/distribution | Berlin ; New York : W. de Gruyter, 2004 |
Physical description | 1 online resource (420 p.) |
Discipline | 511 |
Other authors (persons) | Jungnickel Dieter |
Series | De Gruyter Lehrbuch |
Topical subjects |
Combinatorial analysis
Mathematical analysis |
Genre/form subject | Electronic books. |
ISBN |
1-282-19427-5
9786612194276
3-11-019799-5 |
Classification | SK 170 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | ger |
Contents note | Frontmatter -- Inhaltsverzeichnis -- I. Das kleine Einmaleins der Kombinatorik -- II. Der Heiratssatz und seine Verwandten -- III. Orthogonale lateinische Quadrate -- IV. Der Satz vom Diktator -- V. Fastperiodische 0-1-Folgen -- VI. Der Satz von Ramsey -- VII. Der Satz von van der Waerden -- VIII. Codes -- IX. Endliche projektive Ebenen und Räume -- X. Blockpläne -- XI. Symmetrische Blockpläne und Differenzmengen -- XII. Partitionen -- XIII. Die Abzähltheorie von Pólya -- XIV. Kombinatorische Betrachtungen topologischen Ursprungs -- XV. Spiele auf Graphen -- XVI. Spezielle Folgen von ganzen Zahlen -- Backmatter |
Record no. | UNINA-9910454619903321 |
Available at: Univ. Federico II |
|
Einführung in die Kombinatorik [electronic resource] / Konrad Jacobs, Dieter Jungnickel |
Author | Jacobs Konrad <1928-2015> |
Edition | [2nd and extended ed.] |
Publication/distribution | Berlin ; New York : W. de Gruyter, 2004 |
Physical description | 1 online resource (420 p.) |
Discipline | 511 |
Other authors (persons) | Jungnickel Dieter |
Series | De Gruyter Lehrbuch |
Topical subjects |
Combinatorial analysis
Mathematical analysis |
ISBN |
1-282-19427-5
9786612194276
3-11-019799-5 |
Classification | SK 170 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | ger |
Contents note | Frontmatter -- Inhaltsverzeichnis -- I. Das kleine Einmaleins der Kombinatorik -- II. Der Heiratssatz und seine Verwandten -- III. Orthogonale lateinische Quadrate -- IV. Der Satz vom Diktator -- V. Fastperiodische 0-1-Folgen -- VI. Der Satz von Ramsey -- VII. Der Satz von van der Waerden -- VIII. Codes -- IX. Endliche projektive Ebenen und Räume -- X. Blockpläne -- XI. Symmetrische Blockpläne und Differenzmengen -- XII. Partitionen -- XIII. Die Abzähltheorie von Pólya -- XIV. Kombinatorische Betrachtungen topologischen Ursprungs -- XV. Spiele auf Graphen -- XVI. Spezielle Folgen von ganzen Zahlen -- Backmatter |
Record no. | UNINA-9910782523603321 |
Available at: Univ. Federico II |
|
A primer in combinatorics / Alexander Kheyfits |
Author | Kheyfits Alexander |
Edition | [2nd ed.] |
Publication/distribution | Berlin : Walter de Gruyter GmbH, [2010] |
Physical description | 1 online resource (344 pages) |
Discipline | 511.6 |
Series | De Gruyter Textbook |
Topical subjects |
Combinatorial analysis
Graph theory |
ISBN | 3-11-075118-6 |
Classification | SK 170 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | Intro -- Preface to the second edition -- Preface to the first edition -- Contents -- Part I: Introductory combinatorics and graph theory -- 1 Basic counting -- 2 Basic graph theory -- 3 Hierarchical clustering and dendrogram graphs -- Part II: Combinatorial analysis -- 4 Enumerative combinatorics -- 5 Existence theorems in combinatorics -- 6 Secondary structures of the RNA -- Answers/solutions to selected problems -- Bibliography -- Index. |
Record no. | UNINA-9910554249303321 |
Available at: Univ. Federico II |
|
A student's guide to coding and information theory [electronic resource] / Stefan M. Moser, Po-Ning Chen |
Author | Moser Stefan M. |
Publication/distribution | Cambridge : Cambridge University Press, 2012 |
Physical description | 1 online resource (xiii, 191 pages) : digital, PDF file(s) |
Discipline | 003.54 |
Topical subjects |
Coding theory
Information theory |
ISBN |
1-107-23030-6
1-107-08680-9
1-280-77481-9
1-139-22305-4
9786613685209
1-139-22134-5
1-139-05953-X
1-139-21825-5
1-139-21516-7 |
Classification | SK 170 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | Cover; A Student's Guide to Coding and Information Theory; Title; Copyright; Contents; Contributors; Preface; 1 Introduction; 1.1 Information theory versus coding theory; 1.2 Model and basic operations of information processing systems; 1.3 Information source; 1.4 Encoding a source alphabet; 1.5 Octal and hexadecimal codes; 1.6 Outline of the book; References; 2 Error-detecting codes; 2.1 Review of modular arithmetic; 2.2 Independent errors - white noise; 2.3 Single parity-check code; 2.4 The ASCII code; 2.5 Simple burst error-detecting code; 2.6 Alphabet plus number codes - weighted codes; 2.7 Trade-off between redundancy and error-detecting capability; 2.8 Further reading; References; 3 Repetition and Hamming codes; 3.1 Arithmetics in the binary field; 3.2 Three-times repetition code; 3.3 Hamming code; 3.3.1 Some historical background; 3.3.2 Encoding and error correction of the (7,4) Hamming code; 3.3.3 Hamming bound: sphere packing; 3.4 Further reading; References; 4 Data compression: efficient coding of a random message; 4.1 A motivating example; 4.2 Prefix-free or instantaneous codes; 4.3 Trees and codes; 4.4 The Kraft Inequality; 4.5 Trees with probabilities; 4.6 Optimal codes: Huffman code; 4.7 Types of codes; 4.8 Some historical background; 4.9 Further reading; References; 5 Entropy and Shannon's Source Coding Theorem; 5.1 Motivation; 5.2 Uncertainty or entropy; 5.2.1 Definition; 5.2.2 Binary entropy function; 5.2.3 The Information Theory Inequality; 5.2.4 Bounds on the entropy; 5.3 Trees revisited; 5.4 Bounds on the efficiency of codes; 5.4.1 What we cannot do: fundamental limitations of source coding; 5.4.2 What we can do: analysis of the best codes; 5.4.3 Coding Theorem for a Single Random Message; 5.5 Coding of an information source; 5.6 Some historical background; 5.7 Further reading; 5.8 Appendix: Uniqueness of the definition of entropy; References; 6 Mutual information and channel capacity; 6.1 Introduction; 6.2 The channel; 6.3 The channel relationships; 6.4 The binary symmetric channel; 6.5 System entropies; 6.6 Mutual information; 6.7 Definition of channel capacity; 6.8 Capacity of the binary symmetric channel; 6.9 Uniformly dispersive channel; 6.10 Characterization of the capacity-achieving input distribution; 6.11 Shannon's Channel Coding Theorem; 6.12 Some historical background; 6.13 Further reading; References; 7 Approaching the Shannon limit by turbo coding; 7.1 Information Transmission Theorem; 7.2 The Gaussian channel; 7.3 Transmission at a rate below capacity; 7.4 Transmission at a rate above capacity; 7.5 Turbo coding: an introduction; 7.6 Further reading; 7.7 Appendix: Why we assume uniform and independent data at the encoder; 7.8 Appendix: Definition of concavity; References; 8 Other aspects of coding theory; 8.1 Hamming code and projective geometry; 8.2 Coding and game theory; 8.3 Further reading; References; References; Index |
Variant titles | A Student's Guide to Coding & Information Theory |
Record no. | UNINA-9910779101803321 |
Available at: Univ. Federico II |
|
A student's guide to coding and information theory / Stefan M. Moser, Po-Ning Chen |
Author | Moser Stefan M. |
Edition | [1st ed.] |
Publication/distribution | Cambridge ; New York : Cambridge University Press, 2012 |
Physical description | 1 online resource (xiii, 191 pages) : digital, PDF file(s) |
Discipline | 003.54 |
Other authors (persons) | Chen Po-Ning |
Topical subjects |
Coding theory
Information theory |
ISBN |
1-107-23030-6
1-107-08680-9
1-280-77481-9
1-139-22305-4
9786613685209
1-139-22134-5
1-139-05953-X
1-139-21825-5
1-139-21516-7 |
Classification | SK 170 |
Format | Printed material |
Bibliographic level | Monograph |
Language of publication | eng |
Contents note | Cover; A Student's Guide to Coding and Information Theory; Title; Copyright; Contents; Contributors; Preface; 1 Introduction; 1.1 Information theory versus coding theory; 1.2 Model and basic operations of information processing systems; 1.3 Information source; 1.4 Encoding a source alphabet; 1.5 Octal and hexadecimal codes; 1.6 Outline of the book; References; 2 Error-detecting codes; 2.1 Review of modular arithmetic; 2.2 Independent errors - white noise; 2.3 Single parity-check code; 2.4 The ASCII code; 2.5 Simple burst error-detecting code; 2.6 Alphabet plus number codes - weighted codes; 2.7 Trade-off between redundancy and error-detecting capability; 2.8 Further reading; References; 3 Repetition and Hamming codes; 3.1 Arithmetics in the binary field; 3.2 Three-times repetition code; 3.3 Hamming code; 3.3.1 Some historical background; 3.3.2 Encoding and error correction of the (7,4) Hamming code; 3.3.3 Hamming bound: sphere packing; 3.4 Further reading; References; 4 Data compression: efficient coding of a random message; 4.1 A motivating example; 4.2 Prefix-free or instantaneous codes; 4.3 Trees and codes; 4.4 The Kraft Inequality; 4.5 Trees with probabilities; 4.6 Optimal codes: Huffman code; 4.7 Types of codes; 4.8 Some historical background; 4.9 Further reading; References; 5 Entropy and Shannon's Source Coding Theorem; 5.1 Motivation; 5.2 Uncertainty or entropy; 5.2.1 Definition; 5.2.2 Binary entropy function; 5.2.3 The Information Theory Inequality; 5.2.4 Bounds on the entropy; 5.3 Trees revisited; 5.4 Bounds on the efficiency of codes; 5.4.1 What we cannot do: fundamental limitations of source coding; 5.4.2 What we can do: analysis of the best codes; 5.4.3 Coding Theorem for a Single Random Message; 5.5 Coding of an information source; 5.6 Some historical background; 5.7 Further reading; 5.8 Appendix: Uniqueness of the definition of entropy; References; 6 Mutual information and channel capacity; 6.1 Introduction; 6.2 The channel; 6.3 The channel relationships; 6.4 The binary symmetric channel; 6.5 System entropies; 6.6 Mutual information; 6.7 Definition of channel capacity; 6.8 Capacity of the binary symmetric channel; 6.9 Uniformly dispersive channel; 6.10 Characterization of the capacity-achieving input distribution; 6.11 Shannon's Channel Coding Theorem; 6.12 Some historical background; 6.13 Further reading; References; 7 Approaching the Shannon limit by turbo coding; 7.1 Information Transmission Theorem; 7.2 The Gaussian channel; 7.3 Transmission at a rate below capacity; 7.4 Transmission at a rate above capacity; 7.5 Turbo coding: an introduction; 7.6 Further reading; 7.7 Appendix: Why we assume uniform and independent data at the encoder; 7.8 Appendix: Definition of concavity; References; 8 Other aspects of coding theory; 8.1 Hamming code and projective geometry; 8.2 Coding and game theory; 8.3 Further reading; References; References; Index |
Record no. | UNINA-9910821690803321 |
Available at: Univ. Federico II |
|