Random graphs for statistical pattern recognition [electronic resource] / David J. Marchette
Author | Marchette, David J.
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2004
Physical description | 1 online resource (261 p.)
Dewey class | 511.5; 511/.5
Series | Wiley series in probability and statistics
Subjects | Random graphs; Pattern perception - Statistical methods; Pattern recognition systems
Genre/form | Electronic books
ISBN | 1-280-27535-9; 9786610275359; 0-470-34946-8; 0-471-72208-1; 0-471-72209-X
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Random Graphs for Statistical Pattern Recognition; Contents; Preface; Acknowledgments; 1 Preliminaries; 1.1 Graphs and Digraphs; 1.1.1 Graphs; 1.1.2 Digraphs; 1.1.3 Random Graphs; 1.2 Statistical Pattern Recognition; 1.2.1 Classification; 1.2.2 Curse of Dimensionality; 1.2.3 Clustering; 1.3 Statistical Issues; 1.4 Applications; 1.4.1 Artificial Nose; 1.4.2 Hyperspectral Image; 1.4.3 Gene Expression; 1.5 Further Reading; 2 Computational Geometry; 2.1 Introduction; 2.2 Voronoi Cells and Delaunay Triangularization; 2.2.1 Poisson Voronoi Cells; 2.3 Alpha Hulls; 2.4 Minimum Spanning Trees; 2.4.1 Alpha Hulls and the MST; 2.4.2 Clustering; 2.4.3 Classification Complexity; 2.4.4 Application: Renyi Divergence; 2.4.5 Application: Image Segmentation; 2.5 Further Reading; 3 Neighborhood Graphs; 3.1 Introduction; 3.1.1 Application: Image Processing; 3.2 Nearest-Neighbor Graphs; 3.3 k-Nearest-Neighbor Graphs; 3.3.1 Application: Measures of Association; 3.3.2 Application: Artificial Nose; 3.3.3 Application: Outlier Detection; 3.3.4 Application: Dimensionality Reduction; 3.4 Relative Neighborhood Graphs; 3.5 Gabriel Graphs; 3.5.1 Gabriel Graphs and Alpha Hulls; 3.5.2 Application: Nearest-Neighbor Prototypes; 3.6 Sphere-of-Influence Graphs; 3.7 Sphere-of-Attraction Graphs; 3.8 Other Relatives; 3.9 Asymptotics; 3.10 Further Reading; 4 Class Cover Catch Digraphs; 4.1 Catch Digraphs; 4.1.1 Sphere Digraphs; 4.2 Class Covers; 4.2.1 Basic Definitions; 4.3 Dominating Sets; 4.4 Distributional Results for Cn,m-graphs; 4.4.1 Univariate Case; 4.4.2 Multivariate CCCDs; 4.5 Characterizations; 4.6 Scale Dimension; 4.6.1 Application: Latent Class Discovery; 4.7 (a,b) Graphs; 4.8 CCCD Classification; 4.9 Homogeneous CCCDs; 4.10 Vector Quantization; 4.11 Random Walk Version; 4.11.1 Application: Face Detection; 4.12 Further Reading; 5 Cluster Catch Digraphs; 5.1 Basic Definitions; 5.2 Dominating Sets; 5.3 Connected Components; 5.4 Variable Metric Clustering; 6 Computational Methods; 6.1 Introduction; 6.2 Kd-Trees; 6.2.1 Data Structure; 6.2.2 Building the Tree; 6.2.3 Searching the Tree; 6.3 Class Cover Catch Digraphs; 6.4 Cluster Catch Digraphs; 6.5 Voronoi Regions and Delaunay Triangularizations; 6.6 Further Reading; References; Author Index; Subject Index
Record no. | UNINA-9910146082903321
Available at: Univ. Federico II
Random graphs for statistical pattern recognition [electronic resource] / David J. Marchette
Author | Marchette, David J.
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2004
Physical description | 1 online resource (261 p.)
Dewey class | 511.5; 511/.5
Series | Wiley series in probability and statistics
Subjects | Random graphs; Pattern perception - Statistical methods; Pattern recognition systems
ISBN | 1-280-27535-9; 9786610275359; 0-470-34946-8; 0-471-72208-1; 0-471-72209-X
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Random Graphs for Statistical Pattern Recognition; Contents; Preface; Acknowledgments; 1 Preliminaries; 1.1 Graphs and Digraphs; 1.1.1 Graphs; 1.1.2 Digraphs; 1.1.3 Random Graphs; 1.2 Statistical Pattern Recognition; 1.2.1 Classification; 1.2.2 Curse of Dimensionality; 1.2.3 Clustering; 1.3 Statistical Issues; 1.4 Applications; 1.4.1 Artificial Nose; 1.4.2 Hyperspectral Image; 1.4.3 Gene Expression; 1.5 Further Reading; 2 Computational Geometry; 2.1 Introduction; 2.2 Voronoi Cells and Delaunay Triangularization; 2.2.1 Poisson Voronoi Cells; 2.3 Alpha Hulls; 2.4 Minimum Spanning Trees; 2.4.1 Alpha Hulls and the MST; 2.4.2 Clustering; 2.4.3 Classification Complexity; 2.4.4 Application: Renyi Divergence; 2.4.5 Application: Image Segmentation; 2.5 Further Reading; 3 Neighborhood Graphs; 3.1 Introduction; 3.1.1 Application: Image Processing; 3.2 Nearest-Neighbor Graphs; 3.3 k-Nearest-Neighbor Graphs; 3.3.1 Application: Measures of Association; 3.3.2 Application: Artificial Nose; 3.3.3 Application: Outlier Detection; 3.3.4 Application: Dimensionality Reduction; 3.4 Relative Neighborhood Graphs; 3.5 Gabriel Graphs; 3.5.1 Gabriel Graphs and Alpha Hulls; 3.5.2 Application: Nearest-Neighbor Prototypes; 3.6 Sphere-of-Influence Graphs; 3.7 Sphere-of-Attraction Graphs; 3.8 Other Relatives; 3.9 Asymptotics; 3.10 Further Reading; 4 Class Cover Catch Digraphs; 4.1 Catch Digraphs; 4.1.1 Sphere Digraphs; 4.2 Class Covers; 4.2.1 Basic Definitions; 4.3 Dominating Sets; 4.4 Distributional Results for Cn,m-graphs; 4.4.1 Univariate Case; 4.4.2 Multivariate CCCDs; 4.5 Characterizations; 4.6 Scale Dimension; 4.6.1 Application: Latent Class Discovery; 4.7 (a,b) Graphs; 4.8 CCCD Classification; 4.9 Homogeneous CCCDs; 4.10 Vector Quantization; 4.11 Random Walk Version; 4.11.1 Application: Face Detection; 4.12 Further Reading; 5 Cluster Catch Digraphs; 5.1 Basic Definitions; 5.2 Dominating Sets; 5.3 Connected Components; 5.4 Variable Metric Clustering; 6 Computational Methods; 6.1 Introduction; 6.2 Kd-Trees; 6.2.1 Data Structure; 6.2.2 Building the Tree; 6.2.3 Searching the Tree; 6.3 Class Cover Catch Digraphs; 6.4 Cluster Catch Digraphs; 6.5 Voronoi Regions and Delaunay Triangularizations; 6.6 Further Reading; References; Author Index; Subject Index
Record no. | UNINA-9910830590103321
Available at: Univ. Federico II
Random graphs for statistical pattern recognition / David J. Marchette
Author | Marchette, David J.
Publication/distribution | Hoboken, N.J. : Wiley-Interscience, c2004
Physical description | 1 online resource (261 p.)
Dewey class | 511/.5
Series | Wiley series in probability and statistics
Subjects | Random graphs; Pattern perception - Statistical methods; Pattern recognition systems
ISBN | 1-280-27535-9; 9786610275359; 0-470-34946-8; 0-471-72208-1; 0-471-72209-X
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Random Graphs for Statistical Pattern Recognition; Contents; Preface; Acknowledgments; 1 Preliminaries; 1.1 Graphs and Digraphs; 1.1.1 Graphs; 1.1.2 Digraphs; 1.1.3 Random Graphs; 1.2 Statistical Pattern Recognition; 1.2.1 Classification; 1.2.2 Curse of Dimensionality; 1.2.3 Clustering; 1.3 Statistical Issues; 1.4 Applications; 1.4.1 Artificial Nose; 1.4.2 Hyperspectral Image; 1.4.3 Gene Expression; 1.5 Further Reading; 2 Computational Geometry; 2.1 Introduction; 2.2 Voronoi Cells and Delaunay Triangularization; 2.2.1 Poisson Voronoi Cells; 2.3 Alpha Hulls; 2.4 Minimum Spanning Trees; 2.4.1 Alpha Hulls and the MST; 2.4.2 Clustering; 2.4.3 Classification Complexity; 2.4.4 Application: Renyi Divergence; 2.4.5 Application: Image Segmentation; 2.5 Further Reading; 3 Neighborhood Graphs; 3.1 Introduction; 3.1.1 Application: Image Processing; 3.2 Nearest-Neighbor Graphs; 3.3 k-Nearest-Neighbor Graphs; 3.3.1 Application: Measures of Association; 3.3.2 Application: Artificial Nose; 3.3.3 Application: Outlier Detection; 3.3.4 Application: Dimensionality Reduction; 3.4 Relative Neighborhood Graphs; 3.5 Gabriel Graphs; 3.5.1 Gabriel Graphs and Alpha Hulls; 3.5.2 Application: Nearest-Neighbor Prototypes; 3.6 Sphere-of-Influence Graphs; 3.7 Sphere-of-Attraction Graphs; 3.8 Other Relatives; 3.9 Asymptotics; 3.10 Further Reading; 4 Class Cover Catch Digraphs; 4.1 Catch Digraphs; 4.1.1 Sphere Digraphs; 4.2 Class Covers; 4.2.1 Basic Definitions; 4.3 Dominating Sets; 4.4 Distributional Results for Cn,m-graphs; 4.4.1 Univariate Case; 4.4.2 Multivariate CCCDs; 4.5 Characterizations; 4.6 Scale Dimension; 4.6.1 Application: Latent Class Discovery; 4.7 (a,b) Graphs; 4.8 CCCD Classification; 4.9 Homogeneous CCCDs; 4.10 Vector Quantization; 4.11 Random Walk Version; 4.11.1 Application: Face Detection; 4.12 Further Reading; 5 Cluster Catch Digraphs; 5.1 Basic Definitions; 5.2 Dominating Sets; 5.3 Connected Components; 5.4 Variable Metric Clustering; 6 Computational Methods; 6.1 Introduction; 6.2 Kd-Trees; 6.2.1 Data Structure; 6.2.2 Building the Tree; 6.2.3 Searching the Tree; 6.3 Class Cover Catch Digraphs; 6.4 Cluster Catch Digraphs; 6.5 Voronoi Regions and Delaunay Triangularizations; 6.6 Further Reading; References; Author Index; Subject Index
Record no. | UNINA-9910877210603321
Available at: Univ. Federico II
Statistical pattern recognition [electronic resource] / Andrew R. Webb, Keith D. Copsey
Author | Webb, A. R. (Andrew R.)
Edition | [3rd ed.]
Publication/distribution | Hoboken : Wiley, 2011
Physical description | 1 online resource (xxiv, 642 pages) : illustrations, tables
Dewey class | 006.4
Other authors (persons) | Copsey, Keith D.
Subjects | Pattern perception - Statistical methods
Genre/form | Electronic books
ISBN | 1-119-96140-8; 1-283-28311-5; 9786613283115; 1-118-30535-3; 1-119-95295-6; 1-119-95296-4
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Statistical Pattern Recognition; Contents; Preface; Notation; 1 Introduction to Statistical Pattern Recognition; 1.1 Statistical Pattern Recognition; 1.1.1 Introduction; 1.1.2 The Basic Model; 1.2 Stages in a Pattern Recognition Problem; 1.3 Issues; 1.4 Approaches to Statistical Pattern Recognition; 1.5 Elementary Decision Theory; 1.5.1 Bayes' Decision Rule for Minimum Error; 1.5.2 Bayes' Decision Rule for Minimum Error - Reject Option; 1.5.3 Bayes' Decision Rule for Minimum Risk; 1.5.4 Bayes' Decision Rule for Minimum Risk - Reject Option; 1.5.5 Neyman-Pearson Decision Rule; 1.5.6 Minimax Criterion; 1.5.7 Discussion; 1.6 Discriminant Functions; 1.6.1 Introduction; 1.6.2 Linear Discriminant Functions; 1.6.3 Piecewise Linear Discriminant Functions; 1.6.4 Generalised Linear Discriminant Function; 1.6.5 Summary; 1.7 Multiple Regression; 1.8 Outline of Book; 1.9 Notes and References; Exercises; 2 Density Estimation - Parametric; 2.1 Introduction; 2.2 Estimating the Parameters of the Distributions; 2.2.1 Estimative Approach; 2.2.2 Predictive Approach; 2.3 The Gaussian Classifier; 2.3.1 Specification; 2.3.2 Derivation of the Gaussian Classifier Plug-In Estimates; 2.3.3 Example Application Study; 2.4 Dealing with Singularities in the Gaussian Classifier; 2.4.1 Introduction; 2.4.2 Naïve Bayes; 2.4.3 Projection onto a Subspace; 2.4.4 Linear Discriminant Function; 2.4.5 Regularised Discriminant Analysis; 2.4.6 Example Application Study; 2.4.7 Further Developments; 2.4.8 Summary; 2.5 Finite Mixture Models; 2.5.1 Introduction; 2.5.2 Mixture Models for Discrimination; 2.5.3 Parameter Estimation for Normal Mixture Models; 2.5.4 Normal Mixture Model Covariance Matrix Constraints; 2.5.5 How Many Components?; 2.5.6 Maximum Likelihood Estimation via EM; 2.5.7 Example Application Study; 2.5.8 Further Developments; 2.5.9 Summary; 2.6 Application Studies; 2.7 Summary and Discussion; 2.8 Recommendations; 2.9 Notes and References; Exercises; 3 Density Estimation - Bayesian; 3.1 Introduction; 3.1.1 Basics; 3.1.2 Recursive Calculation; 3.1.3 Proportionality; 3.2 Analytic Solutions; 3.2.1 Conjugate Priors; 3.2.2 Estimating the Mean of a Normal Distribution with Known Variance; 3.2.3 Estimating the Mean and the Covariance Matrix of a Multivariate Normal Distribution; 3.2.4 Unknown Prior Class Probabilities; 3.2.5 Summary; 3.3 Bayesian Sampling Schemes; 3.3.1 Introduction; 3.3.2 Summarisation; 3.3.3 Sampling Version of the Bayesian Classifier; 3.3.4 Rejection Sampling; 3.3.5 Ratio of Uniforms; 3.3.6 Importance Sampling; 3.4 Markov Chain Monte Carlo Methods; 3.4.1 Introduction; 3.4.2 The Gibbs Sampler; 3.4.3 Metropolis-Hastings Algorithm; 3.4.4 Data Augmentation; 3.4.5 Reversible Jump Markov Chain Monte Carlo; 3.4.6 Slice Sampling; 3.4.7 MCMC Example - Estimation of Noisy Sinusoids; 3.4.8 Summary; 3.4.9 Notes and References; 3.5 Bayesian Approaches to Discrimination; 3.5.1 Labelled Training Data; 3.5.2 Unlabelled Training Data; 3.6 Sequential Monte Carlo Samplers
Record no. | UNISA-996204067403316
Available at: Univ. di Salerno
Statistical pattern recognition [electronic resource] / Andrew R. Webb, Keith D. Copsey
Author | Webb, A. R. (Andrew R.)
Edition | [3rd ed.]
Publication/distribution | Hoboken : Wiley, 2011
Physical description | 1 online resource (xxiv, 642 pages) : illustrations, tables
Dewey class | 006.4
Other authors (persons) | Copsey, Keith D.
Subjects | Pattern perception - Statistical methods
ISBN | 1-119-96140-8; 1-283-28311-5; 9786613283115; 1-118-30535-3; 1-119-95295-6; 1-119-95296-4
Classification | MAT029000
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Statistical Pattern Recognition; Contents; Preface; Notation; 1 Introduction to Statistical Pattern Recognition; 1.1 Statistical Pattern Recognition; 1.1.1 Introduction; 1.1.2 The Basic Model; 1.2 Stages in a Pattern Recognition Problem; 1.3 Issues; 1.4 Approaches to Statistical Pattern Recognition; 1.5 Elementary Decision Theory; 1.5.1 Bayes' Decision Rule for Minimum Error; 1.5.2 Bayes' Decision Rule for Minimum Error - Reject Option; 1.5.3 Bayes' Decision Rule for Minimum Risk; 1.5.4 Bayes' Decision Rule for Minimum Risk - Reject Option; 1.5.5 Neyman-Pearson Decision Rule; 1.5.6 Minimax Criterion; 1.5.7 Discussion; 1.6 Discriminant Functions; 1.6.1 Introduction; 1.6.2 Linear Discriminant Functions; 1.6.3 Piecewise Linear Discriminant Functions; 1.6.4 Generalised Linear Discriminant Function; 1.6.5 Summary; 1.7 Multiple Regression; 1.8 Outline of Book; 1.9 Notes and References; Exercises; 2 Density Estimation - Parametric; 2.1 Introduction; 2.2 Estimating the Parameters of the Distributions; 2.2.1 Estimative Approach; 2.2.2 Predictive Approach; 2.3 The Gaussian Classifier; 2.3.1 Specification; 2.3.2 Derivation of the Gaussian Classifier Plug-In Estimates; 2.3.3 Example Application Study; 2.4 Dealing with Singularities in the Gaussian Classifier; 2.4.1 Introduction; 2.4.2 Naïve Bayes; 2.4.3 Projection onto a Subspace; 2.4.4 Linear Discriminant Function; 2.4.5 Regularised Discriminant Analysis; 2.4.6 Example Application Study; 2.4.7 Further Developments; 2.4.8 Summary; 2.5 Finite Mixture Models; 2.5.1 Introduction; 2.5.2 Mixture Models for Discrimination; 2.5.3 Parameter Estimation for Normal Mixture Models; 2.5.4 Normal Mixture Model Covariance Matrix Constraints; 2.5.5 How Many Components?; 2.5.6 Maximum Likelihood Estimation via EM; 2.5.7 Example Application Study; 2.5.8 Further Developments; 2.5.9 Summary; 2.6 Application Studies; 2.7 Summary and Discussion; 2.8 Recommendations; 2.9 Notes and References; Exercises; 3 Density Estimation - Bayesian; 3.1 Introduction; 3.1.1 Basics; 3.1.2 Recursive Calculation; 3.1.3 Proportionality; 3.2 Analytic Solutions; 3.2.1 Conjugate Priors; 3.2.2 Estimating the Mean of a Normal Distribution with Known Variance; 3.2.3 Estimating the Mean and the Covariance Matrix of a Multivariate Normal Distribution; 3.2.4 Unknown Prior Class Probabilities; 3.2.5 Summary; 3.3 Bayesian Sampling Schemes; 3.3.1 Introduction; 3.3.2 Summarisation; 3.3.3 Sampling Version of the Bayesian Classifier; 3.3.4 Rejection Sampling; 3.3.5 Ratio of Uniforms; 3.3.6 Importance Sampling; 3.4 Markov Chain Monte Carlo Methods; 3.4.1 Introduction; 3.4.2 The Gibbs Sampler; 3.4.3 Metropolis-Hastings Algorithm; 3.4.4 Data Augmentation; 3.4.5 Reversible Jump Markov Chain Monte Carlo; 3.4.6 Slice Sampling; 3.4.7 MCMC Example - Estimation of Noisy Sinusoids; 3.4.8 Summary; 3.4.9 Notes and References; 3.5 Bayesian Approaches to Discrimination; 3.5.1 Labelled Training Data; 3.5.2 Unlabelled Training Data; 3.6 Sequential Monte Carlo Samplers
Record no. | UNINA-9910139578003321
Available at: Univ. Federico II
Statistical pattern recognition / Andrew R. Webb, Keith D. Copsey
Author | Webb, A. R. (Andrew R.)
Edition | [3rd ed.]
Publication/distribution | Hoboken : Wiley, 2011
Physical description | 1 online resource (xxiv, 642 pages) : illustrations, tables
Dewey class | 006.4
Other authors (persons) | Copsey, Keith D.
Subjects | Pattern perception - Statistical methods
ISBN | 1-119-96140-8; 1-283-28311-5; 9786613283115; 1-118-30535-3; 1-119-95295-6; 1-119-95296-4
Classification | MAT029000
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Statistical Pattern Recognition; Contents; Preface; Notation; 1 Introduction to Statistical Pattern Recognition; 1.1 Statistical Pattern Recognition; 1.1.1 Introduction; 1.1.2 The Basic Model; 1.2 Stages in a Pattern Recognition Problem; 1.3 Issues; 1.4 Approaches to Statistical Pattern Recognition; 1.5 Elementary Decision Theory; 1.5.1 Bayes' Decision Rule for Minimum Error; 1.5.2 Bayes' Decision Rule for Minimum Error - Reject Option; 1.5.3 Bayes' Decision Rule for Minimum Risk; 1.5.4 Bayes' Decision Rule for Minimum Risk - Reject Option; 1.5.5 Neyman-Pearson Decision Rule; 1.5.6 Minimax Criterion; 1.5.7 Discussion; 1.6 Discriminant Functions; 1.6.1 Introduction; 1.6.2 Linear Discriminant Functions; 1.6.3 Piecewise Linear Discriminant Functions; 1.6.4 Generalised Linear Discriminant Function; 1.6.5 Summary; 1.7 Multiple Regression; 1.8 Outline of Book; 1.9 Notes and References; Exercises; 2 Density Estimation - Parametric; 2.1 Introduction; 2.2 Estimating the Parameters of the Distributions; 2.2.1 Estimative Approach; 2.2.2 Predictive Approach; 2.3 The Gaussian Classifier; 2.3.1 Specification; 2.3.2 Derivation of the Gaussian Classifier Plug-In Estimates; 2.3.3 Example Application Study; 2.4 Dealing with Singularities in the Gaussian Classifier; 2.4.1 Introduction; 2.4.2 Naïve Bayes; 2.4.3 Projection onto a Subspace; 2.4.4 Linear Discriminant Function; 2.4.5 Regularised Discriminant Analysis; 2.4.6 Example Application Study; 2.4.7 Further Developments; 2.4.8 Summary; 2.5 Finite Mixture Models; 2.5.1 Introduction; 2.5.2 Mixture Models for Discrimination; 2.5.3 Parameter Estimation for Normal Mixture Models; 2.5.4 Normal Mixture Model Covariance Matrix Constraints; 2.5.5 How Many Components?; 2.5.6 Maximum Likelihood Estimation via EM; 2.5.7 Example Application Study; 2.5.8 Further Developments; 2.5.9 Summary; 2.6 Application Studies; 2.7 Summary and Discussion; 2.8 Recommendations; 2.9 Notes and References; Exercises; 3 Density Estimation - Bayesian; 3.1 Introduction; 3.1.1 Basics; 3.1.2 Recursive Calculation; 3.1.3 Proportionality; 3.2 Analytic Solutions; 3.2.1 Conjugate Priors; 3.2.2 Estimating the Mean of a Normal Distribution with Known Variance; 3.2.3 Estimating the Mean and the Covariance Matrix of a Multivariate Normal Distribution; 3.2.4 Unknown Prior Class Probabilities; 3.2.5 Summary; 3.3 Bayesian Sampling Schemes; 3.3.1 Introduction; 3.3.2 Summarisation; 3.3.3 Sampling Version of the Bayesian Classifier; 3.3.4 Rejection Sampling; 3.3.5 Ratio of Uniforms; 3.3.6 Importance Sampling; 3.4 Markov Chain Monte Carlo Methods; 3.4.1 Introduction; 3.4.2 The Gibbs Sampler; 3.4.3 Metropolis-Hastings Algorithm; 3.4.4 Data Augmentation; 3.4.5 Reversible Jump Markov Chain Monte Carlo; 3.4.6 Slice Sampling; 3.4.7 MCMC Example - Estimation of Noisy Sinusoids; 3.4.8 Summary; 3.4.9 Notes and References; 3.5 Bayesian Approaches to Discrimination; 3.5.1 Labelled Training Data; 3.5.2 Unlabelled Training Data; 3.6 Sequential Monte Carlo Samplers
Record no. | UNINA-9910813232703321
Available at: Univ. Federico II
Statistical pattern recognition [electronic resource] / Andrew R. Webb
Author | Webb, Andrew R. (Andrew Roy)
Edition | [2nd ed.]
Publication/distribution | West Sussex, England ; New Jersey : Wiley, c2002
Physical description | 1 online resource (516 p.)
Dewey class | 006.4
Subjects | Pattern perception - Statistical methods; Mathematical statistics
ISBN | 0-470-33906-3; 0-470-85477-4; 9786610270101; 1-280-27010-1; 0-470-85478-2
Format | Printed material
Bibliographic level | Monograph
Publication language | eng
Contents note |
Statistical Pattern Recognition; Contents; Preface; Notation; 1 Introduction to statistical pattern recognition; 1.1 Statistical pattern recognition; 1.1.1 Introduction; 1.1.2 The basic model; 1.2 Stages in a pattern recognition problem; 1.3 Issues; 1.4 Supervised versus unsupervised; 1.5 Approaches to statistical pattern recognition; 1.5.1 Elementary decision theory; 1.5.2 Discriminant functions; 1.6 Multiple regression; 1.7 Outline of book; 1.8 Notes and references; Exercises; 2 Density estimation - parametric; 2.1 Introduction; 2.2 Normal-based models; 2.2.1 Linear and quadratic discriminant functions; 2.2.2 Regularised discriminant analysis; 2.2.3 Example application study; 2.2.4 Further developments; 2.2.5 Summary; 2.3 Normal mixture models; 2.3.1 Maximum likelihood estimation via EM; 2.3.2 Mixture models for discrimination; 2.3.3 How many components?; 2.3.4 Example application study; 2.3.5 Further developments; 2.3.6 Summary; 2.4 Bayesian estimates; 2.4.1 Bayesian learning methods; 2.4.2 Markov chain Monte Carlo; 2.4.3 Bayesian approaches to discrimination; 2.4.4 Example application study; 2.4.5 Further developments; 2.4.6 Summary; 2.5 Application studies; 2.6 Summary and discussion; 2.7 Recommendations; 2.8 Notes and references; Exercises; 3 Density estimation - nonparametric; 3.1 Introduction; 3.2 Histogram method; 3.2.1 Data-adaptive histograms; 3.2.2 Independence assumption; 3.2.3 Lancaster models; 3.2.4 Maximum weight dependence trees; 3.2.5 Bayesian networks; 3.2.6 Example application study; 3.2.7 Further developments; 3.2.8 Summary; 3.3 k-nearest-neighbour method; 3.3.1 k-nearest-neighbour decision rule; 3.3.2 Properties of the nearest-neighbour rule; 3.3.3 Algorithms; 3.3.4 Editing techniques; 3.3.5 Choice of distance metric; 3.3.6 Example application study; 3.3.7 Further developments; 3.3.8 Summary; 3.4 Expansion by basis functions; 3.5 Kernel methods; 3.5.1 Choice of smoothing parameter; 3.5.2 Choice of kernel; 3.5.3 Example application study; 3.5.4 Further developments; 3.5.5 Summary; 3.6 Application studies; 3.7 Summary and discussion; 3.8 Recommendations; 3.9 Notes and references; Exercises; 4 Linear discriminant analysis; 4.1 Introduction; 4.2 Two-class algorithms; 4.2.1 General ideas; 4.2.2 Perceptron criterion; 4.2.3 Fisher's criterion; 4.2.4 Least mean squared error procedures; 4.2.5 Support vector machines; 4.2.6 Example application study; 4.2.7 Further developments; 4.2.8 Summary; 4.3 Multiclass algorithms; 4.3.1 General ideas; 4.3.2 Error-correction procedure; 4.3.3 Fisher's criterion - linear discriminant analysis; 4.3.4 Least mean squared error procedures; 4.3.5 Optimal scaling; 4.3.6 Regularisation; 4.3.7 Multiclass support vector machines; 4.3.8 Example application study; 4.3.9 Further developments; 4.3.10 Summary; 4.4 Logistic discrimination; 4.4.1 Two-group case; 4.4.2 Maximum likelihood estimation; 4.4.3 Multiclass logistic discrimination
Record no. | UNISA-996212456103316
Available at: Univ. di Salerno