Correspondence analysis : theory, practice and new strategies / Eric Beh, Rosaria Lombardo
Author | Beh Eric J.
Publication/distribution | Chichester, England : Wiley, 2014
Physical description | 1 online resource (593 p.)
Discipline | 519.5/37
Series | Wiley Series in Probability and Statistics
Topical subject | Correspondence analysis (Statistics)
ISBN | 1-118-76287-8 ; 1-118-76289-4
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Correspondence Analysis: Theory, Practice and New Strategies; Contents; Foreword; Preface; Part One: Introduction; 1 Data Visualisation; 1.1 A Very Brief Introduction to Data Visualisation; 1.1.1 A Very Brief History; 1.1.2 Introduction to Visualisation Tools for Numerical Data; 1.1.3 Introduction to Visualisation Tools for Univariate Categorical Data; 1.2 Data Visualisation for Contingency Tables; 1.2.1 Fourfold Displays; 1.3 Other Plots; 1.4 Studying Exposure to Asbestos; 1.4.1 Asbestos and Irving J. Selikoff; 1.4.2 Selikoff's Data; 1.4.3 Numerical Analysis of Selikoff's Data
1.4.4 A Graphical Analysis of Selikoff's Data; 1.4.5 Classical Correspondence Analysis of Selikoff's Data; 1.4.6 Other Methods of Graphical Analysis; 1.5 Happiness Data; 1.6 Correspondence Analysis Now; 1.6.1 A Bibliographic Taste; 1.6.2 The Increasing Popularity of Correspondence Analysis; 1.6.3 The Growth of the Correspondence Analysis Family Tree; 1.7 Overview of the Book; 1.8 R Code; References; 2 Pearson's Chi-Squared Statistic; 2.1 Introduction; 2.2 Pearson's Chi-Squared Statistic; 2.2.1 Notation; 2.2.2 Measuring the Departure from Independence; 2.2.3 Pearson's Chi-Squared Statistic; 2.6.3 The Cressie--Read Statistic; References; Part Two: Correspondence Analysis of Two-Way Contingency Tables; 3 Methods of Decomposition; 3.1 Introduction; 3.2 Reducing Multidimensional Space; 3.3 Profiles and Cloud of Points; 3.4 Property of Distributional Equivalence; 3.5 The Triplet and Classical Reciprocal Averaging; 3.5.1 One-Dimensional Reciprocal Averaging; 3.5.2 Matrix Form of One-Dimensional Reciprocal Averaging; 3.5.3 M-Dimensional Reciprocal Averaging; 3.5.4 Some Historical Comments; 3.6 Solving the Triplet Using Eigen-Decomposition; 3.6.1 The Decomposition; 3.6.2 Example; 3.7 Solving the Triplet Using Singular Value Decomposition; 3.7.1 The Standard Decomposition; 3.7.2 The Generalised Decomposition; 3.8 The Generalised Triplet and Reciprocal Averaging; 3.9 Solving the Generalised Triplet Using Gram--Schmidt Process; 3.9.1 Ordered Categorical Variables and a priori Scores; 3.9.2 On Finding Orthogonalised Vectors; 3.9.3 A Recurrence Formulae Approach; 3.9.4 Changing the Basis Vector; 3.9.5 Generalised Correlations; 3.10 Bivariate Moment Decomposition; 3.11 Hybrid Decomposition; 3.11.1 An Alternative Singly Ordered Approach; 3.12 R Code; 3.12.1 Eigen-Decomposition in R
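The contents above name the core computation of Chapter 3: solving the triplet by eigen-decomposition or singular value decomposition of the standardized residuals of a contingency table. The following is only an illustrative sketch of that decomposition, not the book's own R code; the contingency table values are invented for demonstration.

import numpy as np

# Illustrative sketch only: classical correspondence analysis via the SVD of
# standardized residuals (cf. Chapter 3). The contingency table is invented.
N = np.array([[30.0, 10.0,  5.0],
              [15.0, 25.0, 10.0],
              [ 5.0, 15.0, 35.0]])

P = N / N.sum()                        # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)    # row and column masses
Dr = np.diag(1.0 / np.sqrt(r))
Dc = np.diag(1.0 / np.sqrt(c))

S = Dr @ (P - np.outer(r, c)) @ Dc     # matrix of standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

F = Dr @ U * sv                        # principal coordinates of the rows
G = Dc @ Vt.T * sv                     # principal coordinates of the columns
total_inertia = (sv ** 2).sum()        # equals Pearson's X^2 divided by n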
Record No. | UNINA-9910132173303321
Held at: Univ. Federico II
Correspondence analysis : theory, practice and new strategies / Eric Beh, Rosaria Lombardo
Author | Beh Eric J.
Publication/distribution | Chichester, England : Wiley, 2014
Physical description | 1 online resource (593 p.)
Discipline | 519.5/37
Series | Wiley Series in Probability and Statistics
Topical subject | Correspondence analysis (Statistics)
ISBN | 1-118-76287-8 ; 1-118-76289-4
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
Correspondence Analysis: Theory, Practice and New Strategies; Contents; Foreword; Preface; Part One: Introduction; 1 Data Visualisation; 1.1 A Very Brief Introduction to Data Visualisation; 1.1.1 A Very Brief History; 1.1.2 Introduction to Visualisation Tools for Numerical Data; 1.1.3 Introduction to Visualisation Tools for Univariate Categorical Data; 1.2 Data Visualisation for Contingency Tables; 1.2.1 Fourfold Displays; 1.3 Other Plots; 1.4 Studying Exposure to Asbestos; 1.4.1 Asbestos and Irving J. Selikoff; 1.4.2 Selikoff's Data; 1.4.3 Numerical Analysis of Selikoff's Data
1.4.4 A Graphical Analysis of Selikoff's Data; 1.4.5 Classical Correspondence Analysis of Selikoff's Data; 1.4.6 Other Methods of Graphical Analysis; 1.5 Happiness Data; 1.6 Correspondence Analysis Now; 1.6.1 A Bibliographic Taste; 1.6.2 The Increasing Popularity of Correspondence Analysis; 1.6.3 The Growth of the Correspondence Analysis Family Tree; 1.7 Overview of the Book; 1.8 R Code; References; 2 Pearson's Chi-Squared Statistic; 2.1 Introduction; 2.2 Pearson's Chi-Squared Statistic; 2.2.1 Notation; 2.2.2 Measuring the Departure from Independence; 2.2.3 Pearson's Chi-Squared Statistic; 2.6.3 The Cressie--Read Statistic; References; Part Two: Correspondence Analysis of Two-Way Contingency Tables; 3 Methods of Decomposition; 3.1 Introduction; 3.2 Reducing Multidimensional Space; 3.3 Profiles and Cloud of Points; 3.4 Property of Distributional Equivalence; 3.5 The Triplet and Classical Reciprocal Averaging; 3.5.1 One-Dimensional Reciprocal Averaging; 3.5.2 Matrix Form of One-Dimensional Reciprocal Averaging; 3.5.3 M-Dimensional Reciprocal Averaging; 3.5.4 Some Historical Comments; 3.6 Solving the Triplet Using Eigen-Decomposition; 3.6.1 The Decomposition; 3.6.2 Example; 3.7 Solving the Triplet Using Singular Value Decomposition; 3.7.1 The Standard Decomposition; 3.7.2 The Generalised Decomposition; 3.8 The Generalised Triplet and Reciprocal Averaging; 3.9 Solving the Generalised Triplet Using Gram--Schmidt Process; 3.9.1 Ordered Categorical Variables and a priori Scores; 3.9.2 On Finding Orthogonalised Vectors; 3.9.3 A Recurrence Formulae Approach; 3.9.4 Changing the Basis Vector; 3.9.5 Generalised Correlations; 3.10 Bivariate Moment Decomposition; 3.11 Hybrid Decomposition; 3.11.1 An Alternative Singly Ordered Approach; 3.12 R Code; 3.12.1 Eigen-Decomposition in R
Record No. | UNINA-9910822547203321
Held at: Univ. Federico II
High-dimensional covariance estimation [electronic resource] / Mohsen Pourahmadi
Author | Pourahmadi Mohsen
Publication/distribution | Hoboken, NJ : Wiley, c2013
Physical description | 1 online resource (208 p.)
Discipline | 519.5/38
Series | Wiley series in probability and statistics
Topical subject | Analysis of covariance ; Multivariate analysis
ISBN | 1-118-57366-8 ; 1-118-57361-7 ; 1-118-57365-X
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
HIGH-DIMENSIONAL COVARIANCE ESTIMATION; CONTENTS; PREFACE; I MOTIVATION AND THE BASICS; 1 INTRODUCTION; 1.1 Least Squares and Regularized Regression; 1.2 Lasso: Survival of the Bigger; 1.3 Thresholding the Sample Covariance Matrix; 1.4 Sparse PCA and Regression; 1.5 Graphical Models: Nodewise Regression; 1.6 Cholesky Decomposition and Regression; 1.7 The Bigger Picture: Latent Factor Models; 1.8 Further Reading; 2 DATA, SPARSITY, AND REGULARIZATION; 2.1 Data Matrix: Examples; 2.2 Shrinking the Sample Covariance Matrix; 2.3 Distribution of the Sample Eigenvalues
2.4 Regularizing Covariances Like a Mean; 2.5 The Lasso Regression; 2.6 Lasso: Variable Selection and Prediction; 2.7 Lasso: Degrees of Freedom and BIC; 2.8 Some Alternatives to the Lasso Penalty; 3 COVARIANCE MATRICES; 3.1 Definition and Basic Properties; 3.2 The Spectral Decomposition; 3.3 Structured Covariance Matrices; 3.4 Functions of a Covariance Matrix; 3.5 PCA: The Maximum Variance Property; 3.6 Modified Cholesky Decomposition; 3.7 Latent Factor Models; 3.8 GLM for Covariance Matrices; 3.9 GLM via the Cholesky Decomposition; 3.10 GLM for Incomplete Longitudinal Data; 3.10.1 The Incoherency Problem in Incomplete Longitudinal Data; 3.10.2 The Incomplete Data and The EM Algorithm; 3.11 A Data Example: Fruit Fly Mortality Rate; 3.12 Simulating Random Correlation Matrices; 3.13 Bayesian Analysis of Covariance Matrices; II COVARIANCE ESTIMATION: REGULARIZATION; 4 REGULARIZING THE EIGENSTRUCTURE; 4.1 Shrinking the Eigenvalues; 4.2 Regularizing The Eigenvectors; 4.3 A Duality between PCA and SVD; 4.4 Implementing Sparse PCA: A Data Example; 4.5 Sparse Singular Value Decomposition (SSVD); 4.6 Consistency of PCA; 4.7 Principal Subspace Estimation; 4.8 Further Reading; 5 SPARSE GAUSSIAN GRAPHICAL MODELS; 5.1 Covariance Selection Models: Two Examples; 5.2 Regression Interpretation of Entries of Σ⁻¹; 5.3 Penalized Likelihood and Graphical Lasso; 5.4 Penalized Quasi-Likelihood Formulation; 5.5 Penalizing the Cholesky Factor; 5.6 Consistency and Sparsistency; 5.7 Joint Graphical Models; 5.8 Further Reading; 6 BANDING, TAPERING, AND THRESHOLDING; 6.1 Banding the Sample Covariance Matrix; 6.2 Tapering the Sample Covariance Matrix; 6.3 Thresholding the Sample Covariance Matrix; 6.4 Low-Rank Plus Sparse Covariance Matrices; 6.5 Further Reading; 7 MULTIVARIATE REGRESSION: ACCOUNTING FOR CORRELATION; 7.1 Multivariate Regression and LS Estimators; 7.2 Reduced Rank Regressions (RRR); 7.3 Regularized Estimation of B; 7.4 Joint Regularization of (B, Ω); 7.5 Implementing MRCE: Data Examples; 7.5.1 Intraday Electricity Prices; 7.5.2 Predicting Asset Returns; 7.6 Further Reading; BIBLIOGRAPHY; INDEX; WILEY SERIES IN PROBABILITY AND STATISTICS
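Sections 1.3 and 6.3 of the contents concern thresholding the sample covariance matrix. The sketch below is only a hedged illustration of that idea, not code or data from the book: it hard-thresholds the entries of a sample covariance matrix computed from made-up data, with an arbitrary threshold that would in practice be tuned (for example by cross-validation).

import numpy as np

# Illustrative sketch only: hard-thresholding the sample covariance matrix
# (cf. Sections 1.3 and 6.3). Data and threshold are invented.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))               # n = 50 observations, p = 20 variables

S = np.cov(X, rowvar=False)                     # p x p sample covariance matrix
lam = 0.2                                       # threshold (would be tuned in practice)

S_thresh = np.where(np.abs(S) >= lam, S, 0.0)   # zero out small entries
np.fill_diagonal(S_thresh, np.diag(S))          # always keep the variances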
Record No. | UNINA-9910141572603321
Held at: Univ. Federico II
High-dimensional covariance estimation / Mohsen Pourahmadi
Author | Pourahmadi Mohsen
Edition | [1st ed.]
Publication/distribution | Hoboken, NJ : Wiley, c2013
Physical description | 1 online resource (208 p.)
Discipline | 519.5/38
Series | Wiley series in probability and statistics
Topical subject | Analysis of covariance ; Multivariate analysis
ISBN | 1-118-57366-8 ; 1-118-57361-7 ; 1-118-57365-X
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note |
HIGH-DIMENSIONAL COVARIANCE ESTIMATION; CONTENTS; PREFACE; I MOTIVATION AND THE BASICS; 1 INTRODUCTION; 1.1 Least Squares and Regularized Regression; 1.2 Lasso: Survival of the Bigger; 1.3 Thresholding the Sample Covariance Matrix; 1.4 Sparse PCA and Regression; 1.5 Graphical Models: Nodewise Regression; 1.6 Cholesky Decomposition and Regression; 1.7 The Bigger Picture: Latent Factor Models; 1.8 Further Reading; 2 DATA, SPARSITY, AND REGULARIZATION; 2.1 Data Matrix: Examples; 2.2 Shrinking the Sample Covariance Matrix; 2.3 Distribution of the Sample Eigenvalues
2.4 Regularizing Covariances Like a Mean; 2.5 The Lasso Regression; 2.6 Lasso: Variable Selection and Prediction; 2.7 Lasso: Degrees of Freedom and BIC; 2.8 Some Alternatives to the Lasso Penalty; 3 COVARIANCE MATRICES; 3.1 Definition and Basic Properties; 3.2 The Spectral Decomposition; 3.3 Structured Covariance Matrices; 3.4 Functions of a Covariance Matrix; 3.5 PCA: The Maximum Variance Property; 3.6 Modified Cholesky Decomposition; 3.7 Latent Factor Models; 3.8 GLM for Covariance Matrices; 3.9 GLM via the Cholesky Decomposition; 3.10 GLM for Incomplete Longitudinal Data; 3.10.1 The Incoherency Problem in Incomplete Longitudinal Data; 3.10.2 The Incomplete Data and The EM Algorithm; 3.11 A Data Example: Fruit Fly Mortality Rate; 3.12 Simulating Random Correlation Matrices; 3.13 Bayesian Analysis of Covariance Matrices; II COVARIANCE ESTIMATION: REGULARIZATION; 4 REGULARIZING THE EIGENSTRUCTURE; 4.1 Shrinking the Eigenvalues; 4.2 Regularizing The Eigenvectors; 4.3 A Duality between PCA and SVD; 4.4 Implementing Sparse PCA: A Data Example; 4.5 Sparse Singular Value Decomposition (SSVD); 4.6 Consistency of PCA; 4.7 Principal Subspace Estimation; 4.8 Further Reading; 5 SPARSE GAUSSIAN GRAPHICAL MODELS; 5.1 Covariance Selection Models: Two Examples; 5.2 Regression Interpretation of Entries of Σ⁻¹; 5.3 Penalized Likelihood and Graphical Lasso; 5.4 Penalized Quasi-Likelihood Formulation; 5.5 Penalizing the Cholesky Factor; 5.6 Consistency and Sparsistency; 5.7 Joint Graphical Models; 5.8 Further Reading; 6 BANDING, TAPERING, AND THRESHOLDING; 6.1 Banding the Sample Covariance Matrix; 6.2 Tapering the Sample Covariance Matrix; 6.3 Thresholding the Sample Covariance Matrix; 6.4 Low-Rank Plus Sparse Covariance Matrices; 6.5 Further Reading; 7 MULTIVARIATE REGRESSION: ACCOUNTING FOR CORRELATION; 7.1 Multivariate Regression and LS Estimators; 7.2 Reduced Rank Regressions (RRR); 7.3 Regularized Estimation of B; 7.4 Joint Regularization of (B, Ω); 7.5 Implementing MRCE: Data Examples; 7.5.1 Intraday Electricity Prices; 7.5.2 Predicting Asset Returns; 7.6 Further Reading; BIBLIOGRAPHY; INDEX; WILEY SERIES IN PROBABILITY AND STATISTICS
Record No. | UNINA-9910828712603321
Held at: Univ. Federico II
Methods of multivariate analysis [electronic resource] / Alvin C. Rencher, William F. Christensen
Author | Rencher Alvin C. <1934->
Edition | [3rd ed.]
Publication/distribution | Hoboken, N.J. : Wiley, c2012
Physical description | 1 online resource (xxv, 758 p.) : ill.
Discipline | 519.5/35
Other authors (Persons) | Christensen William F. <1970->
Series | Wiley series in probability and statistics
Topical subject | Multivariate analysis ; Anàlisi multivariable
Genre/form subject | Llibres electrònics
ISBN | 1-118-39167-5 ; 1-282-24188-5 ; 9786613813008 ; 1-118-39168-3 ; 1-118-30458-6 ; 1-118-39165-9
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | 1. Introduction; 2. Matrix Algebra; 3. Characterizing and Displaying Multivariate Data; 4. The Multivariate Normal Distribution; 5. Tests on One or Two Mean Vectors; 6. Multivariate Analysis of Variance; 7. Tests on Covariance Matrices; 8. Discriminant Analysis: Description of Group Separation; 9. Classification Analysis: Allocation of Observations to Groups; 10. Multivariate Regression; 11. Canonical Correlation; 12. Principal Component Analysis; 13. Exploratory Factor Analysis; 14. Confirmatory Factor Analysis; 15. Cluster Analysis; 16. Graphical Procedures; Appendix A: Tables; Appendix B: Answers and Hints to Problems; Appendix C: Data Sets and SAS Files; References; Index.
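Chapter 5 of the contents covers tests on one or two mean vectors. As a rough illustration of the one-sample case, and not the book's own code or data, the sketch below computes Hotelling's T² statistic and its exact F approximation for invented data.

import numpy as np
from scipy import stats

# Illustrative sketch only: one-sample Hotelling's T^2 test (cf. Chapter 5).
# The data matrix and hypothesised mean vector are invented.
rng = np.random.default_rng(1)
X = rng.standard_normal((30, 4)) + 0.3      # n = 30 observations on p = 4 variables
mu0 = np.zeros(4)                           # hypothesised mean vector

n, p = X.shape
xbar = X.mean(axis=0)
S = np.cov(X, rowvar=False)                 # sample covariance matrix

d = xbar - mu0
T2 = n * d @ np.linalg.solve(S, d)          # Hotelling's T^2 statistic
F = (n - p) / (p * (n - 1)) * T2            # equivalent F statistic
p_value = stats.f.sf(F, p, n - p)           # upper-tail p-value on (p, n - p) d.f.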
Record No. | UNINA-9910141438803321
Held at: Univ. Federico II
Methods of multivariate analysis / Alvin C. Rencher, William F. Christensen
Author | Rencher Alvin C. <1934->
Edition | [3rd ed.]
Publication/distribution | Hoboken, N.J. : Wiley, c2012
Physical description | 1 online resource (xxv, 758 p.) : ill.
Discipline | 519.5/35
Other authors (Persons) | Christensen William F. <1970->
Series | Wiley series in probability and statistics
Topical subject | Multivariate analysis ; Anàlisi multivariable
Genre/form subject | Llibres electrònics
ISBN | 1-118-39167-5 ; 1-282-24188-5 ; 9786613813008 ; 1-118-39168-3 ; 1-118-30458-6 ; 1-118-39165-9
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | 1. Introduction; 2. Matrix Algebra; 3. Characterizing and Displaying Multivariate Data; 4. The Multivariate Normal Distribution; 5. Tests on One or Two Mean Vectors; 6. Multivariate Analysis of Variance; 7. Tests on Covariance Matrices; 8. Discriminant Analysis: Description of Group Separation; 9. Classification Analysis: Allocation of Observations to Groups; 10. Multivariate Regression; 11. Canonical Correlation; 12. Principal Component Analysis; 13. Exploratory Factor Analysis; 14. Confirmatory Factor Analysis; 15. Cluster Analysis; 16. Graphical Procedures; Appendix A: Tables; Appendix B: Answers and Hints to Problems; Appendix C: Data Sets and SAS Files; References; Index.
Record No. | UNINA-9910822646203321
Held at: Univ. Federico II
Multivariate analysis for the biobehavioral and social sciences / Bruce L. Brown [et al.]
Author | Brown Bruce (Bruce L.)
Publication/distribution | Hoboken, N.J. : Wiley, c2012
Physical description | 1 online resource (xiv, 475 pages) : illustrations
Discipline | 300.1/519535
Topical subject | Social sciences - Statistical methods ; Multivariate analysis
Genre/form subject | Electronic books
ISBN | 1-283-33225-6 ; 9786613332257 ; 1-118-13162-2 ; 1-118-13161-4 ; 1-118-13159-2
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Overview of Multivariate and Regression Methods -- The Seven Habits of Highly Effective Quants: A Review of Elementary Statistics Using Matrix Algebra -- Fundamentals of Matrix Algebra -- Factor Analysis and Related Methods: Quintessentially Multivariate -- Multivariate Graphics -- Canonical Correlation: The Underused Method -- Hotelling's T² as the Simplest Case of Multivariate Inference -- Multivariate Analysis of Variance -- Multiple Regression and the General Linear Model -- Appendices: Statistical Tables -- Name Index -- Subject Index.
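One chapter of the contents is devoted to canonical correlation, "the underused method". The sketch below is only an illustration with invented data, not the book's own example: it computes the canonical correlations as the singular values of S_xx^(-1/2) S_xy S_yy^(-1/2).

import numpy as np

# Illustrative sketch only: canonical correlations between two variable sets
# (cf. "Canonical Correlation: The Underused Method"). The data are invented.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))                     # first set of variables
Y = 0.5 * X[:, :2] + rng.standard_normal((100, 2))    # second set, related to X

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
n = X.shape[0]
Sxx = Xc.T @ Xc / (n - 1)
Syy = Yc.T @ Yc / (n - 1)
Sxy = Xc.T @ Yc / (n - 1)

def inv_sqrt(M):
    # Inverse symmetric square root of a positive-definite matrix.
    w, V = np.linalg.eigh(M)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
canonical_correlations = np.linalg.svd(K, compute_uv=False)   # in [0, 1], descending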
Record No. | UNINA-9910139563903321
Held at: Univ. Federico II
Multivariate analysis for the biobehavioral and social sciences / Bruce L. Brown ... [et al.]
Author | Brown Bruce (Bruce L.)
Edition | [1st ed.]
Publication/distribution | Hoboken, N.J. : Wiley, c2012
Physical description | 1 online resource (xiv, 475 pages) : illustrations
Discipline | 300.1/519535
Topical subject | Social sciences - Statistical methods ; Multivariate analysis
ISBN | 1-283-33225-6 ; 9786613332257 ; 1-118-13162-2 ; 1-118-13161-4 ; 1-118-13159-2
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Overview of Multivariate and Regression Methods -- The Seven Habits of Highly Effective Quants: A Review of Elementary Statistics Using Matrix Algebra -- Fundamentals of Matrix Algebra -- Factor Analysis and Related Methods: Quintessentially Multivariate -- Multivariate Graphics -- Canonical Correlation: The Underused Method -- Hotelling's T² as the Simplest Case of Multivariate Inference -- Multivariate Analysis of Variance -- Multiple Regression and the General Linear Model -- Appendices: Statistical Tables -- Name Index -- Subject Index.
Record No. | UNINA-9910826557703321
Held at: Univ. Federico II
Statistical monitoring of complex multivariate processes [electronic resource] : with applications in industrial process control / Uwe Kruger and Lei Xie
Author | Krüger Uwe, Dr.
Edition | [1st edition]
Publication/distribution | Chichester [England] ; Hoboken, N.J. : Wiley, 2012
Physical description | 1 online resource (472 p.)
Discipline | 519.5/35
Other authors (Persons) | Xie Lei
Series | Statistics in practice
Topical subject | Multivariate analysis
ISBN | 1-283-54977-8 ; 9786613862228 ; 0-470-51725-5 ; 0-470-51724-7 ; 1-118-38126-2
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Machine generated contents note: Preface Introduction I Fundamentals of Multivariate Statistical Process Control 1 Motivation for Multivariate Statistical Process Control 1.1 Summary of Statistical Process Control 1.1.1 Roots and Evolution of Statistical Process Control 1.1.2 Principles of Statistical Process Control 1.1.3 Hypothesis Testing, Type I and II Errors 1.2 Why Multivariate Statistical Process Control 1.2.1 Statistically Uncorrelated Variables 1.2.2 Perfectly Correlated Variables 1.2.3 Highly Correlated Variables 1.2.4 Type I and II Errors and Dimension Reduction 1.3 Tutorial Session 2 Multivariate Data Modeling Methods 2.1 Principal Component Analysis 2.1.1 Assumptions for Underlying Data Structure 2.1.2 Geometric Analysis of Data Structure 2.1.3 A Simulation Example 2.2 Partial Least Squares 2.2.1 Assumptions for Underlying Data Structure 2.2.2 Deflation Procedure for Estimating Data Models 2.2.3 A Simulation Example 2.3 Maximum Redundancy Partial Least Squares 2.3.1 Assumptions for Underlying Data Structure 2.3.2 Source Signal Estimation 2.3.3 Geometric Analysis of Data Structure 2.3.4 A Simulation Example 2.4 Estimating the Number of Source Signals 2.4.1 Stopping Rules for PCA Models 2.4.2 Stopping Rules for PLS Models 2.5 Tutorial Session 3 Process Monitoring Charts 3.1 Fault Detection 3.1.1 Scatter Diagrams 3.1.2 Nonnegative Quadratic Monitoring Statistics 3.2 Fault Isolation and Identification 3.2.1 Contribution Charts 3.2.2 Residual-Based Tests 3.2.3 Variable Reconstruction 3.3 Geometry of Variable Projections 3.3.1 Linear Dependency of Projection Residuals 3.3.2 Geometric Analysis of Variable Reconstruction 3.4 Tutorial Session II Application Studies 4 Application to a Chemical Reaction Process 4.1 Process Description 4.2 Identification of a Monitoring Model 4.3 Diagnosis of a Fault Condition 5 Application to a Distillation Process 5.1 Process Description 5.2 Identification of a Monitoring Model 5.3 Diagnosis of a Fault Condition III Advances in Multivariate Statistical Process Control 6 Further Modeling Issues 6.1 Accuracy of Estimating PCA Models 6.1.1 Revisiting the Eigendecomposition of Sz0z0 6.1.2 Two Illustrative Examples 6.1.3 Maximum Likelihood PCA for Known Sgg 6.1.4 Maximum Likelihood PCA for Unknown Sgg 6.1.5 A Simulation Example 6.1.6 A Stopping Rule for Maximum Likelihood PCA Models 6.1.7 Properties of Model and Residual Subspace Estimates 6.1.8 Application to a Chemical Reaction Process - Revisited 6.2 Accuracy of Estimating PLS Models 6.2.1 Bias and Variance of Parameter Estimation 6.2.2 Comparing Accuracy of PLS and OLS Regression Models 6.2.3 Impact of Error-in-Variables Structure upon PLS Models 6.2.4 Error-in-Variable Estimate for Known See 6.2.5 Error-in-Variable Estimate for Unknown See 6.2.6 Application to a Distillation Process - Revisited 6.3 Robust Model Estimation 6.3.1 Robust Parameter Estimation 6.3.2 Trimming Approaches 6.4 Small Sample Sets 6.5 Tutorial Session 7 Monitoring Multivariate Time-Varying Processes 7.1 Problem Analysis 7.2 Recursive Principal Component Analysis 7.3 Moving Window Principal Component Analysis 7.3.1 Adapting the Data Correlation Matrix 7.3.2 Adapting the Eigendecomposition 7.3.3 Computational Analysis of the Adaptation Procedure 7.3.4 Adaptation of Control Limits 7.3.5 Process Monitoring using an Application Delay 7.3.6 Minimum Window Length 7.4 A Simulation Example 7.4.1 Data Generation 7.4.2 Application of PCA 7.4.3 Utilizing MWPCA based on an Application Delay 7.5 Application to a Fluid Catalytic Cracking Unit 7.5.1 Process Description 7.5.2 Data Generation 7.5.3 Pre-analysis of Simulated Data 7.5.4 Application of PCA 7.5.5 Application of MWPCA 7.6 Application to a Furnace Process 7.6.1 Process Description 7.6.2 Description of Sensor Bias 7.6.3 Application of PCA 7.6.4 Utilizing MWPCA based on an Application Delay 7.7 Adaptive Partial Least Squares 7.7.1 Recursive Adaptation of Sx0x0 and Sx0y0 7.7.2 Moving Window Adaptation of Sv0v0 and Sv0y0 7.7.3 Adapting The Number of Source Signals 7.7.4 Adaptation of the PLS Model 7.8 Tutorial Session 8 Monitoring Changes in Covariance Structure 8.1 Problem Analysis 8.1.1 First Intuitive Example 8.1.2 Generic Statistical Analysis 8.1.3 Second Intuitive Example 8.2 Preliminary Discussion of Related Techniques 8.3 Definition of Primary and Improved Residuals 8.3.1 Primary Residuals for Eigenvectors 8.3.2 Primary Residuals for Eigenvalues 8.3.3 Comparing both Types of Primary Residuals 8.3.4 Statistical Properties of Primary Residuals 8.3.5 Improved Residuals for Eigenvalues 8.4 Revisiting the Simulation Examples in Section 8.1 8.4.1 First Simulation Example 8.4.2 Second Simulation Example 8.5 Fault Isolation and Identification 8.5.1 Diagnosis of Step-Type Fault Conditions 8.5.2 Diagnosis of General Deterministic Fault Conditions 8.5.3 A Simulation Example 8.6 Application Study to a Gearbox System 8.6.1 Process Description 8.6.2 Fault Description 8.6.3 Identification of a Monitoring Model 8.6.4 Detecting a Fault Condition 8.7 Analysis of Primary and Improved Residuals 8.7.1 Central Limit Theorem 8.7.2 Further Statistical Properties of Primary Residuals 8.7.3 Sensitivity of Statistics based on Improved Residuals 8.8 Tutorial Session IV Description of Modeling Methods 9 Principal Component Analysis 9.1 The Core Algorithm 9.2 Summary of the PCA Algorithm 9.3 Properties of a PCA Model 10 Partial Least Squares 10.1 Preliminaries 10.2 The Core Algorithm 10.3 Summary of the PLS Algorithm 10.4 Properties of PLS 10.5 Properties of Maximum Redundancy PLS References Index.
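Part I of the contents describes nonnegative quadratic monitoring statistics built from a PCA model. The sketch below is an illustration with made-up reference data rather than the book's own algorithms: it fits a PCA model to in-control data and evaluates Hotelling's T² and the squared prediction error (SPE, or Q statistic) for a new observation.

import numpy as np

# Illustrative sketch only: PCA-based monitoring statistics (T^2 and SPE).
# Reference data, number of retained components and the new sample are invented.
rng = np.random.default_rng(3)
X = rng.standard_normal((200, 6))            # in-control reference data, n x p
Xc = X - X.mean(axis=0)                      # mean-centred (scaling omitted)

S = np.cov(Xc, rowvar=False)
eigval, eigvec = np.linalg.eigh(S)
order = np.argsort(eigval)[::-1]             # sort eigenpairs in decreasing order
eigval, eigvec = eigval[order], eigvec[:, order]

k = 3                                        # retained principal components
P = eigvec[:, :k]                            # loading matrix

x_new = rng.standard_normal(6)               # a new, already centred observation
t = P.T @ x_new                              # scores in the model subspace
T2 = np.sum(t ** 2 / eigval[:k])             # Hotelling's T^2 statistic
residual = x_new - P @ t
SPE = residual @ residual                    # squared prediction error (Q statistic)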
Record No. | UNINA-9910139093203321
Held at: Univ. Federico II
Statistical monitoring of complex multivariate processes : with applications in industrial process control / Uwe Kruger and Lei Xie
Author | Krüger Uwe, Dr.
Edition | [1st edition]
Publication/distribution | Chichester [England] ; Hoboken, N.J. : Wiley, 2012
Physical description | 1 online resource (472 p.)
Discipline | 519.5/35
Other authors (Persons) | Xie Lei
Series | Statistics in practice
Topical subject | Multivariate analysis
ISBN | 1-283-54977-8 ; 9786613862228 ; 0-470-51725-5 ; 0-470-51724-7 ; 1-118-38126-2
Classification | MAT029020
Format | Printed material
Bibliographic level | Monograph
Language of publication | eng
Contents note | Machine generated contents note: Preface Introduction I Fundamentals of Multivariate Statistical Process Control 1 Motivation for Multivariate Statistical Process Control 1.1 Summary of Statistical Process Control 1.1.1 Roots and Evolution of Statistical Process Control 1.1.2 Principles of Statistical Process Control 1.1.3 Hypothesis Testing, Type I and II Errors 1.2 Why Multivariate Statistical Process Control 1.2.1 Statistically Uncorrelated Variables 1.2.2 Perfectly Correlated Variables 1.2.3 Highly Correlated Variables 1.2.4 Type I and II Errors and Dimension Reduction 1.3 Tutorial Session 2 Multivariate Data Modeling Methods 2.1 Principal Component Analysis 2.1.1 Assumptions for Underlying Data Structure 2.1.2 Geometric Analysis of Data Structure 2.1.3 A Simulation Example 2.2 Partial Least Squares 2.2.1 Assumptions for Underlying Data Structure 2.2.2 Deflation Procedure for Estimating Data Models 2.2.3 A Simulation Example 2.3 Maximum Redundancy Partial Least Squares 2.3.1 Assumptions for Underlying Data Structure 2.3.2 Source Signal Estimation 2.3.3 Geometric Analysis of Data Structure 2.3.4 A Simulation Example 2.4 Estimating the Number of Source Signals 2.4.1 Stopping Rules for PCA Models 2.4.2 Stopping Rules for PLS Models 2.5 Tutorial Session 3 Process Monitoring Charts 3.1 Fault Detection 3.1.1 Scatter Diagrams 3.1.2 Nonnegative Quadratic Monitoring Statistics 3.2 Fault Isolation and Identification 3.2.1 Contribution Charts 3.2.2 Residual-Based Tests 3.2.3 Variable Reconstruction 3.3 Geometry of Variable Projections 3.3.1 Linear Dependency of Projection Residuals 3.3.2 Geometric Analysis of Variable Reconstruction 3.4 Tutorial Session II Application Studies 4 Application to a Chemical Reaction Process 4.1 Process Description 4.2 Identification of a Monitoring Model 4.3 Diagnosis of a Fault Condition 5 Application to a Distillation Process 5.1 Process Description 5.2 Identification of a Monitoring Model 5.3 Diagnosis of a Fault Condition III Advances in Multivariate Statistical Process Control 6 Further Modeling Issues 6.1 Accuracy of Estimating PCA Models 6.1.1 Revisiting the Eigendecomposition of Sz0z0 6.1.2 Two Illustrative Examples 6.1.3 Maximum Likelihood PCA for Known Sgg 6.1.4 Maximum Likelihood PCA for Unknown Sgg 6.1.5 A Simulation Example 6.1.6 A Stopping Rule for Maximum Likelihood PCA Models 6.1.7 Properties of Model and Residual Subspace Estimates 6.1.8 Application to a Chemical Reaction Process - Revisited 6.2 Accuracy of Estimating PLS Models 6.2.1 Bias and Variance of Parameter Estimation 6.2.2 Comparing Accuracy of PLS and OLS Regression Models 6.2.3 Impact of Error-in-Variables Structure upon PLS Models 6.2.4 Error-in-Variable Estimate for Known See 6.2.5 Error-in-Variable Estimate for Unknown See 6.2.6 Application to a Distillation Process - Revisited 6.3 Robust Model Estimation 6.3.1 Robust Parameter Estimation 6.3.2 Trimming Approaches 6.4 Small Sample Sets 6.5 Tutorial Session 7 Monitoring Multivariate Time-Varying Processes 7.1 Problem Analysis 7.2 Recursive Principal Component Analysis 7.3 Moving Window Principal Component Analysis 7.3.1 Adapting the Data Correlation Matrix 7.3.2 Adapting the Eigendecomposition 7.3.3 Computational Analysis of the Adaptation Procedure 7.3.4 Adaptation of Control Limits 7.3.5 Process Monitoring using an Application Delay 7.3.6 Minimum Window Length 7.4 A Simulation Example 7.4.1 Data Generation 7.4.2 Application of PCA 7.4.3 Utilizing MWPCA based on an Application Delay 7.5 Application to a Fluid Catalytic Cracking Unit 7.5.1 Process Description 7.5.2 Data Generation 7.5.3 Pre-analysis of Simulated Data 7.5.4 Application of PCA 7.5.5 Application of MWPCA 7.6 Application to a Furnace Process 7.6.1 Process Description 7.6.2 Description of Sensor Bias 7.6.3 Application of PCA 7.6.4 Utilizing MWPCA based on an Application Delay 7.7 Adaptive Partial Least Squares 7.7.1 Recursive Adaptation of Sx0x0 and Sx0y0 7.7.2 Moving Window Adaptation of Sv0v0 and Sv0y0 7.7.3 Adapting The Number of Source Signals 7.7.4 Adaptation of the PLS Model 7.8 Tutorial Session 8 Monitoring Changes in Covariance Structure 8.1 Problem Analysis 8.1.1 First Intuitive Example 8.1.2 Generic Statistical Analysis 8.1.3 Second Intuitive Example 8.2 Preliminary Discussion of Related Techniques 8.3 Definition of Primary and Improved Residuals 8.3.1 Primary Residuals for Eigenvectors 8.3.2 Primary Residuals for Eigenvalues 8.3.3 Comparing both Types of Primary Residuals 8.3.4 Statistical Properties of Primary Residuals 8.3.5 Improved Residuals for Eigenvalues 8.4 Revisiting the Simulation Examples in Section 8.1 8.4.1 First Simulation Example 8.4.2 Second Simulation Example 8.5 Fault Isolation and Identification 8.5.1 Diagnosis of Step-Type Fault Conditions 8.5.2 Diagnosis of General Deterministic Fault Conditions 8.5.3 A Simulation Example 8.6 Application Study to a Gearbox System 8.6.1 Process Description 8.6.2 Fault Description 8.6.3 Identification of a Monitoring Model 8.6.4 Detecting a Fault Condition 8.7 Analysis of Primary and Improved Residuals 8.7.1 Central Limit Theorem 8.7.2 Further Statistical Properties of Primary Residuals 8.7.3 Sensitivity of Statistics based on Improved Residuals 8.8 Tutorial Session IV Description of Modeling Methods 9 Principal Component Analysis 9.1 The Core Algorithm 9.2 Summary of the PCA Algorithm 9.3 Properties of a PCA Model 10 Partial Least Squares 10.1 Preliminaries 10.2 The Core Algorithm 10.3 Summary of the PLS Algorithm 10.4 Properties of PLS 10.5 Properties of Maximum Redundancy PLS References Index.
Record No. | UNINA-9910812539303321
Held at: Univ. Federico II