Condition monitoring with vibration signals : compressive sampling and learning algorithms for rotating machines / Hosameldin Ahmed and Asoke K. Nandi
Author Ahmed Hosameldin <1976->
Publication/distribution Hoboken, New Jersey, USA : John Wiley & Sons, Inc., 2020
Physical description 1 online resource (437 pages)
Discipline 621.80287
Topical subject Machinery - Monitoring
ISBN 1-119-54464-5
1-119-54467-X
1-119-54463-7
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Preface xvii -- About the Authors xxi -- List of Abbreviations xxiii -- Part I Introduction 1 -- 1 Introduction to Machine Condition Monitoring 3 -- 1.1 Background 3 -- 1.2 Maintenance Approaches for Rotating Machines Failures 4 -- 1.2.1 Corrective Maintenance 4 -- 1.2.2 Preventive Maintenance 5 -- 1.2.2.1 Time-Based Maintenance (TBM) 5 -- 1.2.2.2 Condition-Based Maintenance (CBM) 5 -- 1.3 Applications of MCM 5 -- 1.3.1 Wind Turbines 5 -- 1.3.2 Oil and Gas 6 -- 1.3.3 Aerospace and Defence Industry 6 -- 1.3.4 Automotive 7 -- 1.3.5 Marine Engines 7 -- 1.3.6 Locomotives 7 -- 1.4 Condition Monitoring Techniques 7 -- 1.4.1 Vibration Monitoring 7 -- 1.4.2 Acoustic Emission 8 -- 1.4.3 Fusion of Vibration and Acoustic 8 -- 1.4.4 Motor Current Monitoring 8 -- 1.4.5 Oil Analysis and Lubrication Monitoring 8 -- 1.4.6 Thermography 9 -- 1.4.7 Visual Inspection 9 -- 1.4.8 Performance Monitoring 9 -- 1.4.9 Trend Monitoring 10 -- 1.5 Topic Overview and Scope of the Book 10 -- 1.6 Summary 11 -- References 11 -- 2 Principles of Rotating Machine Vibration Signals 17 -- 2.1 Introduction 17 -- 2.2 Machine Vibration Principles 17 -- 2.3 Sources of Rotating Machines Vibration Signals 20 -- 2.3.1 Rotor Mass Unbalance 21 -- 2.3.2 Misalignment 21 -- 2.3.3 Cracked Shafts 21 -- 2.3.4 Rolling Element Bearings 23 -- 2.3.5 Gears 25 -- 2.4 Types of Vibration Signals 25 -- 2.4.1 Stationary 26 -- 2.4.2 Nonstationary 26 -- 2.5 Vibration Signal Acquisition 26 -- 2.5.1 Displacement Transducers 26 -- 2.5.2 Velocity Transducers 26 -- 2.5.3 Accelerometers 27 -- 2.6 Advantages and Limitations of Vibration Signal Monitoring 27 -- 2.7 Summary 28 -- References 28 -- Part II Vibration Signal Analysis Techniques 31 -- 3 Time Domain Analysis 33 -- 3.1 Introduction 33 -- 3.1.1 Visual Inspection 33 -- 3.1.2 Features-Based Inspection 35 -- 3.2 Statistical Functions 35 -- 3.2.1 Peak Amplitude 36 -- 3.2.2 Mean Amplitude 36 -- 3.2.3 Root Mean Square Amplitude 36 -- 3.2.4 Peak-to-Peak Amplitude 36 --
3.2.5 Crest Factor (CF) 36 --
3.2.6 Variance and Standard Deviation 37 -- 3.2.7 Standard Error 37 -- 3.2.8 Zero Crossing 38 -- 3.2.9 Wavelength 39 -- 3.2.10 Willison Amplitude 39 -- 3.2.11 Slope Sign Change 39 -- 3.2.12 Impulse Factor 39 -- 3.2.13 Margin Factor 40 -- 3.2.14 Shape Factor 40 -- 3.2.15 Clearance Factor 40 -- 3.2.16 Skewness 40 -- 3.2.17 Kurtosis 40 -- 3.2.18 Higher-Order Cumulants (HOCs) 41 -- 3.2.19 Histograms 42 -- 3.2.20 Normal/Weibull Negative Log-Likelihood Value 42 -- 3.2.21 Entropy 42 -- 3.3 Time Synchronous Averaging 44 -- 3.3.1 TSA Signals 44 -- 3.3.2 Residual Signal (RES) 44 -- 3.3.2.1 NA4 44 -- 3.3.2.2 NA4* 45 -- 3.3.3 Difference Signal (DIFS) 45 -- 3.3.3.1 FM4 46 -- 3.3.3.2 M6A 46 -- 3.3.3.3 M8A 46 -- 3.4 Time Series Regressive Models 46 -- 3.4.1 AR Model 47 -- 3.4.2 MA Model 48 -- 3.4.3 ARMA Model 48 -- 3.4.4 ARIMA Model 48 -- 3.5 Filter-Based Methods 49 -- 3.5.1 Demodulation 49 -- 3.5.2 Prony Model 52 -- 3.5.3 Adaptive Noise Cancellation (ANC) 53 -- 3.6 Stochastic Parameter Techniques 54 -- 3.7 Blind Source Separation (BSS) 54 -- 3.8 Summary 55 -- References 56 -- 4 Frequency Domain Analysis 63 -- 4.1 Introduction 63 -- 4.2 Fourier Analysis 64 -- 4.2.1 Fourier Series 64 -- 4.2.2 Discrete Fourier Transform 66 -- 4.2.3 Fast Fourier Transform (FFT) 67 -- 4.3 Envelope Analysis 71 -- 4.4 Frequency Spectrum Statistical Features 73 -- 4.4.1 Arithmetic Mean 73 -- 4.4.2 Geometric Mean 73 -- 4.4.3 Matched Filter RMS 73 -- 4.4.4 The RMS of Spectral Difference 74 -- 4.4.5 The Sum of Squares Spectral Difference 74 -- 4.4.6 High-Order Spectra Techniques 74 -- 4.5 Summary 75 -- References 76 -- 5 Time-Frequency Domain Analysis 79 -- 5.1 Introduction 79 -- 5.2 Short-Time Fourier Transform (STFT) 79 -- 5.3 Wavelet Analysis 82 -- 5.3.1 Wavelet Transform (WT) 82 -- 5.3.1.1 Continuous Wavelet Transform (CWT) 83 -- 5.3.1.2 Discrete Wavelet Transform (DWT) 85 -- 5.3.2 Wavelet Packet Transform (WPT) 89 -- 5.4 Empirical Mode Decomposition (EMD) 91 -- 5.5 Hilbert-Huang Transform (HHT) 94 -- 
5.6 Wigner-Ville Distribution 96 --
5.7 Local Mean Decomposition (LMD) 98 -- 5.8 Kurtosis and Kurtograms 100 -- 5.9 Summary 105 -- References 106 -- Part III Rotating Machine Condition Monitoring Using Machine Learning 115 -- 6 Vibration-Based Condition Monitoring Using Machine Learning 117 -- 6.1 Introduction 117 -- 6.2 Overview of the Vibration-Based MCM Process 118 -- 6.2.1 Fault-Detection and -Diagnosis Problem Framework 118 -- 6.3 Learning from Vibration Data 122 -- 6.3.1 Types of Learning 123 -- 6.3.1.1 Batch vs. Online Learning 123 -- 6.3.1.2 Instance-Based vs. Model-Based Learning 123 -- 6.3.1.3 Supervised Learning vs. Unsupervised Learning 123 -- 6.3.1.4 Semi-Supervised Learning 123 -- 6.3.1.5 Reinforcement Learning 124 -- 6.3.1.6 Transfer Learning 124 -- 6.3.2 Main Challenges of Learning from Vibration Data 125 -- 6.3.2.1 The Curse of Dimensionality 125 -- 6.3.2.2 Irrelevant Features 126 -- 6.3.2.3 Environment and Operating Conditions of a Rotating Machine 126 -- 6.3.3 Preparing Vibration Data for Analysis 126 -- 6.3.3.1 Normalisation 126 -- 6.3.3.2 Dimensionality Reduction 127 -- 6.4 Summary 128 -- References 128 -- 7 Linear Subspace Learning 131 -- 7.1 Introduction 131 -- 7.2 Principal Component Analysis (PCA) 132 -- 7.2.1 PCA Using Eigenvector Decomposition 132 -- 7.2.2 PCA Using SVD 133 -- 7.2.3 Application of PCA in Machine Fault Diagnosis 134 -- 7.3 Independent Component Analysis (ICA) 137 -- 7.3.1 Minimisation of Mutual Information 138 -- 7.3.2 Maximisation of the Likelihood 138 -- 7.3.3 Application of ICA in Machine Fault Diagnosis 139 -- 7.4 Linear Discriminant Analysis (LDA) 141 -- 7.4.1 Application of LDA in Machine Fault Diagnosis 142 -- 7.5 Canonical Correlation Analysis (CCA) 143 -- 7.6 Partial Least Squares (PLS) 145 -- 7.7 Summary 146 -- References 147 -- 8 Nonlinear Subspace Learning 153 -- 8.1 Introduction 153 -- 8.2 Kernel Principal Component Analysis (KPCA) 153 -- 8.2.1 Application of KPCA in Machine Fault Diagnosis 156 -- 8.3 Isometric Feature Mapping (ISOMAP) 156 -- 
8.3.1 Application of ISOMAP in Machine Fault Diagnosis 158 --
8.4 Diffusion Maps (DMs) and Diffusion Distances 159 -- 8.4.1 Application of DMs in Machine Fault Diagnosis 160 -- 8.5 Laplacian Eigenmap (LE) 161 -- 8.5.1 Application of the LE in Machine Fault Diagnosis 161 -- 8.6 Local Linear Embedding (LLE) 162 -- 8.6.1 Application of LLE in Machine Fault Diagnosis 163 -- 8.7 Hessian-Based LLE 163 -- 8.7.1 Application of HLLE in Machine Fault Diagnosis 164 -- 8.8 Local Tangent Space Alignment Analysis (LTSA) 165 -- 8.8.1 Application of LTSA in Machine Fault Diagnosis 165 -- 8.9 Maximum Variance Unfolding (MVU) 166 -- 8.9.1 Application of MVU in Machine Fault Diagnosis 167 -- 8.10 Stochastic Proximity Embedding (SPE) 168 -- 8.10.1 Application of SPE in Machine Fault Diagnosis 168 -- 8.11 Summary 169 -- References 170 -- 9 Feature Selection 173 -- 9.1 Introduction 173 -- 9.2 Filter Model-Based Feature Selection 175 -- 9.2.1 Fisher Score (FS) 176 -- 9.2.2 Laplacian Score (LS) 177 -- 9.2.3 Relief and Relief-F Algorithms 178 -- 9.2.3.1 Relief Algorithm 178 -- 9.2.3.2 Relief-F Algorithm 179 -- 9.2.4 Pearson Correlation Coefficient (PCC) 180 -- 9.2.5 Information Gain (IG) and Gain Ratio (GR) 180 -- 9.2.6 Mutual Information (MI) 181 -- 9.2.7 Chi-Squared (Chi-2) 181 -- 9.2.8 Wilcoxon Ranking 181 -- 9.2.9 Application of Feature Ranking in Machine Fault Diagnosis 182 -- 9.3 Wrapper Model-Based Feature Subset Selection 185 -- 9.3.1 Sequential Selection Algorithms 185 -- 9.3.2 Heuristic-Based Selection Algorithms 185 -- 9.3.2.1 Ant Colony Optimisation (ACO) 185 -- 9.3.2.2 Genetic Algorithms (GAs) and Genetic Programming 187 -- 9.3.2.3 Particle Swarm Optimisation (PSO) 188 -- 9.3.3 Application of Wrapper Model-Based Feature Subset Selection in Machine Fault Diagnosis 189 -- 9.4 Embedded Model-Based Feature Selection 192 -- 9.5 Summary 193 -- References 194 -- Part IV Classification Algorithms 199 -- 10 Decision Trees and Random Forests 201 -- 10.1 Introduction 201 -- 10.2 Decision Trees 202 -- 10.2.1 Univariate Splitting Criteria 204 --
10.2.1.1 Gini Index 205 --
10.2.1.2 Information Gain 206 -- 10.2.1.3 Distance Measure 207 -- 10.2.1.4 Orthogonal Criterion (ORT) 207 -- 10.2.2 Multivariate Splitting Criteria 207 -- 10.2.3 Tree-Pruning Methods 208 -- 10.2.3.1 Error-Complexity Pruning 208 -- 10.2.3.2 Minimum-Error Pruning 209 -- 10.2.3.3 Reduced-Error Pruning 209 -- 10.2.3.4 Critical-Value Pruning 210 -- 10.2.3.5 Pessimistic Pruning 210 -- 10.2.3.6 Minimum Description Length (MDL) Pruning 210 -- 10.2.4 Decision Tree Inducers 211 -- 10.2.4.1 CART 211 -- 10.2.4.2 ID3 211 -- 10.2.4.3 C4.5 211 -- 10.2.4.4 CHAID 212 -- 10.3 Decision Forests 212 -- 10.4 Application of Decision Trees/Forests in Machine Fault Diagnosis 213 -- 10.5 Summary 217 -- References 217 -- 11 Probabilistic Classification Methods 225 -- 11.1 Introduction 225 -- 11.2 Hidden Markov Model 225 -- 11.2.1 Application of Hidden Markov Models in Machine Fault Diagnosis 228 -- 11.3 Logistic Regression Model 230 -- 11.3.1 Logistic Regression Regularisation 232 -- 11.3.2 Multinomial Logistic Regression Model (MLR) 232 -- 11.3.3 Application of Logistic Regression in Machine Fault Diagnosis 233 -- 11.4 Summary 234 -- References 235 -- 12 Artificial Neural Networks (ANNs) 239 -- 12.1 Introduction 239 -- 12.2 Neural Network Basic Principles 240 -- 12.2.1 The Multilayer Perceptron 241 -- 12.2.2 The Radial Basis Function Network 243 -- 12.2.3 The Kohonen Network 244 -- 12.3 Application of Artificial Neural Networks in Machine Fault Diagnosis 245 -- 12.4 Summary 253 -- References 254 -- 13 Support Vector Machines (SVMs) 259 -- 13.1 Introduction 259 -- 13.2 Multiclass SVMs 262 -- 13.3 Selection of Kernel Parameters 263 -- 13.4 Application of SVMs in Machine Fault Diagnosis 263 -- 13.5 Summary 274 -- References 274 -- 14 Deep Learning 279 -- 14.1 Introduction 279 -- 14.2 Autoencoders 280 -- 14.3 Convolutional Neural Networks (CNNs) 283 -- 14.4 Deep Belief Networks (DBNs) 284 -- 14.5 Recurrent Neural Networks (RNNs) 285 -- 14.6 Overview of Deep Learning in MCM 286 -- 14.6.1 
Application of AE-based DNNs in Machine Fault Diagnosis 286 --
14.6.2 Application of CNNs in Machine Fault Diagnosis 292 -- 14.6.3 Application of DBNs in Machine Fault Diagnosis 296 -- 14.6.4 Application of RNNs in Machine Fault Diagnosis 298 -- 14.7 Summary 299 -- References 301 -- 15 Classification Algorithm Validation 307 -- 15.1 Introduction 307 -- 15.2 The Hold-Out Technique 308 -- 15.2.1 Three-Way Data Split 309 -- 15.3 Random Subsampling 309 -- 15.4 K-Fold Cross-Validation 310 -- 15.5 Leave-One-Out Cross-Validation 311 -- 15.6 Bootstrapping 311 -- 15.7 Overall Classification Accuracy 312 -- 15.8 Confusion Matrix 313 -- 15.9 Recall and Precision 314 -- 15.10 ROC Graphs 315 -- 15.11 Summary 317 -- References 318 -- Part V New Fault Diagnosis Frameworks Designed for MCM 321 -- 16 Compressive Sampling and Subspace Learning (CS-SL) 323 -- 16.1 Introduction 323 -- 16.2 Compressive Sampling for Vibration-Based MCM 325 -- 16.2.1 Compressive Sampling Basics 325 -- 16.2.2 CS for Sparse Frequency Representation 328 -- 16.2.3 CS for Sparse Time-Frequency Representation 329 -- 16.3 Overview of CS in Machine Condition Monitoring 330 -- 16.3.1 Compressed Sensed Data Followed by Complete Data Construction 330 -- 16.3.2 Compressed Sensed Data Followed by Incomplete Data Construction 331 -- 16.3.3 Compressed Sensed Data as the Input of a Classifier 332 -- 16.3.4 Compressed Sensed Data Followed by Feature Learning 333 -- 16.4 Compressive Sampling and Feature Ranking (CS-FR) 333 -- 16.4.1 Implementations 334 -- 16.4.1.1 CS-LS 336 -- 16.4.1.2 CS-FS 336 -- 16.4.1.3 CS-Relief-F 337 -- 16.4.1.4 CS-PCC 338 -- 16.4.1.5 CS-Chi-2 338 -- 16.5 CS and Linear Subspace Learning-Based Framework for Fault Diagnosis 339 -- 16.5.1 Implementations 339 -- 16.5.1.1 CS-PCA 339 -- 16.5.1.2 CS-LDA 340 -- 16.5.1.3 CS-CPDC 341 -- 16.6 CS and Nonlinear Subspace Learning-Based Framework for Fault Diagnosis 343 -- 16.6.1 Implementations 344 -- 16.6.1.1 CS-KPCA 344 -- 16.6.1.2 CS-KLDA 345 -- 16.6.1.3 CS-CMDS 346 -- 16.6.1.4 CS-SPE 346 -- 16.7 Applications 348 -- 
16.7.1 Case Study 1 348 --
16.7.1.1 The Combination of MMV-CS and Several Feature-Ranking Techniques 350 -- 16.7.1.2 The Combination of MMV-CS and Several Linear and Nonlinear Subspace Learning Techniques 352 -- 16.7.2 Case Study 2 354 -- 16.7.2.1 The Combination of MMV-CS and Several Feature-Ranking Techniques 354 -- 16.7.2.2 The Combination of MMV-CS and Several Linear and Nonlinear Subspace Learning Techniques 355 -- 16.8 Discussion 355 -- References 357 -- 17 Compressive Sampling and Deep Neural Network (CS-DNN) 361 -- 17.1 Introduction 361 -- 17.2 Related Work 361 -- 17.3 CS-SAE-DNN 362 -- 17.3.1 Compressed Measurements Generation 362 -- 17.3.2 CS Model Testing Using the Flip Test 363 -- 17.3.3 DNN-Based Unsupervised Sparse Overcomplete Feature Learning 363 -- 17.3.4 Supervised Fine Tuning 367 -- 17.4 Applications 367 -- 17.4.1 Case Study 1 367 -- 17.4.2 Case Study 2 372 -- 17.5 Discussion 375 -- References 375 -- 18 Conclusion 379 -- 18.1 Introduction 379 -- 18.2 Summary and Conclusion 380 -- Appendix Machinery Vibration Data Resources and Analysis Algorithms 389 -- References 394 -- Index 395.
Record Nr. UNINA-9910555291703321
Held at: Univ. Federico II
Record Nr. UNINA-9910813338803321
Held at: Univ. Federico II
Intelligent Condition Based Monitoring : For Turbines, Compressors, and Other Rotating Machines / by Nishchal K. Verma, Al Salour
Author Verma Nishchal K
Edition [1st ed. 2020.]
Publication/distribution Singapore : Springer Singapore : Imprint: Springer, 2020
Physical description 1 online resource (XXX, 302 p.)
Discipline 621.80287
Series Studies in Systems, Decision and Control
Topical subject Machinery
Computational intelligence
Quality control
Reliability
Industrial safety
Machinery and Machine Elements
Computational Intelligence
Quality Control, Reliability, Safety and Risk
ISBN 981-15-0512-8
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Introduction -- Faults And Data Acquisition -- Preprocessing -- Feature Extraction -- Feature Selection -- Fault Recognition -- Fault Diagnosis System For Air Compressor Using Palmtop -- Improved Fault Detection Model -- Fault Diagnosis System Using Smartphone -- References.
Record Nr. UNINA-9910373900603321
Held at: Univ. Federico II
Strumenti statistici per la meccanica sperimentale e l'affidabilità / G. Belingardi
Author Belingardi, Giovanni
Publication/distribution Torino : Libreria editrice universitaria Levrotto & Bella, [1996]
Physical description 262 p. ; 24 cm
Discipline 621.8028
621.80287
Series Collana di progettazione delle macchine
Topical subject Machines - Reliability - Testing - Statistical methods
Format Printed material
Bibliographic level Monograph
Language of publication ita
Record Nr. UNISANNIO-RMS0039155
Held at: Univ. del Sannio