Pattern classification using ensemble methods [electronic resource] / Lior Rokach.
Singapore ; Hackensack, NJ : World Scientific, c2010.
1 online resource (242 p.)
Series: Series in machine perception and artificial intelligence ; v. 75.

ISBN: 981-4271-07-1 (electronic); 981-4271-06-3 (print); 1-282-75785-7; 9786612757853.
OCLC: 859886714.

Notes: Description based upon print version of record. Includes bibliographical references (p. 185-222) and index.

Contents:
Preface
1. Introduction to Pattern Classification
  1.1 Pattern Classification
  1.2 Induction Algorithms
  1.3 Rule Induction
  1.4 Decision Trees
  1.5 Bayesian Methods
    1.5.1 Overview
    1.5.2 Naïve Bayes
      1.5.2.1 The Basic Naïve Bayes Classifier
      1.5.2.2 Naïve Bayes Induction for Numeric Attributes
      1.5.2.3 Correction to the Probability Estimation
      1.5.2.4 Laplace Correction
      1.5.2.5 No Match
    1.5.3 Other Bayesian Methods
  1.6 Other Induction Methods
    1.6.1 Neural Networks
    1.6.2 Genetic Algorithms
    1.6.3 Instance-based Learning
    1.6.4 Support Vector Machines
2. Introduction to Ensemble Learning
  2.1 Back to the Roots
  2.2 The Wisdom of Crowds
  2.3 The Bagging Algorithm
  2.4 The Boosting Algorithm
  2.5 The AdaBoost Algorithm
  2.6 No Free Lunch Theorem and Ensemble Learning
  2.7 Bias-Variance Decomposition and Ensemble Learning
  2.8 Occam's Razor and Ensemble Learning
  2.9 Classifier Dependency
    2.9.1 Dependent Methods
      2.9.1.1 Model-guided Instance Selection
      2.9.1.2 Basic Boosting Algorithms
      2.9.1.3 Advanced Boosting Algorithms
      2.9.1.4 Incremental Batch Learning
    2.9.2 Independent Methods
      2.9.2.1 Bagging
      2.9.2.2 Wagging
      2.9.2.3 Random Forest and Random Subspace Projection
      2.9.2.4 Non-Linear Boosting Projection (NLBP)
      2.9.2.5 Cross-validated Committees
      2.9.2.6 Robust Boosting
  2.10 Ensemble Methods for Advanced Classification Tasks
    2.10.1 Cost-Sensitive Classification
    2.10.2 Ensemble for Learning Concept Drift
    2.10.3 Reject Driven Classification
3. Ensemble Classification
  3.1 Fusion Methods
    3.1.1 Weighting Methods
    3.1.2 Majority Voting
    3.1.3 Performance Weighting
    3.1.4 Distribution Summation
    3.1.5 Bayesian Combination
    3.1.6 Dempster-Shafer
    3.1.7 Vogging
    3.1.8 Naïve Bayes
    3.1.9 Entropy Weighting
    3.1.10 Density-based Weighting
    3.1.11 DEA Weighting Method
    3.1.12 Logarithmic Opinion Pool
    3.1.13 Order Statistics
  3.2 Selecting Classification
    3.2.1 Partitioning the Instance Space
      3.2.1.1 The K-Means Algorithm as a Decomposition Tool
      3.2.1.2 Determining the Number of Subsets
      3.2.1.3 The Basic K-Classifier Algorithm
      3.2.1.4 The Heterogeneity Detecting K-Classifier (HDK-Classifier)
      3.2.1.5 Running-Time Complexity
  3.3 Mixture of Experts and Meta Learning
    3.3.1 Stacking
    3.3.2 Arbiter Trees
    3.3.3 Combiner Trees
    3.3.4 Grading
    3.3.5 Gating Network
4. Ensemble Diversity
  4.1 Overview
  4.2 Manipulating the Inducer
    4.2.1 Manipulation of the Inducer's Parameters
    4.2.2 Starting Point in Hypothesis Space
    4.2.3 Hypothesis Space Traversal
  4.3 Manipulating the Training Samples
    4.3.1 Resampling
    4.3.2 Creation
    4.3.3 Partitioning
  4.4 Manipulating the Target Attribute Representation
    4.4.1 Label Switching
  4.5 Partitioning the Search Space
    4.5.1 Divide and Conquer
    4.5.2 Feature Subset-based Ensemble Methods
      4.5.2.1 Random-based Strategy
      4.5.2.2 Reduct-based Strategy
      4.5.2.3 Collective-Performance-based Strategy
      4.5.2.4 Feature Set Partitioning

Summary: Researchers from disciplines such as pattern recognition, statistics, and machine learning have explored ensemble methodology since the late seventies, and, given the growing interest in the field, they now face a wide variety of methods. This book aims to impose a degree of order upon this diversity by presenting a coherent and unified repository of ensemble methods, theories, trends, challenges, and applications. It describes in detail the classical methods, as well as the extensions and novel approaches developed recently, along with algorithmic descriptions ...

Subjects: Pattern recognition systems. Algorithms. Machine learning. Electronic books.
Dewey: 621.389/28.
Author: Rokach, Lior.