LEADER 05558nam 2200697 a 450
001 9910455562503321
005 20211202093958.0
010 $a1-282-75785-7
010 $a9786612757853
010 $a981-4271-07-1
035 $a(CKB)2490000000001739
035 $a(EBL)1679487
035 $a(OCoLC)859886714
035 $a(SSID)ssj0000424957
035 $a(PQKBManifestationID)11306088
035 $a(PQKBTitleCode)TC0000424957
035 $a(PQKBWorkID)10476709
035 $a(PQKB)10330674
035 $a(MiAaPQ)EBC1679487
035 $a(WSP)00000652
035 $a(Au-PeEL)EBL1679487
035 $a(CaPaEBR)ebr10422182
035 $a(CaONFJC)MIL275785
035 $a(EXLCZ)992490000000001739
100 $a20100520d2010 uy 0
101 0 $aeng
135 $aurcn|||||||||
181 $ctxt
182 $cc
183 $acr
200 10$aPattern classification using ensemble methods$b[electronic resource] /$fLior Rokach
210 $aSingapore ;$aHackensack, NJ $cWorld Scientific$dc2010
215 $a1 online resource (242 p.)
225 1 $aSeries in machine perception and artificial intelligence ;$vv. 75
300 $aDescription based upon print version of record.
311 $a981-4271-06-3
320 $aIncludes bibliographical references (p. 185-222) and index.
327 $aContents; Preface; 1. Introduction to Pattern Classification; 1.1 Pattern Classification; 1.2 Induction Algorithms; 1.3 Rule Induction; 1.4 Decision Trees; 1.5 Bayesian Methods; 1.5.1 Overview; 1.5.2 Naïve Bayes; 1.5.2.1 The Basic Naïve Bayes Classifier; 1.5.2.2 Naïve Bayes Induction for Numeric Attributes; 1.5.2.3 Correction to the Probability Estimation; 1.5.2.4 Laplace Correction; 1.5.2.5 No Match; 1.5.3 Other Bayesian Methods; 1.6 Other Induction Methods; 1.6.1 Neural Networks; 1.6.2 Genetic Algorithms; 1.6.3 Instance-based Learning; 1.6.4 Support Vector Machines
327 $a2. Introduction to Ensemble Learning; 2.1 Back to the Roots; 2.2 The Wisdom of Crowds; 2.3 The Bagging Algorithm; 2.4 The Boosting Algorithm; 2.5 The AdaBoost Algorithm; 2.6 No Free Lunch Theorem and Ensemble Learning; 2.7 Bias-Variance Decomposition and Ensemble Learning; 2.8 Occam's Razor and Ensemble Learning; 2.9 Classifier Dependency; 2.9.1 Dependent Methods; 2.9.1.1 Model-guided Instance Selection; 2.9.1.2 Basic Boosting Algorithms; 2.9.1.3 Advanced Boosting Algorithms; 2.9.1.4 Incremental Batch Learning; 2.9.2 Independent Methods; 2.9.2.1 Bagging; 2.9.2.2 Wagging
327 $a2.9.2.3 Random Forest and Random Subspace Projection; 2.9.2.4 Non-Linear Boosting Projection (NLBP); 2.9.2.5 Cross-validated Committees; 2.9.2.6 Robust Boosting; 2.10 Ensemble Methods for Advanced Classification Tasks; 2.10.1 Cost-Sensitive Classification; 2.10.2 Ensemble for Learning Concept Drift; 2.10.3 Reject Driven Classification; 3. Ensemble Classification; 3.1 Fusions Methods; 3.1.1 Weighting Methods; 3.1.2 Majority Voting; 3.1.3 Performance Weighting; 3.1.4 Distribution Summation; 3.1.5 Bayesian Combination; 3.1.6 Dempster-Shafer; 3.1.7 Vogging; 3.1.8 Naïve Bayes
327 $a3.1.9 Entropy Weighting; 3.1.10 Density-based Weighting; 3.1.11 DEA Weighting Method; 3.1.12 Logarithmic Opinion Pool; 3.1.13 Order Statistics; 3.2 Selecting Classification; 3.2.1 Partitioning the Instance Space; 3.2.1.1 The K-Means Algorithm as a Decomposition Tool; 3.2.1.2 Determining the Number of Subsets; 3.2.1.3 The Basic K-Classifier Algorithm; 3.2.1.4 The Heterogeneity Detecting K-Classifier (HDK-Classifier); 3.2.1.5 Running-Time Complexity; 3.3 Mixture of Experts and Meta Learning; 3.3.1 Stacking; 3.3.2 Arbiter Trees; 3.3.3 Combiner Trees; 3.3.4 Grading; 3.3.5 Gating Network
327 $a4. Ensemble Diversity; 4.1 Overview; 4.2 Manipulating the Inducer; 4.2.1 Manipulation of the Inducer's Parameters; 4.2.2 Starting Point in Hypothesis Space; 4.2.3 Hypothesis Space Traversal; 4.3 Manipulating the Training Samples; 4.3.1 Resampling; 4.3.2 Creation; 4.3.3 Partitioning; 4.4 Manipulating the Target Attribute Representation; 4.4.1 Label Switching; 4.5 Partitioning the Search Space; 4.5.1 Divide and Conquer; 4.5.2 Feature Subset-based Ensemble Methods; 4.5.2.1 Random-based Strategy; 4.5.2.2 Reduct-based Strategy; 4.5.2.3 Collective-Performance-based Strategy
327 $a4.5.2.4 Feature Set Partitioning
330 $aResearchers from various disciplines such as pattern recognition, statistics, and machine learning have explored the use of ensemble methodology since the late seventies. Given the growing interest in the field, they are faced with a wide variety of methods. This book aims to impose a degree of order upon this diversity by presenting a coherent and unified repository of ensemble methods, theories, trends, challenges and applications. It describes the classical methods in detail, as well as the extensions and novel approaches developed more recently. Along with algorithmic descriptions
410 0$aSeries in machine perception and artificial intelligence ;$vv. 75.
606 $aPattern recognition systems
606 $aAlgorithms
606 $aMachine learning
608 $aElectronic books.
615 0$aPattern recognition systems.
615 0$aAlgorithms.
615 0$aMachine learning.
676 $a621.389/28
700 $aRokach$b Lior$0620362
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910455562503321
996 $aPattern classification using ensemble methods$92100521
997 $aUNINA