Author: | Jensen, Richard |
Title: | Computational intelligence and feature selection : rough and fuzzy approaches / by Richard Jensen, Qiang Shen |
Publication: | Oxford : Wiley, 2008 |
Physical description: | 1 online resource (357 p.) |
Discipline: | 006.30151132 |
Topical subject: | Artificial intelligence - Mathematical models |
Set theory |
Other authors: | Shen, Qiang |
General notes: | Description based upon print version of record. |
Bibliography note: | Includes bibliographical references (p. 313-336) and index. |
Contents note: | PREFACE -- 1 THE IMPORTANCE OF FEATURE SELECTION -- 1.1. Knowledge Discovery -- 1.2. Feature Selection -- 1.2.1. The Task -- 1.2.2. The Benefits -- 1.3. Rough Sets -- 1.4. Applications -- 1.5. Structure -- 2 SET THEORY -- 2.1. Classical Set Theory -- 2.1.1. Definition -- 2.1.2. Subsets -- 2.1.3. Operators -- 2.2. Fuzzy Set Theory -- 2.2.1. Definition -- 2.2.2. Operators -- 2.2.3. Simple Example -- 2.2.4. Fuzzy Relations and Composition -- 2.2.5. Approximate Reasoning -- 2.2.6. Linguistic Hedges -- 2.2.7. Fuzzy Sets and Probability -- 2.3. Rough Set Theory -- 2.3.1. Information and Decision Systems -- 2.3.2. Indiscernibility -- 2.3.3. Lower and Upper Approximations -- 2.3.4. Positive, Negative, and Boundary Regions -- 2.3.5. Feature Dependency and Significance -- 2.3.6. Reducts -- 2.3.7. Discernibility Matrix -- 2.4. Fuzzy-Rough Set Theory -- 2.4.1. Fuzzy Equivalence Classes -- 2.4.2. Fuzzy-Rough Sets -- 2.4.3. Rough-Fuzzy Sets -- 2.4.4. Fuzzy-Rough Hybrids -- 2.5. Summary -- 3 CLASSIFICATION METHODS -- 3.1. Crisp Approaches -- 3.1.1. Rule Inducers -- 3.1.2. Decision Trees -- 3.1.3. Clustering -- 3.1.4. Naive Bayes -- 3.1.5. Inductive Logic Programming -- 3.2. Fuzzy Approaches -- 3.2.1. Lozowski's Method -- 3.2.2. Subsethood-Based Methods -- 3.2.3. Fuzzy Decision Trees -- 3.2.4. Evolutionary Approaches -- 3.3. Rulebase Optimization -- 3.3.1. Fuzzy Interpolation -- 3.3.2. Fuzzy Rule Optimization -- 3.4. Summary -- 4 DIMENSIONALITY REDUCTION -- 4.1. Transformation-Based Reduction -- 4.1.1. Linear Methods -- 4.1.2. Nonlinear Methods -- 4.2. Selection-Based Reduction -- 4.2.1. Filter Methods -- 4.2.2. Wrapper Methods -- 4.2.3. Genetic Approaches -- 4.2.4. Simulated Annealing Based Feature Selection -- 4.3. Summary -- 5 ROUGH SET BASED APPROACHES TO FEATURE SELECTION -- 5.1. Rough Set Attribute Reduction -- 5.1.1. Additional Search Strategies -- 5.1.2. Proof of QUICKREDUCT Monotonicity -- 5.2. RSAR Optimizations. |
5.2.1. Implementation Goals -- 5.2.2. Implementational Optimizations -- 5.3. Discernibility Matrix Based Approaches -- 5.3.1. Johnson Reducer -- 5.3.2. Compressibility Algorithm -- 5.4. Reduction with Variable Precision Rough Sets -- 5.5. Dynamic Reducts -- 5.6. Relative Dependency Method -- 5.7. Tolerance-Based Method -- 5.7.1. Similarity Measures -- 5.7.2. Approximations and Dependency -- 5.8. Combined Heuristic Method -- 5.9. Alternative Approaches -- 5.10. Comparison of Crisp Approaches -- 5.10.1. Dependency Degree Based Approaches -- 5.10.2. Discernibility Matrix Based Approaches -- 5.11. Summary -- 6 APPLICATIONS I: USE OF RSAR -- 6.1. Medical Image Classification -- 6.1.1. Problem Case -- 6.1.2. Neural Network Modeling -- 6.1.3. Results -- 6.2. Text Categorization -- 6.2.1. Problem Case -- 6.2.2. Metrics -- 6.2.3. Datasets Used -- 6.2.4. Dimensionality Reduction -- 6.2.5. Information Content of Rough Set Reducts -- 6.2.6. Comparative Study of TC Methodologies -- 6.2.7. Efficiency Considerations of RSAR -- 6.2.8. Generalization -- 6.3. Algae Estimation -- 6.3.1. Problem Case -- 6.3.2. Results -- 6.4. Other Applications -- 6.4.1. Prediction of Business Failure -- 6.4.2. Financial Investment -- 6.4.3. Bioinformatics and Medicine -- 6.4.4. Fault Diagnosis -- 6.4.5. Spacial and Meteorological Pattern Classification -- 6.4.6. Music and Acoustics -- 6.5. Summary -- 7 ROUGH AND FUZZY HYBRIDIZATION -- 7.1. Introduction -- 7.2. Theoretical Hybridization -- 7.3. Supervised Learning and Information Retrieval -- 7.4. Feature Selection -- 7.5. Unsupervised Learning and Clustering -- 7.6. Neurocomputing -- 7.7. Evolutionary and Genetic Algorithms -- 7.8. Summary -- 8 FUZZY-ROUGH FEATURE SELECTION -- 8.1. Feature Selection with Fuzzy-Rough Sets -- 8.2. Fuzzy-Rough Reduction Process -- 8.3. Fuzzy-Rough QuickReduct -- 8.4. Complexity Analysis -- 8.5. Worked Examples -- 8.5.1. Crisp Decisions -- 8.5.2. Fuzzy Decisions. |
8.6. Optimizations -- 8.7. Evaluating the Fuzzy-Rough Metric -- 8.7.1. Compared Metrics -- 8.7.2. Metric Comparison -- 8.7.3. Application to Financial Data -- 8.8. Summary -- 9 NEW DEVELOPMENTS OF FRFS -- 9.1. Introduction -- 9.2. New Fuzzy-Rough Feature Selection -- 9.2.1. Fuzzy Lower Approximation Based FS -- 9.2.2. Fuzzy Boundary Region Based FS -- 9.2.3. Fuzzy-Rough Reduction with Fuzzy Entropy -- 9.2.4. Fuzzy-Rough Reduction with Fuzzy Gain Ratio -- 9.2.5. Fuzzy Discernibility Matrix Based FS -- 9.2.6. Vaguely Quantified Rough Sets (VQRS) -- 9.3. Experimentation -- 9.3.1. Experimental Setup -- 9.3.2. Experimental Results -- 9.3.3. Fuzzy Entropy Experimentation -- 9.4. Proofs -- 9.5. Summary -- 10 FURTHER ADVANCED FS METHODS -- 10.1. Feature Grouping -- 10.1.1. Fuzzy Dependency -- 10.1.2. Scaled Dependency -- 10.1.3. The Feature Grouping Algorithm -- 10.1.4. Selection Strategies -- 10.1.5. Algorithmic Complexity -- 10.2. Ant Colony Optimization-Based Selection -- 10.2.1. Ant Colony Optimization -- 10.2.2. Traveling Salesman Problem -- 10.2.3. Ant-Based Feature Selection -- 10.3. Summary -- 11 APPLICATIONS II: WEB CONTENT CATEGORIZATION -- 11.1. Text Categorization -- 11.1.1. Rule-Based Classification -- 11.1.2. Vector-Based Classification -- 11.1.3. Latent Semantic Indexing -- 11.1.4. Probabilistic -- 11.1.5. Term Reduction -- 11.2. System Overview -- 11.3. Bookmark Classification -- 11.3.1. Existing Systems -- 11.3.2. Overview -- 11.3.3. Results -- 11.4. Web Site Classification -- 11.4.1. Existing Systems -- 11.4.2. Overview -- 11.4.3. Results -- 11.5. Summary -- 12 APPLICATIONS III: COMPLEX SYSTEMS MONITORING -- 12.1. The Application -- 12.1.1. Problem Case -- 12.1.2. Monitoring System -- 12.2. Experimental Results -- 12.2.1. Comparison with Unreduced Features -- 12.2.2. Comparison with Entropy-Based Feature Selection -- 12.2.3. Comparison with PCA and Random Reduction -- 12.2.4. Alternative Fuzzy Rule Inducer. |
12.2.5. Results with Feature Grouping -- 12.2.6. Results with Ant-Based FRFS -- 12.3. Summary -- 13 APPLICATIONS IV: ALGAE POPULATION ESTIMATION -- 13.1. Application Domain -- 13.1.1. Domain Description -- 13.1.2. Predictors -- 13.2. Experimentation -- 13.2.1. Impact of Feature Selection -- 13.2.2. Comparison with Relief -- 13.2.3. Comparison with Existing Work -- 13.3. Summary -- 14 APPLICATIONS V: FORENSIC GLASS ANALYSIS -- 14.1. Background -- 14.2. Estimation of Likelihood Ratio -- 14.2.1. Exponential Model -- 14.2.2. Biweight Kernel Estimation -- 14.2.3. Likelihood Ratio with Biweight and Boundary Kernels -- 14.2.4. Adaptive Kernel -- 14.3. Application -- 14.3.1. Fragment Elemental Analysis -- 14.3.2. Data Preparation -- 14.3.3. Feature Selection -- 14.3.4. Estimators -- 14.4. Experimentation -- 14.4.1. Feature Evaluation -- 14.4.2. Likelihood Ratio Estimation -- 14.5. Glass Classification -- 14.6. Summary -- 15 SUPPLEMENTARY DEVELOPMENTS AND INVESTIGATIONS -- 15.1. RSAR-SAT -- 15.1.1. Finding Rough Set Reducts -- 15.1.2. Preprocessing Clauses -- 15.1.3. Evaluation -- 15.2. Fuzzy-Rough Decision Trees -- 15.2.1. Explanation -- 15.2.2. Experimentation -- 15.3. Fuzzy-Rough Rule Induction -- 15.4. Hybrid Rule Induction -- 15.4.1. Hybrid Approach -- 15.4.2. Rule Search -- 15.4.3. Walkthrough -- 15.4.4. Experimentation -- 15.5. Fuzzy Universal Reducts -- 15.6. Fuzzy-Rough Clustering -- 15.6.1. Fuzzy-Rough c-Means -- 15.6.2. General Fuzzy-Rough Clustering -- 15.7. Fuzzification Optimization -- 15.8. Summary -- APPENDIX A: METRIC COMPARISON RESULTS: CLASSIFICATION DATASETS -- APPENDIX B: METRIC COMPARISON RESULTS: REGRESSION DATASETS -- REFERENCES -- INDEX. |
Summary/abstract: | The rough and fuzzy set approaches presented here open up many new frontiers for continued research and development. Computational Intelligence and Feature Selection provides readers with the background and fundamental ideas behind feature selection (FS), with an emphasis on techniques based on rough and fuzzy sets. For readers who are less familiar with the subject, the book begins with an introduction to fuzzy set theory and fuzzy-rough set theory. Building on this foundation, the book provides: a critical review of FS methods, with particular emphasis on their current limitations; program files implementing major algorithms, together with the necessary instructions and datasets, available on a related Web site; coverage of the background and fundamental ideas behind FS; a systematic presentation of the leading methods reviewed in a consistent algorithmic framework; real-world applications with worked examples that illustrate the power and efficacy of the FS approaches covered; and an investigation of the associated areas of FS, including rule induction and clustering methods using hybridizations of fuzzy and rough set theories. Computational Intelligence and Feature Selection is an ideal resource for advanced undergraduates, postgraduates, researchers, and professional engineers. However, its straightforward presentation of the underlying concepts makes the book meaningful to specialists and nonspecialists alike. |
Authorized title: | Computational intelligence and feature selection |
ISBN: | 1-281-83135-2 |
9786611831356 |
0-470-37788-7 |
0-470-37791-7 |
Format: | Printed material |
Bibliographic level: | Monograph |
Language of publication: | English |
Record no.: | 9910877811303321 |
Find it here: | Univ. Federico II |
Opac: | Check availability here |