LEADER 05854nam 22007815 450
001 996465562403316
005 20210216211331.0
010 $a3-540-45014-9
024 7 $a10.1007/3-540-45014-9
035 $a(CKB)1000000000211272
035 $a(SSID)ssj0000325034
035 $a(PQKBManifestationID)11234405
035 $a(PQKBTitleCode)TC0000325034
035 $a(PQKBWorkID)10320833
035 $a(PQKB)10355186
035 $a(DE-He213)978-3-540-45014-6
035 $a(MiAaPQ)EBC3072852
035 $a(PPN)155193597
035 $a(EXLCZ)991000000000211272
100 $a20121227d2000 u| 0
101 0 $aeng
135 $aurnn#008mamaa
181 $ctxt
182 $cc
183 $acr
200 10$aMultiple Classifier Systems$b[electronic resource] $eFirst International Workshop, MCS 2000 Cagliari, Italy, June 21-23, 2000 Proceedings /$fedited by Josef Kittler, Fabio Roli
205 $a1st ed. 2000.
210 1$aBerlin, Heidelberg :$cSpringer Berlin Heidelberg :$cImprint: Springer,$d2000.
215 $a1 online resource (XII, 408 p.)
225 1 $aLecture Notes in Computer Science,$x0302-9743 ;$v1857
300 $aBibliographic Level Mode of Issuance: Monograph
311 $a3-540-67704-6
320 $aIncludes bibliographical references at the end of each chapter and index.
327 $aEnsemble Methods in Machine Learning -- Experiments with Classifier Combining Rules -- The “Test and Select” Approach to Ensemble Combination -- A Survey of Sequential Combination of Word Recognizers in Handwritten Phrase Recognition at CEDAR -- Multiple Classifier Combination Methodologies for Different Output Levels -- A Mathematically Rigorous Foundation for Supervised Learning -- Classifier Combinations: Implementations and Theoretical Issues -- Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification -- Complexity of Classification Problems and Comparative Advantages of Combined Classifiers -- Effectiveness of Error Correcting Output Codes in Multiclass Learning Problems -- Combining Fisher Linear Discriminants for Dissimilarity Representations -- A Learning Method of Feature Selection for Rough Classification -- Analysis of a Fusion Method for Combining Marginal Classifiers -- A hybrid projection based and radial basis function architecture -- Combining Multiple Classifiers in Probabilistic Neural Networks -- Supervised Classifier Combination through Generalized Additive Multi-model -- Dynamic Classifier Selection -- Boosting in Linear Discriminant Analysis -- Different Ways of Weakening Decision Trees and Their Impact on Classification Accuracy of DT Combination -- Applying Boosting to Similarity Literals for Time Series Classification -- Boosting of Tree-Based Classifiers for Predictive Risk Modeling in GIS -- A New Evaluation Method for Expert Combination in Multi-expert System Designing -- Diversity between Neural Networks and Decision Trees for Building Multiple Classifier Systems -- Self-Organizing Decomposition of Functions -- Classifier Instability and Partitioning -- A Hierarchical Multiclassifier System for Hyperspectral Data Analysis -- Consensus Based Classification of Multisource Remote Sensing Data -- Combining Parametric and Nonparametric Classifiers for an Unsupervised Updating of Land-Cover Maps -- A Multiple Self-Organizing Map Scheme for Remote Sensing Classification -- Use of Lexicon Density in Evaluating Word Recognizers -- A Multi-expert System for Dynamic Signature Verification -- A Cascaded Multiple Expert System for Verification -- Architecture for Classifier Combination Using Entropy Measures -- Combining Fingerprint Classifiers -- Statistical Sensor Calibration for Fusion of Different Classifiers in a Biometric Person Recognition Framework -- A Modular Neuro-Fuzzy Network for Musical Instruments
    Classification -- Classifier Combination for Grammar-Guided Sentence Recognition -- Shape Matching and Extraction by an Array of Figure-and-Ground Classifiers.
410 0$aLecture Notes in Computer Science,$x0302-9743 ;$v1857
606 $aEnsemble learning (Machine learning)
606 $aPattern recognition
606 $aArtificial intelligence
606 $aOptical data processing
606 $aAlgorithms
606 $aComputers
606 $aPattern Recognition$3https://scigraph.springernature.com/ontologies/product-market-codes/I2203X
606 $aArtificial Intelligence$3https://scigraph.springernature.com/ontologies/product-market-codes/I21000
606 $aImage Processing and Computer Vision$3https://scigraph.springernature.com/ontologies/product-market-codes/I22021
606 $aAlgorithm Analysis and Problem Complexity$3https://scigraph.springernature.com/ontologies/product-market-codes/I16021
606 $aComputation by Abstract Devices$3https://scigraph.springernature.com/ontologies/product-market-codes/I16013
615 0$aEnsemble learning (Machine learning)
615 0$aPattern recognition.
615 0$aArtificial intelligence.
615 0$aOptical data processing.
615 0$aAlgorithms.
615 0$aComputers.
615 14$aPattern Recognition.
615 24$aArtificial Intelligence.
615 24$aImage Processing and Computer Vision.
615 24$aAlgorithm Analysis and Problem Complexity.
615 24$aComputation by Abstract Devices.
676 $a006.3/1
702 $aKittler$b Josef$4edt$4http://id.loc.gov/vocabulary/relators/edt
702 $aRoli$b Fabio$4edt$4http://id.loc.gov/vocabulary/relators/edt
712 12$aInternational Workshop on Multiple Classifier Systems$d(1st :$f2000 :$eCagliari, Italy)
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a996465562403316
996 $aMultiple Classifier Systems$9772217
997 $aUNISA

LEADER 04665nam 22007095 450
001 9910746282903321
005 20250604142951.0
010 $a9783031358517
010 $a3031358511
024 7 $a10.1007/978-3-031-35851-7
035 $a(CKB)28270280000041
035 $a(MiAaPQ)EBC30749666
035 $a(Au-PeEL)EBL30749666
035 $a(OCoLC)1399170835
035 $a(DE-He213)978-3-031-35851-7
035 $a(EXLCZ)9928270280000041
100 $a20230919d2023 u| 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aStatistical Learning in Genetics $eAn Introduction Using R /$fby Daniel Sorensen
205 $a1st ed. 2023.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2023.
215 $a1 online resource (696 pages)
225 1 $aStatistics for Biology and Health,$x2197-5671
311 1 $a9783031358500
330 $aThis book provides an introduction to computer-based methods for the analysis of genomic data. Breakthroughs in molecular and computational biology have contributed to the emergence of vast data sets, where millions of genetic markers for each individual are coupled with medical records, generating an unparalleled resource for linking human genetic variation to human biology and disease. Similar developments have taken place in animal and plant breeding, where genetic marker information is combined with production traits. An important task for the statistical geneticist is to adapt, construct and implement models that can extract information from these large-scale data. An initial step is to understand the methodology that underlies the probability models and to learn the modern computer-intensive methods required for fitting these models. The objective of this book, suitable for readers who wish to develop analytic skills to perform genomic research, is to provide guidance to take this first step.
    This book is addressed to numerate biologists who typically lack the formal mathematical background of the professional statistician. For this reason, considerably more detail is offered in explanations and derivations. It is written in a concise style and examples are used profusely. A large proportion of the examples involve programming with the open-source package R, and the R code needed to solve the exercises is provided. The Markdown interface allows students to implement the code on their own computers, contributing to a better understanding of the underlying theory. Part I presents methods of inference based on likelihood and Bayesian approaches, including computational techniques for fitting likelihood and Bayesian models. Part II discusses prediction for continuous and binary data using both frequentist and Bayesian approaches. Some of the models used for prediction are also used for gene discovery, where the challenge is to find promising genes without incurring a large proportion of false positive results; Part II therefore includes a detour on False Discovery Rate from both frequentist and Bayesian perspectives. The last chapter of Part II provides an overview of a selected number of non-parametric methods. Part III consists of exercises and their solutions. Daniel Sorensen holds PhD and DSc degrees from the University of Edinburgh and is an elected Fellow of the American Statistical Association. He was professor of Statistical Genetics at Aarhus University, where he is now professor emeritus.
410 0$aStatistics for Biology and Health,$x2197-5671
606 $aStatistics
606 $aQuantitative research
606 $aBiometry
606 $aGenetics
606 $aStatistical Theory and Methods
606 $aData Analysis and Big Data
606 $aBiostatistics
606 $aGenetics
606 $aGenètica$2thub
606 $aEstadística matemàtica$2thub
606 $aR (Llenguatge de programació)$2thub
608 $aLlibres electrònics$2thub
615 0$aStatistics.
615 0$aQuantitative research.
615 0$aBiometry.
615 0$aGenetics.
615 14$aStatistical Theory and Methods.
615 24$aData Analysis and Big Data.
615 24$aBiostatistics.
615 24$aGenetics.
615 7$aGenètica
615 7$aEstadística matemàtica
615 7$aR (Llenguatge de programació)
676 $a576.5015195
700 $aSorensen$b Daniel$01429691
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910746282903321
996 $aStatistical Learning in Genetics$93568944
997 $aUNINA