Introduction to statistical machine learning / Masashi Sugiyama
Author Sugiyama Masashi <1974->
Publication/distribution Amsterdam : Elsevier, [2016]
Physical description 1 online resource (535 p.)
Discipline 006.3/1
Topical subject Machine learning - Statistical methods
Information science - Statistical methods
Pattern recognition systems
ISBN 0-12-802350-3
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Front Cover; Introduction to Statistical Machine Learning; Copyright; Table of Contents; Biography; Preface; 1 INTRODUCTION; 1 Statistical Machine Learning; 1.1 Types of Learning; 1.2 Examples of Machine Learning Tasks; 1.2.1 Supervised Learning; 1.2.2 Unsupervised Learning; 1.2.3 Further Topics; 1.3 Structure of This Textbook; 2 STATISTICS AND PROBABILITY; 2 Random Variables and Probability Distributions; 2.1 Mathematical Preliminaries; 2.2 Probability; 2.3 Random Variable and Probability Distribution; 2.4 Properties of Probability Distributions; 2.4.1 Expectation, Median, and Mode;
2.4.2 Variance and Standard Deviation; 2.4.3 Skewness, Kurtosis, and Moments; 2.5 Transformation of Random Variables; 3 Examples of Discrete Probability Distributions; 3.1 Discrete Uniform Distribution; 3.2 Binomial Distribution; 3.3 Hypergeometric Distribution; 3.4 Poisson Distribution; 3.5 Negative Binomial Distribution; 3.6 Geometric Distribution; 4 Examples of Continuous Probability Distributions; 4.1 Continuous Uniform Distribution; 4.2 Normal Distribution; 4.3 Gamma Distribution, Exponential Distribution, and Chi-Squared Distribution; 4.4 Beta Distribution;
4.5 Cauchy Distribution and Laplace Distribution; 4.6 t-Distribution and F-Distribution; 5 Multidimensional Probability Distributions; 5.1 Joint Probability Distribution; 5.2 Conditional Probability Distribution; 5.3 Contingency Table; 5.4 Bayes' Theorem; 5.5 Covariance and Correlation; 5.6 Independence; 6 Examples of Multidimensional Probability Distributions; 6.1 Multinomial Distribution; 6.2 Multivariate Normal Distribution; 6.3 Dirichlet Distribution; 6.4 Wishart Distribution; 7 Sum of Independent Random Variables; 7.1 Convolution; 7.2 Reproductive Property; 7.3 Law of Large Numbers;
7.4 Central Limit Theorem; 8 Probability Inequalities; 8.1 Union Bound; 8.2 Inequalities for Probabilities; 8.2.1 Markov's Inequality and Chernoff's Inequality; 8.2.2 Cantelli's Inequality and Chebyshev's Inequality; 8.3 Inequalities for Expectation; 8.3.1 Jensen's Inequality; 8.3.2 Hölder's Inequality and Schwarz's Inequality; 8.3.3 Minkowski's Inequality; 8.3.4 Kantorovich's Inequality; 8.4 Inequalities for the Sum of Independent Random Variables; 8.4.1 Chebyshev's Inequality and Chernoff's Inequality; 8.4.2 Hoeffding's Inequality and Bernstein's Inequality; 8.4.3 Bennett's Inequality;
9 Statistical Estimation; 9.1 Fundamentals of Statistical Estimation; 9.2 Point Estimation; 9.2.1 Parametric Density Estimation; 9.2.2 Nonparametric Density Estimation; 9.2.3 Regression and Classification; 9.2.4 Model Selection; 9.3 Interval Estimation; 9.3.1 Interval Estimation for Expectation of Normal Samples; 9.3.2 Bootstrap Confidence Interval; 9.3.3 Bayesian Credible Interval; 10 Hypothesis Testing; 10.1 Fundamentals of Hypothesis Testing; 10.2 Test for Expectation of Normal Samples; 10.3 Neyman-Pearson Lemma; 10.4 Test for Contingency Tables;
10.5 Test for Difference in Expectations of Normal Samples
Record no. UNINA-9910583088403321
Available at: Univ. Federico II
Machine learning in non-stationary environments : introduction to covariate shift adaptation / Masashi Sugiyama and Motoaki Kawanabe
Author Sugiyama Masashi <1974->
Publication/distribution Cambridge, Mass. : MIT Press, ©2012
Physical description 1 online resource (279 p.)
Discipline 006.3/1
Other authors (persons) Kawanabe, Motoaki
Series Adaptive computation and machine learning
Topical subject Machine learning
Uncontrolled subject COMPUTER SCIENCE/Machine Learning & Neural Networks
COMPUTER SCIENCE/General
COMPUTER SCIENCE/Artificial Intelligence
ISBN 0-262-30043-5
1-280-49922-2
9786613594457
0-262-30122-9
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Contents; Foreword; Preface; I INTRODUCTION; 1 Introduction and Problem Formulation; 1.1 Machine Learning under Covariate Shift; 1.2 Quick Tour of Covariate Shift Adaptation; 1.3 Problem Formulation; 1.4 Structure of This Book; II LEARNING UNDER COVARIATE SHIFT; 2 Function Approximation; 2.1 Importance-Weighting Techniques for Covariate Shift Adaptation; 2.2 Examples of Importance-Weighted Regression Methods; 2.3 Examples of Importance-Weighted Classification Methods; 2.4 Numerical Examples; 2.5 Summary and Discussion; 3 Model Selection; 3.1 Importance-Weighted Akaike Information Criterion;
3.2 Importance-Weighted Subspace Information Criterion; 3.3 Importance-Weighted Cross-Validation; 3.4 Numerical Examples; 3.5 Summary and Discussion; 4 Importance Estimation; 4.1 Kernel Density Estimation; 4.2 Kernel Mean Matching; 4.3 Logistic Regression; 4.4 Kullback-Leibler Importance Estimation Procedure; 4.5 Least-Squares Importance Fitting; 4.6 Unconstrained Least-Squares Importance Fitting; 4.7 Numerical Examples; 4.8 Experimental Comparison; 4.9 Summary; 5 Direct Density-Ratio Estimation with Dimensionality Reduction; 5.1 Density Difference in Hetero-Distributional Subspace;
5.2 Characterization of Hetero-Distributional Subspace; 5.3 Identifying Hetero-Distributional Subspace by Supervised Dimensionality Reduction; 5.4 Using LFDA for Finding Hetero-Distributional Subspace; 5.5 Density-Ratio Estimation in the Hetero-Distributional Subspace; 5.6 Numerical Examples; 5.7 Summary; 6 Relation to Sample Selection Bias; 6.1 Heckman's Sample Selection Model; 6.2 Distributional Change and Sample Selection Bias; 6.3 The Two-Step Algorithm; 6.4 Relation to Covariate Shift Approach; 7 Applications of Covariate Shift Adaptation; 7.1 Brain-Computer Interface;
7.2 Speaker Identification; 7.3 Natural Language Processing; 7.4 Perceived Age Prediction from Face Images; 7.5 Human Activity Recognition from Accelerometric Data; 7.6 Sample Reuse in Reinforcement Learning; III LEARNING CAUSING COVARIATE SHIFT; 8 Active Learning; 8.1 Preliminaries; 8.2 Population-Based Active Learning Methods; 8.3 Numerical Examples of Population-Based Active Learning Methods; 8.4 Pool-Based Active Learning Methods; 8.5 Numerical Examples of Pool-Based Active Learning Methods; 8.6 Summary and Discussion; 9 Active Learning with Model Selection;
9.1 Direct Approach and the Active Learning/Model Selection Dilemma; 9.2 Sequential Approach; 9.3 Batch Approach; 9.4 Ensemble Active Learning; 9.5 Numerical Examples; 9.6 Summary and Discussion; 10 Applications of Active Learning; 10.1 Design of Efficient Exploration Strategies in Reinforcement Learning; 10.2 Wafer Alignment in Semiconductor Exposure Apparatus; IV CONCLUSIONS; 11 Conclusions and Future Prospects; 11.1 Conclusions; 11.2 Future Prospects; Appendix: List of Symbols and Abbreviations; Bibliography; Index
Record no. UNINA-9910789928103321
Available at: Univ. Federico II
Machine learning in non-stationary environments : introduction to covariate shift adaptation / Masashi Sugiyama and Motoaki Kawanabe
Author Sugiyama Masashi <1974->
Publication/distribution Cambridge, Mass. : MIT Press, ©2012
Physical description 1 online resource (279 p.)
Discipline 006.3/1
Other authors (persons) Kawanabe, Motoaki
Series Adaptive computation and machine learning
Topical subject Machine learning
Uncontrolled subject COMPUTER SCIENCE/Machine Learning & Neural Networks
COMPUTER SCIENCE/General
COMPUTER SCIENCE/Artificial Intelligence
ISBN 0-262-30043-5
1-280-49922-2
9786613594457
0-262-30122-9
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Contents; Foreword; Preface; I INTRODUCTION; 1 Introduction and Problem Formulation; 1.1 Machine Learning under Covariate Shift; 1.2 Quick Tour of Covariate Shift Adaptation; 1.3 Problem Formulation; 1.4 Structure of This Book; II LEARNING UNDER COVARIATE SHIFT; 2 Function Approximation; 2.1 Importance-Weighting Techniques for Covariate Shift Adaptation; 2.2 Examples of Importance-Weighted Regression Methods; 2.3 Examples of Importance-Weighted Classification Methods; 2.4 Numerical Examples; 2.5 Summary and Discussion; 3 Model Selection; 3.1 Importance-Weighted Akaike Information Criterion;
3.2 Importance-Weighted Subspace Information Criterion; 3.3 Importance-Weighted Cross-Validation; 3.4 Numerical Examples; 3.5 Summary and Discussion; 4 Importance Estimation; 4.1 Kernel Density Estimation; 4.2 Kernel Mean Matching; 4.3 Logistic Regression; 4.4 Kullback-Leibler Importance Estimation Procedure; 4.5 Least-Squares Importance Fitting; 4.6 Unconstrained Least-Squares Importance Fitting; 4.7 Numerical Examples; 4.8 Experimental Comparison; 4.9 Summary; 5 Direct Density-Ratio Estimation with Dimensionality Reduction; 5.1 Density Difference in Hetero-Distributional Subspace;
5.2 Characterization of Hetero-Distributional Subspace; 5.3 Identifying Hetero-Distributional Subspace by Supervised Dimensionality Reduction; 5.4 Using LFDA for Finding Hetero-Distributional Subspace; 5.5 Density-Ratio Estimation in the Hetero-Distributional Subspace; 5.6 Numerical Examples; 5.7 Summary; 6 Relation to Sample Selection Bias; 6.1 Heckman's Sample Selection Model; 6.2 Distributional Change and Sample Selection Bias; 6.3 The Two-Step Algorithm; 6.4 Relation to Covariate Shift Approach; 7 Applications of Covariate Shift Adaptation; 7.1 Brain-Computer Interface;
7.2 Speaker Identification; 7.3 Natural Language Processing; 7.4 Perceived Age Prediction from Face Images; 7.5 Human Activity Recognition from Accelerometric Data; 7.6 Sample Reuse in Reinforcement Learning; III LEARNING CAUSING COVARIATE SHIFT; 8 Active Learning; 8.1 Preliminaries; 8.2 Population-Based Active Learning Methods; 8.3 Numerical Examples of Population-Based Active Learning Methods; 8.4 Pool-Based Active Learning Methods; 8.5 Numerical Examples of Pool-Based Active Learning Methods; 8.6 Summary and Discussion; 9 Active Learning with Model Selection;
9.1 Direct Approach and the Active Learning/Model Selection Dilemma; 9.2 Sequential Approach; 9.3 Batch Approach; 9.4 Ensemble Active Learning; 9.5 Numerical Examples; 9.6 Summary and Discussion; 10 Applications of Active Learning; 10.1 Design of Efficient Exploration Strategies in Reinforcement Learning; 10.2 Wafer Alignment in Semiconductor Exposure Apparatus; IV CONCLUSIONS; 11 Conclusions and Future Prospects; 11.1 Conclusions; 11.2 Future Prospects; Appendix: List of Symbols and Abbreviations; Bibliography; Index
Record no. UNINA-9910825516303321
Available at: Univ. Federico II