LEADER 05676oam 2200757Ka 450
001 9910825516303321
005 20190503073403.0
010 $a0-262-30043-5
010 $a1-280-49922-2
010 $a9786613594457
010 $a0-262-30122-9
024 8 $a9786613594457
035 $a(CKB)2670000000174086
035 $a(EBL)3339422
035 $a(SSID)ssj0000652290
035 $a(PQKBManifestationID)11388984
035 $a(PQKBTitleCode)TC0000652290
035 $a(PQKBWorkID)10638752
035 $a(PQKB)10672516
035 $a(StDuBDS)EDZ0000155739
035 $a(CaBNVSL)mat06267539
035 $a(IDAMS)0b000064818b458f
035 $a(IEEE)6267539
035 $a(OCoLC)784949353$z(OCoLC)794489207$z(OCoLC)817078669$z(OCoLC)961629362$z(OCoLC)962696956$z(OCoLC)988440888$z(OCoLC)988450036$z(OCoLC)992079675$z(OCoLC)1037929255$z(OCoLC)1038618374$z(OCoLC)1055396331$z(OCoLC)1065925703$z(OCoLC)1081285003
035 $a(OCoLC-P)784949353
035 $a(MaCbMITP)8494
035 $a(Au-PeEL)EBL3339422
035 $a(CaPaEBR)ebr10547396
035 $a(CaONFJC)MIL359445
035 $a(OCoLC)784949353
035 $a(MiAaPQ)EBC3339422
035 $a(EXLCZ)992670000000174086
100 $a20120409d2012 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aMachine learning in non-stationary environments $eintroduction to covariate shift adaptation /$fMasashi Sugiyama and Motoaki Kawanabe
210 $aCambridge, Mass. $cMIT Press$d©2012
215 $a1 online resource (279 p.)
225 1 $aAdaptive computation and machine learning
300 $aDescription based upon print version of record.
311 $a0-262-01709-1
320 $aIncludes bibliographical references and index.
327 $aContents; Foreword; Preface; I INTRODUCTION; 1 Introduction and Problem Formulation; 1.1 Machine Learning under Covariate Shift; 1.2 Quick Tour of Covariate Shift Adaptation; 1.3 Problem Formulation; 1.4 Structure of This Book; II LEARNING UNDER COVARIATE SHIFT; 2 Function Approximation; 2.1 Importance-Weighting Techniques for Covariate Shift Adaptation; 2.2 Examples of Importance-Weighted Regression Methods; 2.3 Examples of Importance-Weighted Classification Methods; 2.4 Numerical Examples; 2.5 Summary and Discussion; 3 Model Selection; 3.1 Importance-Weighted Akaike Information Criterion
327 $a3.2 Importance-Weighted Subspace Information Criterion; 3.3 Importance-Weighted Cross-Validation; 3.4 Numerical Examples; 3.5 Summary and Discussion; 4 Importance Estimation; 4.1 Kernel Density Estimation; 4.2 Kernel Mean Matching; 4.3 Logistic Regression; 4.4 Kullback-Leibler Importance Estimation Procedure; 4.5 Least-Squares Importance Fitting; 4.6 Unconstrained Least-Squares Importance Fitting; 4.7 Numerical Examples; 4.8 Experimental Comparison; 4.9 Summary; 5 Direct Density-Ratio Estimation with Dimensionality Reduction; 5.1 Density Difference in Hetero-Distributional Subspace
327 $a5.2 Characterization of Hetero-Distributional Subspace; 5.3 Identifying Hetero-Distributional Subspace by Supervised Dimensionality Reduction; 5.4 Using LFDA for Finding Hetero-Distributional Subspace; 5.5 Density-Ratio Estimation in the Hetero-Distributional Subspace; 5.6 Numerical Examples; 5.7 Summary; 6 Relation to Sample Selection Bias; 6.1 Heckman's Sample Selection Model; 6.2 Distributional Change and Sample Selection Bias; 6.3 The Two-Step Algorithm; 6.4 Relation to Covariate Shift Approach; 7 Applications of Covariate Shift Adaptation; 7.1 Brain-Computer Interface
327 $a7.2 Speaker Identification; 7.3 Natural Language Processing; 7.4 Perceived Age Prediction from Face Images; 7.5 Human Activity Recognition from Accelerometric Data; 7.6 Sample Reuse in Reinforcement Learning; III LEARNING CAUSING COVARIATE SHIFT; 8 Active Learning; 8.1 Preliminaries; 8.2 Population-Based Active Learning Methods; 8.3 Numerical Examples of Population-Based Active Learning Methods; 8.4 Pool-Based Active Learning Methods; 8.5 Numerical Examples of Pool-Based Active Learning Methods; 8.6 Summary and Discussion; 9 Active Learning with Model Selection
327 $a9.1 Direct Approach and the Active Learning/Model Selection Dilemma; 9.2 Sequential Approach; 9.3 Batch Approach; 9.4 Ensemble Active Learning; 9.5 Numerical Examples; 9.6 Summary and Discussion; 10 Applications of Active Learning; 10.1 Design of Efficient Exploration Strategies in Reinforcement Learning; 10.2 Wafer Alignment in Semiconductor Exposure Apparatus; IV CONCLUSIONS; 11 Conclusions and Future Prospects; 11.1 Conclusions; 11.2 Future Prospects; Appendix: List of Symbols and Abbreviations; Bibliography; Index
330 8 $aThis volume focuses on a specific non-stationary environment known as covariate shift, in which the distribution of inputs (queries) changes but the conditional distribution of outputs (answers) remains unchanged, and presents machine learning theory, algorithms, and applications to overcome this variety of non-stationarity.
410 0$aAdaptive computation and machine learning.
606 $aMachine learning
610 $aCOMPUTER SCIENCE/Machine Learning & Neural Networks
610 $aCOMPUTER SCIENCE/General
610 $aCOMPUTER SCIENCE/Artificial Intelligence
615 0$aMachine learning.
676 $a006.3/1
700 $aSugiyama$b Masashi$f1974-$0847070
701 $aKawanabe$b Motoaki$01722992
801 0$bOCoLC-P
801 1$bOCoLC-P
906 $aBOOK
912 $a9910825516303321
996 $aMachine learning in non-stationary environments$94123879
997 $aUNINA