
Record no.

UNINA9910825516303321

Author

Sugiyama Masashi <1974->

Title

Machine learning in non-stationary environments : introduction to covariate shift adaptation / Masashi Sugiyama and Motoaki Kawanabe

Publication/distribution

Cambridge, Mass. : MIT Press, ©2012

ISBN

0-262-30043-5

1-280-49922-2

9786613594457

0-262-30122-9

Physical description

1 online resource (279 p.)

Series

Adaptive computation and machine learning

Other authors (persons)

Kawanabe, Motoaki

Classification (Dewey)

006.3/1

Subjects

Machine learning

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Description based upon print version of record.

Bibliography note

Includes bibliographical references and index.

Contents note

Contents; Foreword; Preface; I INTRODUCTION; 1 Introduction and Problem Formulation; 1.1 Machine Learning under Covariate Shift; 1.2 Quick Tour of Covariate Shift Adaptation; 1.3 Problem Formulation; 1.4 Structure of This Book

II LEARNING UNDER COVARIATE SHIFT; 2 Function Approximation; 2.1 Importance-Weighting Techniques for Covariate Shift Adaptation; 2.2 Examples of Importance-Weighted Regression Methods; 2.3 Examples of Importance-Weighted Classification Methods; 2.4 Numerical Examples; 2.5 Summary and Discussion; 3 Model Selection; 3.1 Importance-Weighted Akaike Information Criterion; 3.2 Importance-Weighted Subspace Information Criterion; 3.3 Importance-Weighted Cross-Validation; 3.4 Numerical Examples; 3.5 Summary and Discussion; 4 Importance Estimation; 4.1 Kernel Density Estimation; 4.2 Kernel Mean Matching; 4.3 Logistic Regression; 4.4 Kullback-Leibler Importance Estimation Procedure; 4.5 Least-Squares Importance Fitting; 4.6 Unconstrained Least-Squares Importance Fitting; 4.7 Numerical Examples; 4.8 Experimental Comparison; 4.9 Summary; 5 Direct Density-Ratio Estimation with Dimensionality Reduction; 5.1 Density Difference in Hetero-Distributional Subspace; 5.2 Characterization of Hetero-Distributional Subspace; 5.3 Identifying Hetero-Distributional Subspace by Supervised Dimensionality Reduction; 5.4 Using LFDA for Finding Hetero-Distributional Subspace; 5.5 Density-Ratio Estimation in the Hetero-Distributional Subspace; 5.6 Numerical Examples; 5.7 Summary; 6 Relation to Sample Selection Bias; 6.1 Heckman's Sample Selection Model; 6.2 Distributional Change and Sample Selection Bias; 6.3 The Two-Step Algorithm; 6.4 Relation to Covariate Shift Approach; 7 Applications of Covariate Shift Adaptation; 7.1 Brain-Computer Interface; 7.2 Speaker Identification; 7.3 Natural Language Processing; 7.4 Perceived Age Prediction from Face Images; 7.5 Human Activity Recognition from Accelerometric Data; 7.6 Sample Reuse in Reinforcement Learning

III LEARNING CAUSING COVARIATE SHIFT; 8 Active Learning; 8.1 Preliminaries; 8.2 Population-Based Active Learning Methods; 8.3 Numerical Examples of Population-Based Active Learning Methods; 8.4 Pool-Based Active Learning Methods; 8.5 Numerical Examples of Pool-Based Active Learning Methods; 8.6 Summary and Discussion; 9 Active Learning with Model Selection; 9.1 Direct Approach and the Active Learning/Model Selection Dilemma; 9.2 Sequential Approach; 9.3 Batch Approach; 9.4 Ensemble Active Learning; 9.5 Numerical Examples; 9.6 Summary and Discussion; 10 Applications of Active Learning; 10.1 Design of Efficient Exploration Strategies in Reinforcement Learning; 10.2 Wafer Alignment in Semiconductor Exposure Apparatus

IV CONCLUSIONS; 11 Conclusions and Future Prospects; 11.1 Conclusions; 11.2 Future Prospects; Appendix: List of Symbols and Abbreviations; Bibliography; Index

Summary/abstract

This volume focuses on a specific non-stationary environment known as covariate shift, in which the distribution of inputs (queries) changes but the conditional distribution of outputs (answers) is unchanged, and presents machine learning theory, algorithms, and applications to overcome this variety of non-stationarity.
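The core idea the summary describes, importance weighting, can be sketched in a few lines. The setup below is a hypothetical toy, not taken from the book: training and test inputs follow two different Gaussians that are assumed known (in practice the density ratio must itself be estimated), and a line is fit to a sinusoidal target by ordinary versus importance-weighted least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariate shift toy (illustrative setup, not from the book): training
# and test inputs follow different Gaussians, while the input-output
# relation f(x) is the same in both domains.
n_tr, n_te = 200, 1000
mu_tr, mu_te, sigma = 1.0, 2.0, 0.5
x_tr = rng.normal(mu_tr, sigma, n_tr)   # training inputs ~ p_tr(x)
x_te = rng.normal(mu_te, sigma, n_te)   # test inputs     ~ p_te(x)

f = np.sin                              # shared regression function
y_tr = f(x_tr) + 0.1 * rng.standard_normal(n_tr)

def gauss_pdf(x, mu, s):
    return np.exp(-(x - mu) ** 2 / (2 * s ** 2)) / (s * np.sqrt(2 * np.pi))

# Importance weights w(x) = p_te(x) / p_tr(x); here both densities are
# assumed known, whereas estimating this ratio from samples is itself a
# central topic of the book.
w = gauss_pdf(x_tr, mu_te, sigma) / gauss_pdf(x_tr, mu_tr, sigma)

# Fit a straight line by ordinary least squares (OLS) and by
# importance-weighted least squares (normal equations X'WX theta = X'Wy).
X = np.column_stack([np.ones(n_tr), x_tr])
theta_ols = np.linalg.lstsq(X, y_tr, rcond=None)[0]
theta_iw = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y_tr))

# Test-domain error of each fitted line against the noise-free function.
X_te = np.column_stack([np.ones(n_te), x_te])
err_ols = np.mean((X_te @ theta_ols - f(x_te)) ** 2)
err_iw = np.mean((X_te @ theta_iw - f(x_te)) ** 2)
print(f"test MSE  OLS: {err_ols:.3f}  importance-weighted: {err_iw:.3f}")
```

The weighting tilts the fit toward the region where test inputs actually fall; with the densities unknown, the book's direct density-ratio estimators (e.g. KLIEP, least-squares importance fitting) would replace the closed-form `w` above.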