
Record no.

UNISA996466497103316

Author

Catoni, Olivier

Title

Statistical Learning Theory and Stochastic Optimization [electronic resource] : Ecole d'Eté de Probabilités de Saint-Flour XXXI - 2001 / by Olivier Catoni ; edited by Jean Picard

Publication/distribution/printing

Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer, 2004

ISBN

3-540-44507-2

Edition

[1st ed. 2004.]

Physical description

1 online resource (VIII, 284 p.)

Series

École d'Été de Probabilités de Saint-Flour, 0721-5363 ; 1851

Discipline

519.5

Subjects

Probabilities

Statistics 

Mathematical optimization

Artificial intelligence

Information theory

Numerical analysis

Probability Theory and Stochastic Processes

Statistical Theory and Methods

Optimization

Artificial Intelligence

Information and Communication, Circuits

Numerical Analysis

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Bibliographic Level Mode of Issuance: Monograph

Bibliography note

Includes bibliographical references and index.

Contents note

Universal Lossless Data Compression -- Links Between Data Compression and Statistical Estimation -- Non Cumulated Mean Risk -- Gibbs Estimators -- Randomized Estimators and Empirical Complexity -- Deviation Inequalities -- Markov Chains with Exponential Transitions -- References -- Index.

Summary/abstract

Statistical learning theory is aimed at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea to use, as is often done in practice, a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms of common use in computing estimators. The author focuses on non-asymptotic bounds of the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.