Record no.

UNINA9911015870903321

Author

Yu Yang

Title

Derivative-Free Optimization : Theoretical Foundations, Algorithms, and Applications / by Yang Yu, Hong Qian, Yi-Qi Hu

Publication/distribution

Singapore : Springer Nature Singapore : Imprint: Springer, 2025

ISBN

981-9659-29-9

Edition

[1st ed. 2025.]

Physical description

1 online resource (288 pages)

Series

Machine Learning: Foundations, Methodologies, and Applications, 2730-9916

Other authors (persons)

Qian, Hong

Hu, Yi-Qi

Discipline

006.31

Subjects

Machine learning

Artificial intelligence

Machine Learning

Artificial Intelligence

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

Contents note

Introduction -- Preliminaries -- Framework -- Theoretical Foundation -- Basic Algorithm -- Optimization in Sequential Mode -- Optimization in High-Dimensional Search Space -- Optimization under Noise -- Optimization with Parallel Computing.

Summary/abstract

This book offers a pioneering exploration of classification-based derivative-free optimization (DFO), providing researchers and professionals in artificial intelligence, machine learning, AutoML, and optimization with a robust framework for addressing complex, large-scale problems where gradients are unavailable. By bridging theoretical foundations with practical implementations, it fills critical gaps in the field, making it an indispensable resource for both academic and industrial audiences. The book introduces innovative frameworks such as sampling-and-classification (SAC) and sampling-and-learning (SAL), which underpin cutting-edge algorithms like Racos and SRacos. These methods are designed to excel in challenging optimization scenarios, including high-dimensional search spaces, noisy environments, and parallel computing. A dedicated section on the ZOOpt toolbox provides practical tools for implementing these algorithms effectively.

The book's structure moves from foundational principles and algorithmic development to advanced topics and real-world applications, such as hyperparameter tuning, neural architecture search, and algorithm selection in AutoML. Readers will benefit from a comprehensive yet concise presentation of modern DFO methods, gaining theoretical insights and practical tools to enhance their research and problem-solving capabilities. A foundational understanding of machine learning, probability theory, and algorithms is recommended for readers to fully engage with the material.