
Record No.

UNISA996418253803316

Author

Zaslavski, Alexander J.

Title

Convex Optimization with Computational Errors [electronic resource] / by Alexander J. Zaslavski

Publication/distribution/printing

Cham : Springer International Publishing : Imprint: Springer, 2020

ISBN

3-030-37822-5

Edition

[1st ed. 2020.]

Physical description

1 online resource (XI, 360 p. 1 illus.)

Series

Springer Optimization and Its Applications, 1931-6828 ; 155

Discipline

519.3

Subjects

Calculus of variations

Computer mathematics

Calculus of Variations and Optimal Control; Optimization

Computational Mathematics and Numerical Analysis

Language of publication

English

Format

Printed material

Bibliographic level

Monograph

General notes

Includes index.

Contents note

Preface -- 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Continuous Subgradient Method -- 7. An Optimization Problem with a Composite Objective Function -- 8. A Zero-Sum Game with Two Players -- 9. PDA-Based Method for Convex Optimization -- 10. Minimization of Quasiconvex Functions -- 11. Minimization of Sharp Weakly Convex Functions -- 12. A Projected Subgradient Method for Nonsmooth Problems -- References -- Index.

Summary/abstract

This book studies approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are well known as important tools for solving optimization problems. The research presented continues from the author's 2016 book Numerical Optimization with Computational Errors. Both books study algorithms taking into account computational errors, which are always present in practice. The main goal is, for a known computational error, to determine an approximate solution and the number of iterations needed to reach it. The discussion takes into consideration that every iteration of an algorithm consists of several steps, and the computational errors of those steps are generally different. This fact, which was not accounted for in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: a calculation of a subgradient of the objective function and a calculation of a projection onto the feasible set. Each of these two steps has its own computational error, and the two errors are generally different. The book is of interest to researchers and engineers working in optimization. It can also be useful in preparatory courses for graduate students. Its main feature will appeal specifically to researchers and engineers working in optimization, as well as to experts in applications of optimization to engineering and economics.
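To make the two-step structure described in the summary concrete, here is a minimal Python sketch of an inexact subgradient projection iteration. It is an illustration only, not the author's implementation: the names subgradient_projection, delta_sub, and delta_proj, the toy objective, and the way errors are injected are all assumptions introduced here, chosen merely to show that the subgradient step and the projection step carry separate error bounds.

import numpy as np

def subgradient_projection(x0, subgrad, project, step, n_iters,
                           delta_sub=1e-3, delta_proj=1e-3, rng=None):
    # Inexact subgradient projection: each iteration has two steps,
    # each with its own (hypothetical) error bound, mirroring the two
    # distinct error sources discussed in the summary above.
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        # Step 1: compute a subgradient, perturbed by an error of norm delta_sub.
        g = subgrad(x)
        e1 = rng.standard_normal(x.shape)
        g = g + delta_sub * e1 / max(np.linalg.norm(e1), 1e-12)
        # Step 2: project the update, perturbed by an error of norm delta_proj.
        y = project(x - step(k) * g)
        e2 = rng.standard_normal(x.shape)
        x = y + delta_proj * e2 / max(np.linalg.norm(e2), 1e-12)
    return x

# Toy instance: minimize f(x) = ||x||_1 over the Euclidean unit ball.
subgrad = lambda x: np.sign(x)                       # a subgradient of the l1 norm
project = lambda y: y / max(np.linalg.norm(y), 1.0)  # projection onto the unit ball
x = subgradient_projection(np.array([2.0, -3.0]), subgrad, project,
                           step=lambda k: 1.0 / (k + 1), n_iters=200)
print(x)  # approaches the minimizer 0, up to the injected errors

The two error bounds are deliberately kept as separate parameters, reflecting the book's point that the error incurred when computing a subgradient and the error incurred when computing a projection are generally different.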