Record ID: 996418253803316 (UNISA)
Title: Convex Optimization with Computational Errors [electronic resource] / by Alexander J. Zaslavski
Edition: 1st ed. 2020.
Published: Cham : Springer International Publishing : Imprint: Springer, 2020.
Description: 1 online resource (XI, 360 p., 1 illus.)
Series: Springer Optimization and Its Applications, 1931-6828 ; 155
Note: Includes index.
ISBN: 3-030-37822-5 (electronic) ; 3-030-37821-7 (print)
DOI: 10.1007/978-3-030-37822-6
Identifiers: (CKB)4100000010121967 ; (DE-He213)978-3-030-37822-6 ; (MiAaPQ)EBC6036797 ; (PPN)242845584 ; (EXLCZ)994100000010121967

Contents: Preface -- 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Continuous Subgradient Method -- 7. An Optimization Problem with a Composite Objective Function -- 8. A Zero-Sum Game with Two Players -- 9. PDA-Based Method for Convex Optimization -- 10. Minimization of Quasiconvex Functions -- 11. Minimization of Sharp Weakly Convex Functions -- 12. A Projected Subgradient Method for Nonsmooth Problems -- References -- Index.

Summary: This book studies approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are well known as important tools for solving optimization problems. The research presented continues the author's 2016 book Numerical Optimization with Computational Errors. Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to obtain an approximate solution and the number of iterations needed to reach it. The discussion takes into consideration that each iteration of an algorithm consists of several steps, and that the computational errors of the various steps are generally different; this fact, which was not accounted for in the previous book, is important in practice. For example, the subgradient projection algorithm consists of two steps: a calculation of a subgradient of the objective function and a calculation of a projection onto the feasible set. Each of these steps carries its own computational error, and the two errors are generally different. The book is of interest to researchers and engineers working in optimization, as well as to experts in applications of optimization to engineering and economics, and can also be useful in preparatory courses for graduate students.

Subjects: Calculus of variations ; Computer mathematics ; Calculus of Variations and Optimal Control; Optimization (https://scigraph.springernature.com/ontologies/product-market-codes/M26016) ; Computational Mathematics and Numerical Analysis (https://scigraph.springernature.com/ontologies/product-market-codes/M1400X)
Dewey classification: 519.3
Author: Zaslavski, Alexander J., author (http://id.loc.gov/vocabulary/relators/aut)
Cataloging source: MiAaPQ
Record type: BOOK
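
Illustration: the two-step structure described in the summary (a subgradient evaluation followed by a projection, each step carrying its own computational error) can be sketched in a few lines of Python. This sketch is not taken from the book; the function names, the additive bounded-norm error model, and the l1-norm test problem are illustrative assumptions.

    import numpy as np

    def _random_unit(rng, n):
        # A random direction of unit Euclidean norm, used to model a
        # worst-case-bounded perturbation of a computed quantity.
        v = rng.standard_normal(n)
        return v / np.linalg.norm(v)

    def subgradient_projection(subgrad, project, x0, step_sizes,
                               delta_subgrad=0.0, delta_proj=0.0,
                               n_iter=100, rng=None):
        # Subgradient projection iteration with two per-step error bounds:
        # delta_subgrad bounds the error in the subgradient evaluation,
        # delta_proj bounds the error in the projection onto the feasible set.
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float)
        for t in range(n_iter):
            # Step 1: inexact subgradient of the objective at x.
            g = subgrad(x) + delta_subgrad * _random_unit(rng, x.size)
            # Step 2: inexact projection of the update onto the feasible set.
            x = project(x - step_sizes[t] * g) + delta_proj * _random_unit(rng, x.size)
        return x

    # Hypothetical usage: minimize f(x) = ||x - c||_1 over the unit ball.
    c = np.array([2.0, -1.0])
    f_subgrad = lambda x: np.sign(x - c)                   # a subgradient of ||x - c||_1
    ball_proj = lambda y: y / max(1.0, np.linalg.norm(y))  # projection onto the unit ball
    steps = [1.0 / np.sqrt(t + 1) for t in range(200)]
    x_approx = subgradient_projection(f_subgrad, ball_proj, np.zeros(2), steps,
                                      delta_subgrad=1e-3, delta_proj=1e-3, n_iter=200)

The point of keeping delta_subgrad and delta_proj as separate parameters is exactly the one made in the summary: the errors of the two steps are generally different, so they must be tracked separately when estimating the achievable accuracy and the number of iterations needed.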