LEADER 04093nam 22005295 450
001 996418253803316
005 20200707021435.0
010 $a3-030-37822-5
024 7 $a10.1007/978-3-030-37822-6
035 $a(CKB)4100000010121967
035 $a(DE-He213)978-3-030-37822-6
035 $a(MiAaPQ)EBC6036797
035 $a(PPN)242845584
035 $a(EXLCZ)994100000010121967
100 $a20200131d2020 u| 0
101 0 $aeng
135 $aurnn|008mamaa
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aConvex Optimization with Computational Errors$b[electronic resource] /$fby Alexander J. Zaslavski
205 $a1st ed. 2020.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2020.
215 $a1 online resource (XI, 360 p. 1 illus.)
225 1 $aSpringer Optimization and Its Applications,$x1931-6828 ;$v155
300 $aIncludes index.
311 $a3-030-37821-7
327 $aPreface -- 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Continuous Subgradient Method -- 7. An Optimization Problem with a Composite Objective Function -- 8. A Zero-Sum Game with Two Players -- 9. PDA-Based Method for Convex Optimization -- 10. Minimization of Quasiconvex Functions -- 11. Minimization of Sharp Weakly Convex Functions -- 12. A Projected Subgradient Method for Nonsmooth Problems -- References -- Index.
330 $aThis book studies approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, algorithms that are well known as important tools for solving optimization problems. The research presented continues that of the author's 2016 book Numerical Optimization with Computational Errors. Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to obtain an approximate solution and the number of iterations needed to reach it. The discussion takes into account that each iteration of an algorithm consists of several steps and that the computational errors of these steps are generally different. This fact, which was not accounted for in the previous book, is important in practice. For example, an iteration of the subgradient projection algorithm consists of two steps: a calculation of a subgradient of the objective function and a calculation of a projection onto the feasible set. Each of these steps carries its own computational error, and the two errors are generally different. The book is of interest to researchers and engineers working in optimization, as well as to experts in applications of optimization to engineering and economics, and it can also be useful in graduate-level courses.
410 0$aSpringer Optimization and Its Applications,$x1931-6828 ;$v155
606 $aCalculus of variations
606 $aComputer mathematics
606 $aCalculus of Variations and Optimal Control; Optimization$3https://scigraph.springernature.com/ontologies/product-market-codes/M26016
606 $aComputational Mathematics and Numerical Analysis$3https://scigraph.springernature.com/ontologies/product-market-codes/M1400X
615 0$aCalculus of variations.
615 0$aComputer mathematics.
615 14$aCalculus of Variations and Optimal Control; Optimization.
615 24$aComputational Mathematics and Numerical Analysis.
676 $a519.3
700 $aZaslavski$bAlexander J$4aut$4http://id.loc.gov/vocabulary/relators/aut$0721713
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a996418253803316
996 $aConvex Optimization with Computational Errors$92359372
997 $aUNISA
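
Note: the abstract (field 330) describes an iteration built from two separately perturbed steps, a subgradient evaluation and a projection onto the feasible set. The following minimal Python sketch illustrates that structure; it is not code from the book, and the objective, the feasible set (a Euclidean ball), the step sizes, and the error levels delta1 and delta2 are all illustrative assumptions.

# Sketch of a subgradient projection iteration in which each of the two
# steps (subgradient evaluation, projection) carries its own computational
# error. Problem data and error model are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
b = np.array([2.0, -1.5, 0.5])

def f(x):
    # Convex, nonsmooth objective: l1 distance to b.
    return np.abs(x - b).sum()

def subgradient(x):
    # A valid subgradient of f at x (sign is a valid choice where x_i = b_i).
    return np.sign(x - b)

def project_ball(x, radius=1.0):
    # Exact Euclidean projection onto the ball of the given radius.
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def noisy(v, delta):
    # Model a computational error of magnitude delta added to a step's output.
    e = rng.standard_normal(v.shape)
    return v + delta * e / np.linalg.norm(e)

x = np.zeros(3)
delta1, delta2 = 1e-3, 1e-4  # step-specific error levels, generally different
for t in range(1, 201):
    g = noisy(subgradient(x), delta1)            # step 1: inexact subgradient
    x = noisy(project_ball(x - g / t), delta2)   # step 2: inexact projection

print(f"approximate minimizer: {x}, f(x) = {f(x):.4f}")

Setting delta1 = delta2 = 0 recovers the exact method; the error model simply adds a bounded perturbation after each step, mirroring the abstract's premise that each step of an iteration contributes its own computational error.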