LEADER 04352nam 22006975 450 001 9910254076803321 005 20220404185945.0 010 $a3-319-30921-8 024 7 $a10.1007/978-3-319-30921-7 035 $a(CKB)3710000000649214 035 $a(EBL)4512615 035 $a(OCoLC)947837426 035 $a(SSID)ssj0001666027 035 $a(PQKBManifestationID)16455287 035 $a(PQKBTitleCode)TC0001666027 035 $a(PQKBWorkID)15000202 035 $a(PQKB)10019850 035 $a(DE-He213)978-3-319-30921-7 035 $a(MiAaPQ)EBC4512615 035 $a(PPN)193444682 035 $a(EXLCZ)993710000000649214 100 $a20160422d2016 u| 0 101 0 $aeng 135 $aurcn#nnn||||| 181 $ctxt$2rdacontent 182 $cc$2rdamedia 183 $acr$2rdacarrier 200 10$aNumerical optimization with computational errors /$fby Alexander J. Zaslavski 205 $a1st ed. 2016. 210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2016. 215 $a1 online resource (ix, 304 pages) 225 1 $aSpringer Optimization and Its Applications,$x1931-6828 ;$v108 300 $aBibliographic Level Mode of Issuance: Monograph 311 $a3-319-30920-X 320 $aIncludes bibliographical references and index. 327 $a1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Weiszfeld's Method -- 7. The Extragradient Method for Convex Optimization -- 8. A Projected Subgradient Method for Nonsmooth Problems -- 9. Proximal Point Method in Hilbert Spaces -- 10. Proximal Point Methods in Metric Spaces -- 11. Maximal Monotone Operators and the Proximal Point Algorithm -- 12. The Extragradient Method for Solving Variational Inequalities -- 13. A Common Solution of a Family of Variational Inequalities -- 14. Continuous Subgradient Method -- 15. Penalty Methods -- 16. Newton's Method -- References -- Index. 330 $aThis book studies the approximate solutions of optimization problems in the presence of computational errors. 
A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods and Newton's method. 410 0$aSpringer Optimization and Its Applications,$x1931-6828 ;$v108 606 $aCalculus of variations 606 $aNumerical analysis 606 $aOperations research 606 $aManagement science 606 $aCalculus of Variations and Optimal Control; Optimization$3https://scigraph.springernature.com/ontologies/product-market-codes/M26016 606 $aNumerical Analysis$3https://scigraph.springernature.com/ontologies/product-market-codes/M14050 606 $aOperations Research, Management Science$3https://scigraph.springernature.com/ontologies/product-market-codes/M26024 615 0$aCalculus of variations. 615 0$aNumerical analysis. 615 0$aOperations research. 615 0$aManagement science. 615 14$aCalculus of Variations and Optimal Control; Optimization. 615 24$aNumerical Analysis. 615 24$aOperations Research, Management Science. 676 $a519.3 700 $aZaslavski$b Alexander J$4aut$4http://id.loc.gov/vocabulary/relators/aut$0721713 801 0$bMiAaPQ 801 1$bMiAaPQ 801 2$bMiAaPQ 906 $aBOOK 912 $a9910254076803321 996 $aNumerical optimization with computational errors$91523527 997 $aUNINA