LEADER 03309nam 22005775 450
001 9910299491003321
005 20251202152642.0
010 $a9783319034225
010 $a3319034227
024 7 $a10.1007/978-3-319-03422-5
035 $a(OCoLC)877106574
035 $a(MiFhGG)GVRL6XBD
035 $a(CKB)3710000000075061
035 $a(MiAaPQ)EBC1593091
035 $a(MiFhGG)9783319034225
035 $a(DE-He213)978-3-319-03422-5
035 $a(EXLCZ)993710000000075061
100 $a20131204d2014 u| 0
101 0 $aeng
135 $aurun|---uuuua
181 $ctxt
182 $cc
183 $acr
200 12$aA Brief Introduction to Continuous Evolutionary Optimization /$fby Oliver Kramer
205 $a1st ed. 2014.
210 1$aCham :$cSpringer International Publishing :$cImprint: Springer,$d2014.
215 $a1 online resource (xi, 94 pages) $cillustrations (some color)
225 1 $aSpringerBriefs in Computational Intelligence,$x2625-3712
300 $a"ISSN: 2191-530X."
311 08$a9783319034218
311 08$a3319034219
320 $aIncludes bibliographical references and index.
327 $aPart I Foundations -- Part II Advanced Optimization -- Part III Learning -- Part IV Appendix.
330 $aPractical optimization problems are often hard to solve, in particular when they are black boxes and no further information about the problem is available except via function evaluations. This work introduces a collection of heuristics and algorithms for black box optimization with evolutionary algorithms in continuous solution spaces. The book gives an introduction to evolution strategies and parameter control. Heuristic extensions are presented that allow optimization in constrained, multimodal, and multi-objective solution spaces. An adaptive penalty function is introduced for constrained optimization. Meta-models reduce the number of fitness and constraint function calls in expensive optimization problems. The hybridization of evolution strategies with local search allows fast optimization in solution spaces with many local optima. A selection operator based on reference lines in objective space is introduced to optimize multiple conflicting objectives. Evolutionary search is employed for learning kernel parameters of the Nadaraya-Watson estimator, and a swarm-based iterative approach is presented for optimizing latent points in dimensionality reduction problems. Experiments on typical benchmark problems as well as numerous figures and diagrams illustrate the behavior of the introduced concepts and methods.
410 0$aSpringerBriefs in Computational Intelligence,$x2625-3712
606 $aComputational intelligence
606 $aArtificial intelligence
606 $aComputational Intelligence
606 $aArtificial Intelligence
615 0$aComputational intelligence.
615 0$aArtificial intelligence.
615 14$aComputational Intelligence.
615 24$aArtificial Intelligence.
676 $a006.3
700 $aKramer$b Oliver$4aut$4http://id.loc.gov/vocabulary/relators/aut$0761919
801 0$bMiFhGG
801 1$bMiFhGG
906 $aBOOK
912 $a9910299491003321
996 $aA Brief Introduction to Continuous Evolutionary Optimization$91951234
997 $aUNINA