Title: Genetic Programming : 26th European Conference, EuroGP 2023, Held as Part of EvoStar 2023, Brno, Czech Republic, April 12–14, 2023, Proceedings / edited by Gisele Pappa, Mario Giacobini, Zdenek Vasicek
Publication: Cham : Springer Nature Switzerland : Imprint: Springer, 2023
Edition: 1st ed. 2023.
Physical description: 1 online resource (366 pages)
Discipline: 304.2
Topical subjects: Programming languages (Electronic computers)
System theory
Computer networks
Machine learning
Natural language processing (Computer science)
Programming Language
Complex Systems
Computer Communication Networks
Machine Learning
Natural Language Processing (NLP)
Persona (resp. second.): PappaGisele
GiacobiniMario
VašíčekZdeněk <1933->
Nota di bibliografia: Includes bibliographical references and index.
Contents note: Intro -- Preface -- Organization -- Contents -- Long Presentations -- A Self-Adaptive Approach to Exploit Topological Properties of Different GAs' Crossover Operators -- 1 Introduction -- 2 Fundamental Concepts -- 2.1 Crossover -- 2.2 Convex Combination, Convex Hull, and Convex Search -- 3 Related Works -- 4 Methodology -- 4.1 Dynamic Diversity Maintenance -- 4.2 Self-adaptive Crossover -- 5 Experimental Settings -- 6 Experimental Results -- 7 Conclusions -- References -- A Genetic Programming Encoder for Increasing Autoencoder Interpretability -- 1 Introduction -- 1.1 Structure -- 2 Background and Related Work -- 2.1 Non-linear Dimensionality Reduction -- 2.2 Evolutionary Computation for Dimensionality Reduction -- 2.3 Genetic Programming for Autoencoding -- 3 Proposed Method: GPE-AE -- 3.1 GP Representation of Encoder -- 3.2 Fitness Evaluation -- 3.3 Decoder Architecture -- 4 Experiment Design -- 4.1 Comparison Methods -- 4.2 Evaluation Measures -- 4.3 Datasets -- 5 Results -- 6 Further Analysis -- 7 Conclusions -- References -- Graph Networks as Inductive Bias for Genetic Programming: Symbolic Models for Particle-Laden Flows -- 1 Introduction -- 2 Background and Related Work -- 2.1 Genetic Programming in Physics Applications -- 2.2 Machine Learning for Particle-Laden Flows -- 3 Proposed Methods -- 3.1 Graph Networks -- 3.2 Genetic Programming -- 4 Experiment Design -- 4.1 Data Generation: Simulation of Particle-Laden Flows -- 4.2 Data Preprocessing -- 4.3 Algorithm Settings -- 5 Results and Analysis -- 5.1 Overall Algorithm Performance -- 5.2 Explainability of Equations -- 5.3 Validation of Symbolic Models -- 6 Conclusion and Future Work -- References -- Phenotype Search Trajectory Networks for Linear Genetic Programming -- 1 Introduction -- 2 The LGP System -- 2.1 Boolean LGP Algorithm -- 2.2 Genotype, Phenotype, and Fitness.
3 Kolmogorov Complexity -- 4 Sampling and Metrics Estimation -- 5 Search Trajectory Networks -- 5.1 General Definitions -- 5.2 The Proposed STN Models -- 5.3 Network Visualisation -- 5.4 Comparing Three Targets with Increasing Difficulty -- 6 Discussion -- References -- GPAM: Genetic Programming with Associative Memory -- 1 Introduction -- 2 Related Work -- 2.1 Symbolic Regression and Genetic Programming -- 2.2 Efficient Processing of DNNs -- 2.3 Weight Compression -- 3 Proposed Method -- 3.1 The GPAM Approach -- 3.2 GPAM for Weight Generation -- 4 Results for Symbolic Regression Benchmarks -- 4.1 Benchmarks -- 4.2 Setup -- 4.3 Memory Sizing -- 4.4 Role of Constants in GPAM -- 5 Results for Weight Generation -- 6 Discussion and Conclusions -- References -- MAP-Elites with Cosine-Similarity for Evolutionary Ensemble Learning -- 1 Introduction -- 2 Related Work -- 2.1 Semantic GP -- 2.2 GP-Based Ensemble Learning -- 2.3 Quality Diversity Optimization -- 3 The Proposed Ensemble Learning Algorithm -- 3.1 The Overall Framework -- 3.2 Angle-Based Dimensionality Reduction -- 3.3 Reference Semantic Points -- 4 Experiment Settings -- 4.1 Datasets -- 4.2 Experimental Protocol -- 4.3 Parameter Settings -- 4.4 Benchmark Dimensionality Reduction Methods -- 5 Experimental Results -- 5.1 Comparisons of MAP-Elites Using Different Dimensionality Reduction Methods -- 5.2 Impact of Using Reference Points -- 5.3 Comparison with Other Machine Learning and Symbolic Regression Methods -- 6 Conclusions -- References -- Small Solutions for Real-World Symbolic Regression Using Denoising Autoencoder Genetic Programming -- 1 Introduction -- 2 Related Work -- 3 Denoising Autoencoder LSTM -- 3.1 Model Building and Sampling -- 3.2 A New Denoising Strategy: Levenshtein Tree Edit -- 4 Experiments -- 4.1 Experimental Setup -- 4.2 Prediction Quality.
4.3 Analyzing the Search Behavior -- 5 Conclusions and Future Work -- References -- Context Matters: Adaptive Mutation for Grammars -- 1 Introduction -- 2 Background -- 2.1 Grammar-Based Genetic Programming -- 2.2 Adaptive Mutation Rate -- 2.3 Grammar-Design -- 3 Adaptive Facilitated Mutation -- 3.1 Grammar Design for Adaptive Facilitated Mutation -- 4 Experimental Setup -- 5 Results -- 6 Conclusion -- 6.1 Future Work -- References -- A Boosting Approach to Constructing an Ensemble Stack -- 1 Introduction -- 2 Related Work -- 3 Evolving an Ensemble Stack Using Boosting -- 3.1 The Boosting Ensemble Stack Algorithm -- 3.2 Evaluating an Ensemble Stack Post Training -- 3.3 Using an Extremely Large Number of Bins -- 4 Experimental Methodology -- 5 Results -- 5.1 Small Scale Classification Tasks -- 5.2 Large Scale Classification Task -- 6 Conclusion -- References -- Adaptive Batch Size CGP: Improving Accuracy and Runtime for CGP Logic Optimization Flow -- 1 Introduction -- 2 Cartesian Genetic Programming -- 2.1 Representation -- 2.2 Evolutionary Process -- 3 Methodology -- 3.1 Definitions -- 3.2 Adaptive Batch Size CGP -- 3.3 Experimental Protocol -- 4 Results -- 5 Conclusion -- References -- Faster Convergence with Lexicase Selection in Tree-Based Automated Machine Learning -- 1 Introduction -- 2 Related Work -- 3 Methods -- 3.1 Review of TPOT -- 3.2 Parent Selection Algorithms -- 4 Experimental Set-Up -- 4.1 Datasets -- 4.2 Implementation -- 4.3 Evaluating Convergence -- 4.4 Exploration of Pipelines -- 5 Results -- 5.1 DIGEN Datasets -- 5.2 ANGES Datasets -- 6 Discussion -- References -- Using FPGA Devices to Accelerate Tree-Based Genetic Programming: A Preliminary Exploration with Recent Technologies -- 1 Introduction -- 2 Related Work -- 3 Accelerator Architecture -- 3.1 Program Memory -- 3.2 Program Compiler -- 3.3 Program Evaluator.
4 Design of Experiments -- 4.1 Comparison Metrics -- 4.2 Primitive Sets -- 4.3 Program Generation -- 4.4 Fitness Cases -- 5 Results -- 6 Current Limitations and Potential Optimizations -- 6.1 Current Limitations -- 6.2 Potential Optimizations -- 7 Conclusion -- References -- Memetic Semantic Genetic Programming for Symbolic Regression -- 1 Introduction -- 2 Semantic GP -- 2.1 Library Building and Searching -- 3 Memetic Algorithms -- 4 Memetic Semantic for Symbolic Regression -- 4.1 Algorithm -- 4.2 Local Tree Improvement -- 5 Experimental Setup -- 6 Results -- 7 Related Work -- 8 Conclusion -- References -- Grammatical Evolution with Code2vec -- 1 Introduction -- 2 Background -- 2.1 Grammatical Evolution -- 2.2 Code2vec -- 3 Methods -- 3.1 ClusterBooster -- 3.2 ClusterSelection -- 4 Experiments -- 4.1 Benchmarks -- 4.2 Experimental Setup -- 4.3 Results -- 5 Conclusion -- References -- Short Presentations -- Domain-Aware Feature Learning with Grammar-Guided Genetic Programming -- 1 Introduction -- 2 Related Work -- 2.1 Genetic-Programming-Based Feature Learning -- 2.2 Domain-Aware Feature Learning and Aggregation Incorporation -- 3 Method -- 3.1 Domain Knowledge M3GP -- 3.2 Domain Knowledge and Aggregation M3GP -- 4 Evaluation -- 4.1 Datasets -- 4.2 Implementation Details -- 4.3 Experiment Details -- 5 Results -- 6 Conclusion -- References -- Genetic Improvement of LLVM Intermediate Representation -- 1 Introduction -- 2 Background -- 3 Mutating LLVM IR -- 3.1 Representation -- 3.2 LLVM IR define Functions -- 3.3 Mutable LLVM IR -- 3.4 Compiling C/C++ etc. to Generate LLVM IR -- 3.5 Selecting Which LLVM IR to Optimise -- 3.6 Deleting LLVM IR -- 4 Fitness Function -- 4.1 Test Cases for Google's OLC and Uber's H3: GB Post Codes -- 4.2 Counting Instructions with perf stat -e instructions -x, -- 4.3 Sandboxing to Prevent Running Mutations Causing Harm.
4.4 Timeouts to Stop Poor Mutants Delaying Search -- 4.5 Limiting Output Size to Avoid Filling Disk or Exceeding Disk Quota -- 5 Hillclimbing Search -- 6 Results -- 7 Discussion -- 7.1 Types of Improvement Found -- 7.2 Discussion: Future Work, Co-evolution, Perf, Fitness Landscape -- 8 Conclusions -- References -- Spatial Genetic Programming -- 1 Introduction -- 2 Related Literature -- 3 Spatial Genetic Programming -- 3.1 The Cost Function -- 3.2 Outputs, Termination Conditions and Model Execution -- 3.3 Evolution of Models and the Genetic Operators -- 3.4 Conditional Return Statements -- 4 Experiments and Results -- 4.1 Case Study: Classic Control Problems -- 4.2 Case Study: Custom Toy Problems -- 4.3 Impact of a Spatial Crossover on the Evolution of Programs -- 5 Conclusion -- References -- All You Need is Sex for Diversity -- 1 Introduction -- 2 The PIMP Approach -- 3 Motivation -- 4 Methodology -- 4.1 Measures -- 4.2 Statistical Tests -- 5 Results -- 6 Discussion and Additional Remarks -- 7 Conclusion -- References -- On the Effects of Collaborators Selection and Aggregation in Cooperative Coevolution: An Experimental Analysis -- 1 Introduction and Related Works -- 2 A General Scheme for CC -- 3 Case Studies -- 3.1 Toy Problems -- 3.2 Symbolic Regression -- 3.3 Neuroevolution -- 4 Experimental Analysis -- 4.1 Toy Problems -- 4.2 Symbolic Regression -- 4.3 Neuroevolution -- 5 Concluding Remarks -- References -- To Bias or Not to Bias: Probabilistic Initialisation for Evolving Dispatching Rules -- 1 Introduction -- 2 Background -- 2.1 Unrelated Machines Environment -- 2.2 Designing Dispatching Rules with Genetic Programming -- 3 Probabilistic Individual Initialisation -- 4 Experimental Analysis -- 4.1 Benchmark Setup -- 4.2 Results -- 5 Analysis -- 5.1 Node Probabilities -- 5.2 Method Ranking -- 6 Conclusion -- References.
MTGP: Combining Metamorphic Testing and Genetic Programming.
Summary/abstract: This book constitutes the refereed proceedings of the 26th European Conference on Genetic Programming, EuroGP 2023, held as part of EvoStar 2023 in Brno, Czech Republic, during April 12–14, 2023, and co-located with the EvoStar events EvoCOP, EvoMUSART, and EvoApplications. The 14 revised full papers and 8 short papers presented in this book were carefully reviewed and selected from 38 submissions. The wide range of topics in this volume reflects the current state of research in the field. The collection covers topics including the development of new variants of GP algorithms for both optimization and machine learning problems, as well as the use of GP to address complex real-world problems.
Authorized title: Genetic Programming
ISBN: 9783031295737
9783031295720
Format: Printed material
Bibliographic level: Monograph
Language of publication: English
Record no.: 9910683354403321
Available at: Univ. Federico II
Series: Lecture Notes in Computer Science, ISSN 1611-3349 ; 13986