ABSTRACT
The application of machine learning (ML) models to the analysis of optimization algorithms requires representing optimization problems with numerical features. These features can serve as input to ML models trained to select or configure a suitable algorithm for the problem at hand. Since in pure black-box optimization information about the problem instance can be obtained only through function evaluations, a common approach is to dedicate part of the evaluation budget to feature extraction, e.g., via random sampling. This approach has two key downsides: (1) it reduces the budget left for the actual optimization phase, and (2) it neglects valuable information that could be obtained from the interaction between the problem and the solver.
In this paper, we propose a feature extraction method that describes the trajectories of optimization algorithms using simple descriptive statistics. We evaluate the generated features for the task of classifying problem classes from the Black Box Optimization Benchmarking (BBOB) suite. We demonstrate that the proposed DynamoRep features capture enough information to identify the problem class on which the optimization algorithm is running, achieving a mean classification accuracy of 95% across all experiments.
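The trajectory-based feature extraction described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes that each iteration's population coordinates and fitness values are summarized with min, max, mean, and standard deviation, and uses a toy random-search trajectory on the sphere function in place of a real optimizer run. The function name `dynamorep_features` is chosen here for illustration.

```python
import numpy as np

def dynamorep_features(population_history, fitness_history):
    """Describe an optimization trajectory with per-iteration statistics.

    For each iteration, the population's coordinates (per dimension) and
    fitness values are summarized with min, max, mean, and std. The exact
    set of statistics is an assumption of this sketch.
    """
    feats = []
    for pop, fit in zip(population_history, fitness_history):
        pop = np.asarray(pop)  # shape (pop_size, dim)
        fit = np.asarray(fit)  # shape (pop_size,)
        for stat in (np.min, np.max, np.mean, np.std):
            feats.extend(stat(pop, axis=0))  # one value per dimension
            feats.append(stat(fit))          # one value for the fitness
    return np.array(feats)

# Toy example: a short random-search "trajectory" on the sphere function.
rng = np.random.default_rng(0)
dim, pop_size, iters = 2, 10, 5
pops = [rng.uniform(-5, 5, size=(pop_size, dim)) for _ in range(iters)]
fits = [np.sum(p ** 2, axis=1) for p in pops]

features = dynamorep_features(pops, fits)
# 4 statistics x (dim coordinate values + 1 fitness value) per iteration
assert features.shape == (iters * 4 * (dim + 1),)
```

The resulting fixed-length vector can then be fed to a standard classifier (e.g., a random forest) to predict the problem class, which is the setup the abstract evaluates on BBOB.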
DynamoRep: Trajectory-Based Population Dynamics for Classification of Black-box Optimization Problems