Genetic programming is naturally suited to evolve bagging ensembles
Paper in proceedings, 2021

Learning ensembles by bagging can substantially improve the generalization performance of low-bias, high-variance estimators, including those evolved by Genetic Programming (GP). To be efficient, modern GP algorithms for evolving (bagging) ensembles typically rely on several (often inter-connected) mechanisms and their respective hyper-parameters, ultimately compromising ease of use. In this paper, we provide experimental evidence that such complexity might not be warranted. We show that minor changes to fitness evaluation and selection are sufficient to make a simple and otherwise-traditional GP algorithm evolve ensembles efficiently. The key to our proposal is to exploit the way bagging works to compute, for each individual in the population, multiple fitness values (instead of one) at a cost only marginally higher than that of a normal fitness evaluation. Experimental comparisons on classification and regression tasks taken and reproduced from prior studies show that our algorithm fares very well against state-of-the-art ensemble and non-ensemble GP algorithms. We further provide insights into the proposed approach by (i) scaling the ensemble size, (ii) ablating the changes to selection, and (iii) observing the evolvability induced by traditional subtree variation. Code: https://github.com/marcovirgolin/2SEGP.
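The abstract's central idea, obtaining one fitness value per bootstrap replicate from a single program evaluation, can be illustrated with a short sketch. The following is a minimal illustration of that idea and not the paper's actual implementation (see the linked repository for that); representing each bootstrap replicate by per-observation counts and using mean squared error as the loss are assumptions made here for concreteness.

```python
import numpy as np

def make_bag_counts(n_obs, n_bags, rng):
    """Represent each bootstrap replicate by how many times each
    training observation it contains (a vector of counts)."""
    idx = rng.integers(0, n_obs, size=(n_bags, n_obs))
    return np.stack([np.bincount(row, minlength=n_obs) for row in idx])

def fitness_per_bag(predictions, targets, bag_counts):
    """Compute one fitness value per bootstrap replicate from a single
    set of predictions: the program is executed once on the full
    training set, and each bag's loss is a count-weighted average of
    the same per-observation errors (here, squared errors)."""
    sq_err = (predictions - targets) ** 2   # shape: (n_obs,)
    weighted = bag_counts @ sq_err          # shape: (n_bags,)
    return weighted / bag_counts.sum(axis=1)

# Usage: evaluate a candidate program once, then score it on every bag.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = X[:, 0] + 0.5 * X[:, 1] ** 2
bags = make_bag_counts(n_obs=len(y), n_bags=10, rng=rng)
preds = X[:, 0] + 0.4 * X[:, 1] ** 2        # stand-in for a GP individual's output
print(fitness_per_bag(preds, y, bags))      # 10 fitness values, one per bag
```

Because the per-observation errors are computed only once, scoring an individual on all bags adds just a matrix-vector product on top of the usual evaluation, which is in line with the abstract's claim of a marginal overhead.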

Bagging

Evolutionary algorithms

Ensemble learning

Machine learning

Genetic programming

Author

Marco Virgolin

Chalmers, Mechanics and Maritime Sciences (M2), Vehicle Engineering and Autonomous Systems

GECCO 2021 - Proceedings of the 2021 Genetic and Evolutionary Computation Conference

830-839
9781450383509 (ISBN)

2021 Genetic and Evolutionary Computation Conference, GECCO 2021
Virtual, Online, France

Subject Categories

Evolutionary Biology

Probability Theory and Statistics

Computer Science

DOI

10.1145/3449639.3459278

More information

Latest update

8/13/2021