Simultaneous Ensemble Generation and Hyperparameter Optimization for Regression

David Roschewitz, Kurt Driessens, Pieter Collins

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review



The development of advanced hyperparameter optimization algorithms, such as those based on Bayesian optimization, has encouraged a departure from hand-tuning. This trend is primarily observed in classification tasks, while regression has received less attention. In this paper, we devise a method for simultaneously tuning hyperparameters and generating an ensemble, by explicitly optimizing parameters in an ensemble context. Techniques traditionally used for classification are adapted to suit regression problems, and we investigate the use of more robust loss functions. Furthermore, we propose methods for dynamically establishing the size of an ensemble and for weighting the individual models. Performance is evaluated using three base-learners and 16 datasets. We show that our algorithms consistently outperform single optimized models and can outperform or match the performance of state-of-the-art ensemble generation techniques.
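The core idea of the abstract, evaluating hyperparameter candidates in an ensemble context rather than keeping only the single best model, can be illustrated with a minimal sketch. This is not the authors' algorithm: it assumes ridge regression base learners whose penalty is the tuned hyperparameter, and uses Caruana-style greedy forward selection (with replacement) on validation loss, adapted to regression by averaging predictions and minimizing MSE. All names and data here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data, split into train and validation sets.
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=200)
X_tr, y_tr = X[:120], y[:120]
X_va, y_va = X[120:], y[120:]

def fit_ridge(X, y, alpha):
    # Closed-form ridge regression: solve (X^T X + alpha I) w = X^T y.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

# Sample hyperparameter candidates (here: the ridge penalty alpha)
# and cache each candidate model's validation predictions.
alphas = 10.0 ** rng.uniform(-4, 2, size=20)
preds = [X_va @ fit_ridge(X_tr, y_tr, a) for a in alphas]

# Greedy ensemble selection: repeatedly add the candidate (with
# replacement) whose inclusion most reduces the ensemble's validation
# MSE; stop when no candidate improves it. The stopping rule gives a
# dynamically determined ensemble size, and repeated selection of a
# model acts as an implicit weighting.
chosen = []
ens_pred = np.zeros_like(y_va)
best_loss = np.inf
for _ in range(10):
    losses = [mse((ens_pred * len(chosen) + p) / (len(chosen) + 1), y_va)
              for p in preds]
    i = int(np.argmin(losses))
    if chosen and losses[i] >= best_loss:
        break  # no candidate improves the ensemble any further
    chosen.append(i)
    ens_pred = (ens_pred * (len(chosen) - 1) + preds[i]) / len(chosen)
    best_loss = losses[i]

single_best = min(mse(p, y_va) for p in preds)
print(f"single best MSE: {single_best:.4f}, ensemble MSE: {best_loss:.4f}")
```

Because the first greedy step picks the single best validation model and later steps only accept improvements, the resulting ensemble's validation loss never exceeds that of the best individual model.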
Original language: English
Title of host publication: Artificial Intelligence
Subtitle of host publication: BNAIC 2017
Editors: B. Verheij, M. Wiering
ISBN (Electronic): 978-3-319-76892-2
ISBN (Print): 978-3-319-76891-5
Publication status: Published - 2017

Publication series

Series: Communications in Computer and Information Science


Keywords

  • Bayesian optimization
  • Hyperparameter optimization
  • Ensemble generation
  • Regression
