Designed Experiments for Tuning Hyperparameters in Machine Learning Algorithms

Nov 24, 2020
Yongshi Deng
Abstract
Most machine learning algorithms involve tuning a set of hyperparameters to achieve better performance. The most widely used methodologies for hyperparameter tuning are exhaustive grid search, random search, and Bayesian optimisation. However, these methods can be time-consuming and may not achieve optimal outcomes, as interactions between hyperparameters are ignored. We propose an alternative method for tuning machine learning hyperparameters through supersaturated designs and response surface methodology (SDRSM). In this talk, we demonstrate our approach, SDRSM, and discuss the potential and limitations of applying the proposed method to hyperparameter search.
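
Since the abstract only names the approach, the sketch below shows, in general terms, what a designed-experiment tuning loop can look like: evaluate a small design over coded hyperparameter ranges, fit a second-order response surface to the cross-validated scores, and pick the setting the fitted surface predicts to be best. This is not the SDRSM procedure presented in the talk (which uses supersaturated designs for screening); the classifier, dataset, hyperparameters, and ranges below are assumptions chosen purely for illustration.

```python
# Illustrative sketch only: a generic designed-experiment / response-surface
# tuning loop, NOT the SDRSM procedure from the talk. The model, dataset,
# hyperparameters, and ranges are assumptions chosen for illustration.
import numpy as np
from itertools import product
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import PolynomialFeatures

X, y = load_breast_cancer(return_X_y=True)

# Two hyperparameters treated as experimental factors, each coded to [-1, +1].
factors = {"n_estimators": (50, 200), "max_depth": (3, 10)}

def decode(point):
    """Map coded factor levels in [-1, 1] back to hyperparameter values."""
    return {name: int(round(lo + (c + 1) / 2 * (hi - lo)))
            for c, (name, (lo, hi)) in zip(point, factors.items())}

# Face-centred central composite design in coded units:
# 2^2 factorial corners, axial points, and a centre run (9 runs total).
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                   [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)

# Run the design: each row is one cross-validated evaluation of the model.
responses = [cross_val_score(RandomForestClassifier(random_state=0, **decode(p)),
                             X, y, cv=5).mean() for p in design]

# Fit a second-order (quadratic) response surface to the observed scores.
poly = PolynomialFeatures(degree=2, include_bias=False)
surface = LinearRegression().fit(poly.fit_transform(design), responses)

# Predict the surface on a finer coded grid and report the best setting found.
grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=2)))
predicted = surface.predict(poly.transform(grid))
print("Predicted best hyperparameters:", decode(grid[int(np.argmax(predicted))]))
```

The appeal of this style of tuning is that a handful of planned runs supports an explicit model of how the score responds to each hyperparameter and their interactions, rather than relying on exhaustive or purely random evaluation.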
Date
Nov 24, 2020 1:00 PM — Nov 25, 2020 3:00 PM
Location

University of Auckland, Princes Street, Auckland, New Zealand