A Survey on Hyperparameter Optimization in Machine Learning 


Vol. 48, No. 6, pp. 733-747, Jun. 2023
DOI: 10.7840/kics.2023.48.6.733


  Abstract

Machine learning (ML) has recently attracted attention in various fields, and the performance of an ML model depends strongly on its hyperparameters. Accordingly, it is important to find the optimal hyperparameters to improve performance. However, hyperparameter optimization (HPO) is difficult because of the characteristics of its objective functions and decision variables, and many studies on HPO algorithms have been conducted to address these difficulties. In this paper, we analyze the difficulties of HPO in ML and present the trends of recent studies on HPO. In addition, we present a future direction for HPO in ML to further improve the performance of ML models.
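To illustrate the setting the survey addresses, the sketch below shows HPO as black-box optimization via random search; it is not taken from the paper, and the scikit-learn model, search space, and evaluation budget are illustrative assumptions.

# Minimal random-search HPO sketch (illustrative assumption, not from the survey).
# Requires numpy and scikit-learn.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)

best_score, best_params = -np.inf, None
for _ in range(20):  # evaluation budget: each trial is a full cross-validation
    # Sample mixed-type decision variables: continuous (C, gamma) and categorical (kernel)
    params = {
        "C": 10 ** rng.uniform(-2, 2),
        "gamma": 10 ** rng.uniform(-4, 0),
        "kernel": str(rng.choice(["rbf", "poly"])),
    }
    # The objective is an expensive, gradient-free black box: train and validate the model
    score = cross_val_score(SVC(**params), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, params

print(f"best CV accuracy: {best_score:.3f} with {best_params}")

The example also hints at the difficulties the survey analyzes: each objective evaluation requires training a model, no gradients are available, and the search space mixes continuous and categorical variables.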



  Cite this article

[IEEE Style]

J. Won, J. Shin, J. Kim, J. Lee, "A Survey on Hyperparameter Optimization in Machine Learning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 6, pp. 733-747, 2023. DOI: 10.7840/kics.2023.48.6.733.

[ACM Style]

Jonghyeon Won, Jongmin Shin, Jae-Ho Kim, and Jang-Won Lee. 2023. A Survey on Hyperparameter Optimization in Machine Learning. The Journal of Korean Institute of Communications and Information Sciences, 48, 6, (2023), 733-747. DOI: 10.7840/kics.2023.48.6.733.

[KICS Style]

Jonghyeon Won, Jongmin Shin, Jae-Ho Kim, Jang-Won Lee, "A Survey on Hyperparameter Optimization in Machine Learning," The Journal of Korean Institute of Communications and Information Sciences, vol. 48, no. 6, pp. 733-747, 6. 2023. (https://doi.org/10.7840/kics.2023.48.6.733)