Evaluating hyper-parameter tuning using random search in support vector machines for software effort estimation
Authors: , , , ,
Medium: conference paper
Publication date: 2020
Description: Studies in software effort estimation (SEE) have explored the use of hyper-parameter tuning for machine learning algorithms (MLA) to improve the accuracy of effort estimates. In other contexts, random search (RS) has shown results similar to grid search (GS) while being less computationally expensive. In this paper, we investigate to what extent the random search hyper-parameter tuning approach affects the accuracy and stability of support vector regression (SVR) in SEE. Results were compared to those obtained from ridge regression models and grid search-tuned models. A case study with four data sets extracted from the ISBSG 2018 repository shows that random search exhibits performance similar to grid search, rendering it an attractive alternative technique for hyper-parameter tuning. RS-tuned SVR achieved an increase of 0.227 in standardized accuracy (SA) with respect to default hyper-parameters. In addition, random search improved the prediction stability of SVR models to a minimum ratio of 0.840. The analysis showed that RS-tuned SVR attained performance equivalent to GS-tuned SVR. Future work includes extending this research to cover other hyper-parameter tuning approaches and machine learning algorithms, as well as using additional data sets. (An illustrative tuning sketch follows this record.)
Country: Kérwá
Institution: Universidad de Costa Rica
Repository: Kérwá
Language: English
OAI Identifier: oai:kerwa.ucr.ac.cr:10669/102203
Online access: https://doi.org/10.1145/3416508.3417121 https://dl.acm.org/doi/10.1145/3416508.3417121 https://hdl.handle.net/10669/102203
Keywords: software effort estimation; empirical study; support vector machines; hyper-parameter tuning; random search; grid search
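
The abstract above describes random search tuning of SVR, compared against grid search and a ridge regression baseline and evaluated with standardized accuracy (SA). The sketch below is a minimal illustration of that workflow, not the authors' implementation: it assumes scikit-learn's `RandomizedSearchCV`, synthetic placeholder data in place of the ISBSG 2018 data sets, and hyper-parameter ranges chosen only for the example.

```python
# Illustrative sketch (assumptions: scikit-learn API, synthetic data, example ranges).
import numpy as np
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.RandomState(42)

# Placeholder data standing in for an ISBSG-style effort data set
# (features X, effort y); the real study uses four ISBSG 2018 subsets.
X = rng.rand(200, 5)
y = 1000 * X[:, 0] + 200 * rng.rand(200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Random search over SVR hyper-parameters (C, epsilon, gamma ranges are assumed).
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
param_distributions = {
    "svr__C": loguniform(1e-1, 1e3),
    "svr__epsilon": loguniform(1e-3, 1e1),
    "svr__gamma": loguniform(1e-4, 1e1),
}
search = RandomizedSearchCV(
    svr,
    param_distributions,
    n_iter=60,                        # budget of randomly sampled configurations
    scoring="neg_mean_absolute_error",
    cv=5,
    random_state=42,
)
search.fit(X_train, y_train)
mae_svr = mean_absolute_error(y_test, search.predict(X_test))

# Ridge regression baseline, as in the paper's comparison.
ridge = make_pipeline(StandardScaler(), Ridge(alpha=1.0)).fit(X_train, y_train)
mae_ridge = mean_absolute_error(y_test, ridge.predict(X_test))

# Standardized accuracy: SA = 1 - MAE_model / MAE_p0, where MAE_p0 is the
# expected error of guessing a randomly chosen training effort for each test case.
mae_p0 = np.mean(np.abs(y_test[:, None] - y_train[None, :]))
sa_svr = 1 - mae_svr / mae_p0
print(f"SVR MAE={mae_svr:.1f}  Ridge MAE={mae_ridge:.1f}  SA(SVR)={sa_svr:.3f}")
```

The SA figure printed by this toy example does not reproduce the 0.227 improvement or the 0.840 stability ratio reported in the abstract; those come from the paper's own data sets and experimental protocol.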