Evaluating hyper-parameter tuning using random search in support vector machines for software effort estimation
Authors: | , , , , |
Format: | conference paper |
Publication date: | 2020 |
Description: | Studies in software effort estimation (SEE) have explored the use of hyper-parameter tuning for machine learning algorithms (MLA) to improve the accuracy of effort estimates. In other contexts, random search (RS) has shown results similar to grid search while being less computationally expensive. In this paper, we investigate to what extent the random search hyper-parameter tuning approach affects the accuracy and stability of support vector regression (SVR) in SEE. Results were compared to those obtained from ridge regression models and grid search-tuned models. A case study with four data sets extracted from the ISBSG 2018 repository shows that random search exhibits performance similar to grid search, rendering it an attractive alternative technique for hyper-parameter tuning. RS-tuned SVR achieved an increase of 0.227 in standardized accuracy (SA) with respect to default hyper-parameters. In addition, random search improved the prediction stability of SVR models to a minimum ratio of 0.840. The analysis showed that RS-tuned SVR attained performance equivalent to GS-tuned SVR. Future work includes extending this research to cover other hyper-parameter tuning approaches and machine learning algorithms, as well as using additional data sets. |
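The tuning procedure summarized in the description can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it assumes scikit-learn's `RandomizedSearchCV` and `GridSearchCV`, a hypothetical synthetic project data set, and arbitrarily chosen hyper-parameter ranges, and it shows how a random search and a grid search over SVR hyper-parameters could be compared using standardized accuracy (SA = 1 - MAR / MAR_p0, against random guessing).

```python
# Illustrative sketch only: hypothetical data and parameter ranges,
# not the paper's experimental setup. Requires numpy, scipy, scikit-learn.
import numpy as np
from scipy.stats import loguniform
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.RandomState(0)
X = rng.rand(200, 5)                           # hypothetical project features
y = 100 + 500 * X[:, 0] + 50 * rng.rand(200)   # hypothetical effort values
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

pipe = make_pipeline(StandardScaler(), SVR())

# Random search: sample a fixed budget of configurations from distributions.
rs = RandomizedSearchCV(
    pipe,
    param_distributions={
        "svr__C": loguniform(1e-1, 1e3),
        "svr__gamma": loguniform(1e-4, 1e1),
        "svr__epsilon": loguniform(1e-3, 1e0),
    },
    n_iter=60, cv=5, scoring="neg_mean_absolute_error", random_state=0,
).fit(X_tr, y_tr)

# Grid search: exhaustively evaluate every combination in a fixed grid.
gs = GridSearchCV(
    pipe,
    param_grid={
        "svr__C": [0.1, 1, 10, 100, 1000],
        "svr__gamma": [1e-4, 1e-3, 1e-2, 1e-1, 1],
        "svr__epsilon": [0.001, 0.01, 0.1, 1],
    },
    cv=5, scoring="neg_mean_absolute_error",
).fit(X_tr, y_tr)

def standardized_accuracy(y_true, y_pred, y_train, n_runs=1000, seed=0):
    """SA = 1 - MAR / MAR_p0, where MAR_p0 is the mean absolute residual
    of random guessing (predicting a randomly drawn training effort)."""
    mar = mean_absolute_error(y_true, y_pred)
    r = np.random.RandomState(seed)
    guesses = [mean_absolute_error(y_true, r.choice(y_train, size=len(y_true)))
               for _ in range(n_runs)]
    return 1 - mar / np.mean(guesses)

for name, search in [("random search", rs), ("grid search", gs)]:
    sa = standardized_accuracy(y_te, search.predict(X_te), y_tr)
    print(f"{name}: best params={search.best_params_}, SA={sa:.3f}")
```

On this kind of setup, random search evaluates a fixed number of sampled configurations (60 here) rather than the full 100-point grid, which is why it is typically cheaper; whether it matches grid-search accuracy on real SEE data is exactly the question the paper studies.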
Institution: | Universidad de Costa Rica |
Repository: | Kérwá |
Language: | English |
OAI Identifier: | oai:kerwa.ucr.ac.cr:10669/102203 |
Available online: | https://doi.org/10.1145/3416508.3417121 https://dl.acm.org/doi/10.1145/3416508.3417121 https://hdl.handle.net/10669/102203 |
Keywords: | software effort estimation; empirical study; support vector machines; hyper-parameter tuning; random search; grid search |