An improved grid search algorithm to optimize SVR for prediction

Bibliographic Details
Published in: Soft Computing (Berlin, Germany), Vol. 25, No. 7, pp. 5633–5644
Main Authors: Sun, Yuting; Ding, Shifei; Zhang, Zichen; Jia, Weikuan
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.04.2021
ISSN: 1432-7643, 1433-7479
DOI: 10.1007/s00500-020-05560-w

Summary: Parameter optimization is an important step for support vector regression (SVR), since its prediction performance depends strongly on the values of the related parameters. To overcome the shortcomings of the traditional grid search algorithm, such as large invalid search ranges and sensitivity to the search step, an improved grid search algorithm is proposed to optimize SVR for prediction. The improved grid search (IGS) algorithm optimizes the penalty parameter and kernel function parameter of SVR by automatically adjusting the search range and step over several iterations, after which SVR is trained with the optimal solution. The effectiveness of the method is demonstrated by predicting soil and plant analyzer development (SPAD) values in rice leaves. To predict SPAD values more quickly and accurately, dimension reduction methods such as stepwise multiple linear regression (SMLR) and principal component analysis (PCA) are applied to the training data, and the results show that the nonlinear fitting and prediction accuracy of SMLR-IGS-SVR and PCA-IGS-SVR are better than those of IGS-SVR.
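
The paper's own implementation is not reproduced in this record, but the core idea described in the summary, shrinking the (C, gamma) search range and step around the best point found in each pass, can be sketched with scikit-learn. The search ranges, number of rounds, grid size, and cross-validation scoring below are illustrative assumptions, not the authors' settings.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    def igs_svr(X, y, n_rounds=3, grid_size=5, cv=5):
        """Coarse-to-fine grid search over SVR's penalty C and RBF gamma.

        A minimal sketch of the iterative-refinement idea; parameters are
        searched as exponents of 2, and the grid is re-centered and shrunk
        around the best point after each round (assumed ranges and settings).
        """
        c_lo, c_hi = -5.0, 15.0      # C = 2**c_exp, assumed initial range
        g_lo, g_hi = -15.0, 3.0      # gamma = 2**g_exp, assumed initial range
        best_c, best_g, best_score = 0.0, 0.0, -np.inf
        for _ in range(n_rounds):
            for c_exp in np.linspace(c_lo, c_hi, grid_size):
                for g_exp in np.linspace(g_lo, g_hi, grid_size):
                    model = SVR(kernel="rbf", C=2.0 ** c_exp, gamma=2.0 ** g_exp)
                    score = cross_val_score(
                        model, X, y, cv=cv, scoring="neg_mean_squared_error"
                    ).mean()
                    if score > best_score:
                        best_c, best_g, best_score = c_exp, g_exp, score
            # Shrink the search range (and hence the step) around the best point.
            c_step = (c_hi - c_lo) / (grid_size - 1)
            g_step = (g_hi - g_lo) / (grid_size - 1)
            c_lo, c_hi = best_c - c_step, best_c + c_step
            g_lo, g_hi = best_g - g_step, best_g + g_step
        return SVR(kernel="rbf", C=2.0 ** best_c, gamma=2.0 ** best_g).fit(X, y)

For the SMLR-IGS-SVR and PCA-IGS-SVR variants mentioned in the summary, the training features would first be reduced (for example with sklearn.decomposition.PCA) before being passed to a routine like this.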