Model selection algorithm in Gaussian process regression for computer experiments

Bibliographic Details
Published in: Communications for Statistical Applications and Methods, Vol. 24, No. 4, pp. 383-396
Main Authors: Lee, Youngsaeng; Park, Jeong-Soo
Format: Journal Article
Language: Korean
Published: Korean Statistical Society (한국통계학회), 31.07.2017
ISSN2287-7843


More Information
Summary: The model in our approach assumes that computer responses are a realization of a Gaussian process superimposed on a regression model, called a Gaussian process regression model (GPRM). Selecting a subset of variables, or building a good reduced model, is an important step in classical regression for identifying the variables that influence the response and for further analysis such as prediction or classification. One reason to select variables, from the prediction standpoint, is to prevent over-fitting or under-fitting the data. The same reasoning and approach apply to the GPRM; however, little work has been done on variable selection in GPRMs. In this paper, we propose a new algorithm to build a good prediction model among candidate GPRMs. It is a post-processing step for the algorithm that includes the Welch method suggested by previous researchers. The proposed algorithms select the non-zero regression coefficients (β's) using forward and backward methods along with a Lasso-guided approach. During this process, the covariance parameters (θ's) were held fixed at values pre-selected by the Welch algorithm. We illustrate the superiority of our proposed models over the Welch method and over non-selection models using four test functions and one real-data example. Future extensions are also discussed.
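The selection scheme described in the abstract can be sketched in miniature. The sketch below is illustrative only and is not the authors' algorithm: it performs greedy forward selection of regression terms by residual sum of squares, treating the stochastic (GP) part as fixed white noise, whereas the paper fixes the covariance parameters (θ's) at values pre-selected by the Welch algorithm and combines forward and backward moves with a Lasso guide. All function and variable names here are hypothetical.

```python
import random

def lstsq_rss(cols, y):
    """Least-squares fit of y on the given columns; returns the residual sum of squares."""
    k, n = len(cols), len(y)
    # Normal equations A beta = b, with A = X^T X and b = X^T y.
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for p in range(k):
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        b[p], b[piv] = b[piv], b[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for c in range(p, k):
                A[r][c] -= f * A[p][c]
            b[r] -= f * b[p]
    beta = [0.0] * k
    for p in range(k - 1, -1, -1):
        beta[p] = (b[p] - sum(A[p][c] * beta[c] for c in range(p + 1, k))) / A[p][p]
    resid = [y[t] - sum(beta[i] * cols[i][t] for i in range(k)) for t in range(n)]
    return sum(r * r for r in resid)

def forward_select(candidates, y, max_terms):
    """Greedily add, one at a time, the regressor that most reduces the RSS."""
    chosen, remaining = [], list(candidates)
    while remaining and len(chosen) < max_terms:
        best = min(remaining,
                   key=lambda name: lstsq_rss([candidates[c] for c in chosen + [name]], y))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy demonstration: only x1 and x3 truly enter the mean function.
random.seed(0)
n = 60
x1 = [random.random() for _ in range(n)]
x2 = [random.random() for _ in range(n)]
x3 = [random.random() for _ in range(n)]
y = [2.0 * x1[t] - 1.5 * x3[t] + random.gauss(0.0, 0.01) for t in range(n)]
candidates = {"x1": x1, "x2": x2, "x3": x3}
selected = forward_select(candidates, y, max_terms=2)
print(selected)
```

With a strong signal and weak noise, the greedy search recovers the two active regressors; the paper's backward step and Lasso guidance address the cases where pure forward selection goes astray.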
Bibliography:The Korean Statistical Society
KISTI1.1003/JNL.JAKO201724655835638