Random kernel k-nearest neighbors regression
| Published in | Frontiers in big data, Vol. 7, p. 1402384 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Switzerland: Frontiers Media S.A., 01.07.2024 |
| Subjects | |
| ISSN | 2624-909X |
| DOI | 10.3389/fdata.2024.1402384 |
Summary: The k-nearest neighbors (KNN) regression method, known for its nonparametric nature, is highly valued for its simplicity and its effectiveness in handling complex structured data, particularly in big data contexts. However, the method is susceptible to overfitting and fit discontinuity, which present significant challenges. This paper introduces random kernel k-nearest neighbors (RK-KNN) regression, a novel approach well-suited to big data applications that integrates kernel smoothing with bootstrap sampling to enhance prediction accuracy and model robustness. The method aggregates multiple predictions, each obtained by randomly sampling observations from the training dataset and selecting a subset of input variables for kernel KNN (K-KNN). A comprehensive evaluation of RK-KNN on 15 diverse datasets, employing various kernel functions including Gaussian and Epanechnikov, demonstrates its superior performance: compared to standard KNN and random KNN (R-KNN) models, it significantly reduces the root mean square error (RMSE) and mean absolute error and improves R-squared values. The RK-KNN variant that employs the kernel function yielding the lowest RMSE is benchmarked against state-of-the-art methods, including support vector regression, artificial neural networks, and random forests.
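To make the approach described in the summary concrete, the sketch below shows one way its stated ingredients (bootstrap resampling of training observations, random subsets of input variables, kernel-weighted KNN prediction, and averaging over the ensemble) could fit together. This is a minimal illustration based only on the summary above, not the authors' implementation; the Gaussian kernel form, the bandwidth, the default subset size, and all function and parameter names are assumptions.

```python
import numpy as np

def gaussian_kernel(d, bandwidth):
    # Assumed Gaussian kernel weight for a neighbor at distance d.
    return np.exp(-0.5 * (d / bandwidth) ** 2)

def rk_knn_predict(X_train, y_train, X_test, k=5, n_estimators=50,
                   n_features=None, bandwidth=1.0, rng=None):
    """Illustrative RK-KNN-style ensemble: average kernel-weighted KNN
    predictions over bootstrap samples and random feature subsets."""
    rng = np.random.default_rng(rng)
    n, p = X_train.shape
    # Assumed default: roughly sqrt(p) input variables per ensemble member.
    n_features = n_features or max(1, int(np.sqrt(p)))
    preds = np.zeros((n_estimators, X_test.shape[0]))

    for b in range(n_estimators):
        rows = rng.integers(0, n, size=n)                     # bootstrap sample of observations
        cols = rng.choice(p, size=n_features, replace=False)  # random subset of input variables
        Xb, yb = X_train[np.ix_(rows, cols)], y_train[rows]

        # Kernel KNN on the resampled data: weight each test point's k
        # nearest neighbors by a kernel of their distances.
        d = np.linalg.norm(X_test[:, cols][:, None, :] - Xb[None, :, :], axis=2)
        nn = np.argsort(d, axis=1)[:, :k]
        d_k = np.take_along_axis(d, nn, axis=1)
        w = gaussian_kernel(d_k, bandwidth)
        w /= w.sum(axis=1, keepdims=True) + 1e-12
        preds[b] = (w * yb[nn]).sum(axis=1)

    return preds.mean(axis=0)  # aggregate the ensemble by averaging
```

Other kernels mentioned in the summary, such as the Epanechnikov kernel, could be swapped in for `gaussian_kernel` without changing the rest of the sketch.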
Bibliography: Alladoumbaye Ngueilbaye, Shenzhen University, China; These authors have contributed equally to this work and share first authorship; Edited by: Dongpo Xu, Northeast Normal University, China; Reviewed by: Debo Cheng, University of South Australia, Australia
ISSN: 2624-909X
DOI: 10.3389/fdata.2024.1402384