KELL: A Kernel-Embedded Local Learning for Data-Intensive Modeling

Authors

Luo, C.

DOI:

https://doi.org/10.47852/bonviewAIA32021381

Keywords:

kernel methods, Gaussian process regression, data-intensive modeling, local learning, KELL, complex nonlinear models, prediction accuracy

Abstract

Kernel methods are widely used in machine learning. They introduce a nonlinear transformation that achieves a linearization effect: linear methods can then be used to solve nonlinear problems. However, typical kernel methods such as Gaussian process regression suffer from a memory-consumption problem in data-intensive modeling: the memory required by the algorithms grows rapidly with the size of the training data, limiting their applicability. Localized methods split the training data into batches and greatly reduce the amount of data processed at a time, effectively alleviating the memory pressure. This paper combines the two approaches by embedding kernel functions into local learning methods and optimizing the algorithm parameters, including the local factors and model orders, yielding the kernel-embedded local learning (KELL) method. Numerical studies show that, compared with kernel methods such as Gaussian process regression, KELL significantly reduces the memory required for complex nonlinear models, and that, compared with non-kernel methods, KELL achieves higher prediction accuracy.
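The memory saving described above comes from replacing one n × n kernel matrix with several much smaller per-batch matrices. The following sketch illustrates that localization idea only, not the paper's actual KELL algorithm: it fits one small Gaussian-process-style kernel model per batch of the training data and predicts with the batch whose centre is nearest. All function names, the batch-splitting rule, and the nearest-centre prediction rule are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def fit_gpr(X, y, noise=1e-2):
    """GP regression mean: stores and solves an n x n kernel system."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    return np.linalg.solve(K, y)  # dual weights alpha

def predict_gpr(X_train, alpha, X_new):
    return rbf_kernel(X_new, X_train) @ alpha

def fit_local(X, y, n_batches=5):
    """Split the data into batches along the first input dimension and
    fit one small kernel model per batch (a sketch of localization)."""
    batches = np.array_split(np.argsort(X[:, 0]), n_batches)
    return [(X[i], fit_gpr(X[i], y[i]), X[i].mean(axis=0)) for i in batches]

def predict_local(models, X_new):
    """Predict each point with the batch whose centre is nearest."""
    preds = np.empty(len(X_new))
    for j, x in enumerate(X_new):
        Xb, alpha, _ = min(models, key=lambda m: np.linalg.norm(x - m[2]))
        preds[j] = predict_gpr(Xb, alpha, x[None, :])[0]
    return preds

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(500)

models = fit_local(X, y, n_batches=5)
X_test = np.linspace(-2.5, 2.5, 20)[:, None]
err = np.abs(predict_local(models, X_test) - np.sin(X_test[:, 0])).max()
```

With 500 points in 5 batches, the largest kernel matrix held in memory has 100 × 100 entries instead of 500 × 500, a 25-fold reduction in the dominant memory term, at the cost of each prediction using only local data.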

 

Received: 20 July 2023 | Revised: 21 September 2023 | Accepted: 1 November 2023

 

Conflicts of Interest

The author declares no conflict of interest regarding this work.

 

Data Availability Statement

Data sharing is not applicable to this article as no new data were created or analyzed in this study.

Published

2023-11-10

How to Cite

Luo, C. (2023). KELL: A Kernel-Embedded Local Learning for Data-Intensive Modeling. Artificial Intelligence and Applications, 2(1), 38–44. https://doi.org/10.47852/bonviewAIA32021381

Section

Research Article