KELL: Kernel-Embedded Local Learning for Data-Intensive Modeling
DOI: https://doi.org/10.47852/bonviewAIA32021381

Keywords: kernel methods, Gaussian process regression, data-intensive modeling, local learning, KELL, complex nonlinear models, prediction accuracy

Abstract
Kernel methods are widely used in machine learning. They introduce a nonlinear transformation that yields a linearization effect, allowing linear methods to solve nonlinear problems. However, typical kernel methods such as Gaussian process regression suffer from a memory consumption problem in data-intensive modeling: the memory required by the algorithms grows rapidly with the amount of data, limiting their applicability. Localized methods split the training data into batches and greatly reduce the amount of data used at each step, effectively relieving the memory pressure. This paper combines the two approaches by embedding kernel functions into local learning methods and optimizing the algorithm parameters, including the local factors and model orders, yielding the kernel-embedded local learning (KELL) method. Numerical studies show that, compared with kernel methods such as Gaussian process regression, KELL significantly reduces the memory required for complex nonlinear models, and that, compared with other non-kernel methods, KELL achieves higher prediction accuracy.
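The localized strategy described in the abstract can be sketched as follows. This is a minimal illustration under assumed details, not the authors' KELL algorithm: for each query point it fits a small kernel regressor on only the k nearest training points, so each solve involves a k × k kernel matrix instead of the full n × n one that a standard Gaussian process would build. All function names and parameters below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def local_kernel_predict(X, y, Xq, k=20, noise=1e-3, length_scale=1.0):
    """Predict at each query in Xq by solving a small kernel system
    on that query's k nearest training points (a 'local batch'),
    keeping memory at O(k^2) per solve rather than O(n^2)."""
    preds = np.empty(len(Xq))
    for i, xq in enumerate(Xq):
        idx = np.argsort(((X - xq) ** 2).sum(-1))[:k]   # local batch
        Xl, yl = X[idx], y[idx]
        K = rbf_kernel(Xl, Xl, length_scale) + noise * np.eye(k)  # k x k only
        alpha = np.linalg.solve(K, yl)
        preds[i] = rbf_kernel(xq[None, :], Xl, length_scale) @ alpha
    return preds
```

In this sketch the neighborhood size k plays the role of a local factor: it trades memory and computation per solve against how much data each local model sees.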
Received: 20 July 2023 | Revised: 21 September 2023 | Accepted: 1 November 2023
Conflicts of Interest
The author declares no conflicts of interest in this work.
Data Availability Statement
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
License
Copyright (c) 2023 Author
This work is licensed under a Creative Commons Attribution 4.0 International License.