URN for citing this version on EPub Bayreuth: urn:nbn:de:bvb:703-epub-8215-7
Bibliographic details

Köhler, Hannes: Lp- and Risk Consistency of Localized SVMs. In: Neurocomputing, Vol. 598 (2024), Article 128060. ISSN 0925-2312
DOI of the published version: https://doi.org/10.1016/j.neucom.2024.128060
Full text: Download (929kB)
Project information

Project funding: Deutsche Forschungsgemeinschaft
Abstract
Kernel-based regularized risk minimizers, also called support vector machines (SVMs), are known to possess many desirable properties, but their super-linear computational requirements make them costly on large data sets. This problem can be tackled by using localized SVMs instead, which additionally offer the advantage of allowing different hyperparameters in different regions of the input space. In this paper, localized SVMs are analyzed with regard to their consistency. It is proven that they inherit Lp- as well as risk consistency from global SVMs under very weak conditions. Although results on the latter of these two properties already exist, this paper significantly generalizes them, notably also allowing the regions that underlie the localized SVMs to change as the size of the training data set increases, a situation that also typically occurs in practice.
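The core idea described in the abstract — training separate SVMs on different regions of the input space, each with its own hyperparameters — can be sketched as follows. This is a minimal illustrative sketch using scikit-learn's `SVR`; the two-region threshold partition and the specific hyperparameter values are arbitrary choices for illustration, not the construction or the conditions analyzed in the paper.

```python
import numpy as np
from sklearn.svm import SVR

# Toy 1-D regression data: y = sin(3x) + noise on [-1, 1].
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

def region(X):
    """Assign each sample to region 0 (x < 0) or region 1 (x >= 0).
    (A hypothetical partition chosen purely for illustration.)"""
    return (X[:, 0] >= 0).astype(int)

# One SVM per region, each with its own hyperparameters -- the extra
# flexibility that localized SVMs offer over a single global SVM.
models = {
    0: SVR(kernel="rbf", C=1.0, gamma=2.0),
    1: SVR(kernel="rbf", C=10.0, gamma=0.5),
}
labels = region(X)
for r, model in models.items():
    model.fit(X[labels == r], y[labels == r])

def predict(X_new):
    """Predict each sample with the SVM responsible for its region."""
    out = np.empty(len(X_new))
    lab = region(X_new)
    for r, model in models.items():
        mask = lab == r
        if mask.any():
            out[mask] = model.predict(X_new[mask])
    return out

preds = predict(np.array([[-0.5], [0.5]]))
```

Because each regional model only sees its region's subsample, training cost scales with the region sizes rather than the full sample size, which is what mitigates the super-linear cost of a global SVM.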