Document Type
Conference Proceeding
Publication Date
12-2006
Abstract
We present two new algorithms for online learning in reproducing kernel Hilbert spaces. Our first algorithm, ILK (implicit online learning with kernels), employs a new, implicit update technique that can be applied to a wide variety of convex loss functions. We then introduce a bounded-memory version, SILK (sparse ILK), that maintains a compact representation of the predictor without compromising solution quality, even in non-stationary environments. We prove loss bounds and analyze the convergence rates of both algorithms. Experimental evidence shows that our proposed algorithms outperform existing methods on synthetic and real data.
Repository Citation
Cheng, L., Vishwanathan, S. V., Schuurmans, D., Wang, S., & Caelli, T. (2006). Implicit Online Learning with Kernels. Proceedings of the 20th Annual Conference on Neural Information Processing Systems. https://corescholar.libraries.wright.edu/knoesis/1005
Included in
Bioinformatics Commons, Communication Technology and New Media Commons, Databases and Information Systems Commons, OS and Networks Commons, Science and Technology Studies Commons
Comments
Presented at the 20th Annual Conference on Neural Information Processing Systems, Vancouver, B.C., Canada, December 4-9, 2006.