Document Type

Conference Proceeding

Publication Date

12-2006

Abstract

We present two new algorithms for online learning in reproducing kernel Hilbert spaces. Our first algorithm, ILK (implicit online learning with kernels), employs a new, implicit update technique that can be applied to a wide variety of convex loss functions. We then introduce a bounded memory version, SILK (sparse ILK), that maintains a compact representation of the predictor without compromising solution quality, even in non-stationary environments. We prove loss bounds and analyze the convergence rate of both. Experimental evidence shows that our proposed algorithms outperform current methods on synthetic and real data.
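The implicit update described in the abstract can be illustrated with a minimal sketch. The idea of an implicit (proximal) step is to solve f_{t+1} = argmin_f ½‖f − f_t‖² + η·loss(f(x_t), y_t) rather than take an explicit gradient step; for the square loss in an RKHS this has a closed form. The class name, kernel choice, step size, and the budget-based truncation below are all illustrative assumptions, not the paper's actual ILK/SILK updates.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian RBF kernel (an assumed kernel choice for this sketch)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

class ImplicitKernelRegressor:
    """Hypothetical sketch of an implicit online kernel update
    for the square loss; not the paper's ILK/SILK algorithms."""

    def __init__(self, eta=0.5, gamma=1.0, budget=None):
        self.eta = eta          # learning rate
        self.gamma = gamma      # kernel width
        self.budget = budget    # optional cap on stored support vectors
        self.sv = []            # support vectors
        self.alpha = []         # expansion coefficients

    def predict(self, x):
        return sum(a * rbf(x, z, self.gamma)
                   for a, z in zip(self.alpha, self.sv))

    def update(self, x, y):
        # Implicit step: f_{t+1} = argmin_f 1/2||f - f_t||^2 + eta * (f(x)-y)^2/2.
        # For the square loss the new coefficient solves a scalar equation:
        a = self.eta * (y - self.predict(x)) / (1.0 + self.eta * rbf(x, x, self.gamma))
        self.sv.append(x)
        self.alpha.append(a)
        # Crude sparsification to keep memory bounded (illustrative only;
        # the paper's SILK uses its own truncation scheme):
        if self.budget is not None and len(self.sv) > self.budget:
            j = int(np.argmin(np.abs(self.alpha)))
            self.sv.pop(j)
            self.alpha.pop(j)

# Example: repeated updates on one point drive the prediction toward the label.
model = ImplicitKernelRegressor(eta=0.5)
for _ in range(50):
    model.update(np.array([0.0]), 1.0)
```

Because the step is implicit, the denominator 1 + η·k(x, x) damps the update, which is what makes implicit methods stable for a wide range of learning rates.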

Comments

Presented at the 20th Annual Conference on Neural Information Processing Systems, Vancouver, B.C., Canada, December 4-9, 2006.
