Document Type
Conference Proceeding
Publication Date
12-2007
Abstract
We present a general boosting method that extends functional gradient boosting to optimize the complex loss functions encountered in many machine learning problems. Our approach is based on optimizing quadratic upper bounds of the loss functions, which allows us to give a rigorous convergence analysis of the algorithm. More importantly, this general framework enables us to use a standard regression base learner, such as decision trees, to fit any loss function. We illustrate an application of the proposed method to learning ranking functions for Web search by combining both preference data and labeled data for training. Experimental results on Web-search data from a commercial search engine show significant improvements of the proposed methods over several existing methods.
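The abstract's core idea, optimizing a quadratic upper bound of the loss so that each boosting round reduces to a least-squares fit for a standard regression base learner, can be illustrated with a minimal sketch. The sketch below is not the paper's algorithm; it is a generic instance of the technique using the logistic loss, whose second derivative p(1-p) is bounded by 1/4, so minimizing the quadratic bound becomes ordinary regression on scaled negative gradients. All names (`fit_stump`, `boost`, the learning rate) are illustrative assumptions.

```python
import numpy as np

def fit_stump(x, r):
    # Least-squares regression stump on 1-D inputs: pick the threshold and
    # left/right constants that minimize squared error against targets r.
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        lv = left.mean() if left.size else 0.0
        rv = right.mean() if right.size else 0.0
        err = ((left - lv) ** 2).sum() + ((right - rv) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def boost(x, y, rounds=50, lr=0.5):
    # y in {-1, +1}; F is the additive model built from regression stumps.
    F = np.zeros_like(x, dtype=float)
    stumps = []
    for _ in range(rounds):
        p = 1.0 / (1.0 + np.exp(-y * F))
        g = -y * (1.0 - p)          # gradient of logistic loss w.r.t. F
        # Quadratic upper bound: curvature p(1-p) <= 1/4, so minimizing the
        # bound is a least-squares fit of the base learner to -g / (1/4).
        h = fit_stump(x, -4.0 * g)
        stumps.append(h)
        F = F + lr * h(x)
    return lambda z: sum(lr * h(z) for h in stumps)

# Toy usage: learn to separate points below/above zero.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([-1, -1, -1, 1, 1, 1])
model = boost(x, y)
```

Because the bound turns every round into a regression problem, any regression base learner (e.g. a decision tree) can be plugged in where `fit_stump` appears, which is the flexibility the abstract emphasizes.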
Repository Citation
Zheng, Z., Zha, H., Zhang, T., Chapelle, O., Chen, K., & Sun, G. (2007). A General Boosting Method and its Application to Learning Ranking Functions for Web Search. Advances in Neural Information Processing Systems 20: Proceedings of the 2007 Conference.
https://corescholar.libraries.wright.edu/knoesis/312
Included in
Bioinformatics Commons, Communication Technology and New Media Commons, Databases and Information Systems Commons, OS and Networks Commons, Science and Technology Studies Commons
Comments
Presented at the Advances in Neural Information Processing Systems conference, Vancouver, Canada, December 3-6, 2007.