Unpublished Paper
Direct Maximization of Rank-Based Metrics for Information Retrieval
(2005)
  • Donald A. Metzler
  • W. Bruce Croft
  • Andrew McCallum, University of Massachusetts - Amherst
Abstract
Ranking is an essential component of a number of tasks, such as information retrieval and collaborative filtering. The underlying task often aims to maximize some evaluation metric, such as mean average precision, over rankings. Most past work on learning to rank has focused on likelihood- or margin-based approaches. In this work we explore directly maximizing rank-based metrics, a family of metrics that depend only on the order of ranked items. This allows different metrics to be maximized on the same training data. We show how the parameter space of linear scoring functions can be reduced to a multinomial manifold, and parameters are estimated by optimizing the evaluation metric over that manifold. Results on ad hoc information retrieval show that our model yields significant improvements in effectiveness over other approaches.
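The abstract's idea can be illustrated with a minimal sketch: score documents with a linear function whose weights are constrained to the probability simplex (the multinomial manifold), and pick weights that maximize mean average precision directly. This is an illustrative approximation, not the paper's actual estimation procedure; the random-search optimizer, the toy feature vectors, and all function names below are assumptions for demonstration only.

```python
import random

def average_precision(ranked_rels):
    """AP for a fully ranked list of binary relevance labels (1 = relevant).
    Depends only on the order of items, i.e. it is a rank-based metric."""
    hits, total = 0, 0.0
    for rank, rel in enumerate(ranked_rels, start=1):
        if rel:
            hits += 1
            total += hits / rank  # precision at each relevant position
    return total / hits if hits else 0.0

def score(w, features):
    """Linear scoring function: dot product of weights and features."""
    return sum(wi * fi for wi, fi in zip(w, features))

def mean_ap(w, queries):
    """MAP over queries; each query is a list of (feature_vector, relevance)."""
    aps = []
    for docs in queries:
        ranked = sorted(docs, key=lambda d: score(w, d[0]), reverse=True)
        aps.append(average_precision([rel for _, rel in ranked]))
    return sum(aps) / len(aps)

def random_simplex_point(dim, rng):
    """A point on the probability simplex (nonnegative, sums to 1).
    Normalizing uniforms is not a uniform simplex sample, but suffices here."""
    xs = [rng.random() for _ in range(dim)]
    t = sum(xs)
    return [x / t for x in xs]

def maximize_map(queries, dim, trials=500, seed=0):
    """Direct metric maximization by naive random search over the simplex
    (a stand-in for a real optimizer on the multinomial manifold)."""
    rng = random.Random(seed)
    best_w, best = None, -1.0
    for _ in range(trials):
        w = random_simplex_point(dim, rng)
        m = mean_ap(w, queries)
        if m > best:
            best_w, best = w, m
    return best_w, best
```

Because the objective depends only on the induced ordering, the same training data can be reused to maximize a different rank-based metric simply by swapping `average_precision` for, say, precision at 10.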
Publication Date
2005
Comments
This is the pre-publication version harvested from CIIR.
Citation Information
Donald A. Metzler, W. Bruce Croft and Andrew McCallum. "Direct Maximization of Rank-Based Metrics for Information Retrieval" (2005)
Available at: http://works.bepress.com/andrew_mccallum/140/