Prioritizing relevance judgments to improve the construction of IR test collections

Mehdi Hosseini, Ingemar J Cox, Natasa Milic-Frayling, Trevor Sweeting, Vishwa Vinay


Abstract

We consider the problem of optimally allocating a fixed budget to construct a test collection with associated relevance judgments, such that it can (i) accurately evaluate the relative performance of the participating systems, and (ii) generalize to new, previously unseen systems. We propose a two-stage approach. For a given set of queries, we adopt the traditional pooling method and use a portion of the budget to evaluate a set of documents retrieved by the participating systems. Next, we analyze the relevance judgments to prioritize the queries and remaining pooled documents for further relevance assessments. The query prioritization is formulated as a convex optimization problem, thereby permitting an efficient solution and providing a flexible framework to incorporate various constraints. Query-document pairs with the highest priority scores are evaluated using the remaining budget. We evaluate our resource optimization approach on the TREC 2004 Robust track collection. We demonstrate that our optimization techniques are cost-efficient and yield a significant improvement in the reusability of the test collections.
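To illustrate the kind of second-stage budget allocation the abstract describes, here is a minimal sketch of spending a remaining judging budget across queries via a small convex program. The priority weights, the concave log(1 + x) utility, the per-query pool caps, and the use of the cvxpy library are all illustrative assumptions; the paper's actual objective and constraints are not given in this abstract.

```python
# Hypothetical sketch: allocate remaining relevance judgments across queries
# with a convex program. Not the paper's actual formulation.
import numpy as np
import cvxpy as cp

def allocate_budget(weights, pool_sizes, remaining_budget):
    """Allocate additional relevance judgments to queries.

    weights          -- per-query priority scores (assumed given), shape (n,)
    pool_sizes       -- unjudged pooled documents remaining per query
    remaining_budget -- total number of extra judgments we can afford
    """
    n = len(weights)
    x = cp.Variable(n)  # extra judgments per query (relaxed to real values)

    # Concave utility: diminishing returns from judging more documents per query.
    utility = cp.sum(cp.multiply(weights, cp.log(1 + x)))

    constraints = [
        x >= 0,                         # no negative allocations
        x <= pool_sizes,                # cannot exceed the unjudged pool
        cp.sum(x) <= remaining_budget,  # total budget constraint
    ]
    cp.Problem(cp.Maximize(utility), constraints).solve()

    # Round down to integer judgment counts per query.
    return np.floor(x.value).astype(int)

# Example with assumed priority scores and pool sizes for four queries.
w = np.array([0.9, 0.4, 0.7, 0.2])
pools = np.array([50, 80, 60, 40])
print(allocate_budget(w, pools, remaining_budget=100))
```

The concave utility is one common way to encode diminishing returns in such allocation problems; the flexibility the abstract mentions would correspond to adding further convex constraints to the same program.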

Original language: English
Title of host publication: Proceedings of the 20th ACM international conference on Information and knowledge management
Number of pages: 6
Publication date: 2011
Pages: 641-646
Publication status: Published - 2011
Externally published: Yes
