TY - JOUR
T1 - User perspectives on relevance criteria: A comparison among relevant, partially relevant, and not-relevant judgments
AU - Maglaughlin, Kelly L.
AU - Sonnenwald, Diane H.
PY - 2002/3/1
Y1 - 2002/3/1
AB - This study investigates the use of criteria to assess relevant, partially relevant, and not-relevant documents. Study participants identified passages within 20 document representations that they used to make relevance judgments; judged each document representation as a whole to be relevant, partially relevant, or not relevant to their information need; and explained their decisions in an interview. Analysis revealed 29 criteria, discussed positively and negatively, that were used by the participants when selecting passages that contributed or detracted from a document's relevance. These criteria can be grouped into six categories: abstract (e.g., citability, informativeness), author (e.g., novelty, discipline, affiliation, perceived status), content (e.g., accuracy/validity, background, novelty, contrast, depth/scope, domain, citations, links, relevant to other interests, rarity, subject matter, thought catalyst), full text (e.g., audience, novelty, type, possible content, utility), journal/publisher (e.g., novelty, main focus, perceived quality), and personal (e.g., competition, time requirements). Results further indicate that multiple criteria are used when making relevant, partially relevant, and not-relevant judgments, and that most criteria can have either a positive or negative contribution to the relevance of a document. The criteria most frequently mentioned by study participants were content, followed by criteria characterizing the full text document. These findings may have implications for relevance feedback in information retrieval systems, suggesting that systems accept and utilize multiple positive and negative relevance criteria from users. Systems designers may want to focus on supporting content criteria followed by full text criteria as these may provide the greatest cost benefit.
U2 - 10.1002/asi.10049
DO - 10.1002/asi.10049
M3 - Journal article
AN - SCOPUS:0036501152
SN - 2330-1635
VL - 53
SP - 327
EP - 342
JO - Journal of the American Society for Information Science and Technology
JF - Journal of the American Society for Information Science and Technology
IS - 5
ER -