This paper examines a key question in Information Seeking & Retrieval: how do people assess the usefulness of documents? In an experimental study, we presented 24 participants with five task-based search scenarios and asked them to assess and comment on the usefulness of Web documents from the Canadian government domain. Data were analyzed to test for the effect of five information task types: fact-finding, deciding, doing, learning, and problem-solving. Participants showed a low level of agreement on usefulness scores overall, but consistency varied by task type. The criteria used to assess usefulness varied by level of usefulness, by task type, and by participant. These findings contribute to our understanding of the impact of tasks on information behaviour in the e-government domain.
Submission Type: Research Paper