Since its inception, the venerable TREC retrieval conference has relied upon specialist assessors or participating groups to create relevance judgments for its tracks. However, crowdsourcing has recently been proposed as a possible alternative to traditional TREC-like assessments, supporting the fast accumulation of judgments at low cost. 2010 was the first year that TREC experimented with crowdsourcing. In this paper, we report our successful experience in creating relevance assessments for the TREC 2010 Blog track top news stories task using crowdsourcing. We conclude that crowdsourcing is an effective alternative to using specialist assessors or participating groups for this task.
Richard McCreadie, Craig Macdonald, and Iadh Ounis.
Crowdsourcing Blog Track Top News Judgments at TREC.
In Proceedings of CSDM 2011.
Hong Kong, China, 2011.