[Asis-l] TREC 2011 Crowdsourcing Track: Draft Guidelines Released
ml at ischool.utexas.edu
Mon Apr 11 11:28:00 EDT 2011
Draft guidelines for the TREC 2011 Crowdsourcing Track are now available:
* track site: https://sites.google.com/site/treccrowd2011/
* draft guidelines:
* google group for discussing guidelines and track:
We invite feedback from the community on how we might revise guidelines
to create a better shared task, as well as simply how we might clarify
any unclear or confusing points. General comments and suggestions, as
well as questions, are very welcome.
The 2011 TREC Crowdsourcing Track will investigate:
* How to obtain high-quality relevance judgments from individual crowd workers;
* How to effectively compute consensus judgments from individual judgments;
* The interaction between the two (i.e., individual worker accuracy vs. the
consensus accuracy subsequently achieved).
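As a concrete (unofficial) illustration of the second point, the simplest consensus method is a majority vote over the binary relevance labels that workers assign to each document. This is only a baseline sketch, not the track's prescribed approach; the function name and tie-breaking rule are our own assumptions.

```python
# Baseline sketch only (not the track's official method): majority-vote
# consensus over per-worker binary relevance judgments.
from collections import Counter

def majority_vote(judgments):
    """judgments: list of 0/1 labels from different workers for one
    (topic, document) pair. Ties default to non-relevant (0)."""
    counts = Counter(judgments)
    return 1 if counts[1] > counts[0] else 0

# Hypothetical worker labels for three documents:
labels = {
    "doc1": [1, 1, 0],
    "doc2": [0, 0, 1],
    "doc3": [1, 0],  # tie -> defaults to 0
}
consensus = {doc: majority_vote(js) for doc, js in labels.items()}
# consensus == {"doc1": 1, "doc2": 0, "doc3": 0}
```

More sophisticated consensus methods (e.g., weighting each worker by estimated accuracy) are exactly the kind of contribution the consensus task invites.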
Teams can choose to participate in either or both tasks: (1) collecting
judgments and (2) computing consensus. We plan to evaluate using both
ranking and classification metrics, giving teams the flexibility to
focus on either one or both.
We welcome participation from those who have never participated in TREC
before. Note that there is an agreement form to submit in advance; it is
available from the track website.
* May 6: (soft) deadline for teams to email us expressing interest in
participating (since we will be partitioning data based on the number of
participants). It is important for us to know by this date if at all possible.
* May 13: expected date for releasing final guidelines
Matt and Gabriella
School of Information and Department of Computer Science
University of Texas at Austin