|Full Title:||Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge|
|Dates:||25-Feb-2013 - 15-Mar-2013|
|Meeting Description:||Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge (SemEval-2013 Task 7)
Task Registration Deadline: February 15, 2013, at http://bit.ly/XUcQ0v
Challenge Duration: February 25, 2013 to March 15, 2013
Joint Challenge Email Discussion Group: http://bit.ly/XwcwCR
Detailed Task Description: http://bit.ly/10nQTGJ
Updated Evaluation Code and Baselines: http://bit.ly/140DHMH
SemEval Workshop: June 13-14, 2013, co-located with NAACL (http://bit.ly/140uIuN) & *Sem (http://bit.ly/WknhYY), Atlanta, Georgia, USA
We are pleased to invite participants to the Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge at SemEval 2013, a joint effort of the educational technology and textual inference communities to present a unified scientific challenge to researchers in both fields. It will be run as part of the SemEval-2013 Semantic Evaluation Exercise, co-located with the *SEM and NAACL 2013 conferences.
The goal of the task is to produce an assessment of student answers to explanation and definition questions typically asked in practice exercises, tests or tutorial dialogue. It is linked to the tasks of semantic analysis and recognizing textual entailment in the textual inference community, and essay grading and short answer assessment in the educational NLP community.
Specifically, given a question, a known correct ‘reference answer’, and a 1- or 2-sentence ‘student answer’, the Main task consists of assessing the correctness of the student’s answer at one of three levels of granularity: 5-way (correct, partially correct, contradictory, irrelevant, not in the domain), 3-way (correct, contradictory, incorrect), or 2-way (correct, incorrect). Participants may carry out the task at any of these levels of granularity.
More detail is available at the task website, http://bit.ly/10nQTGJ.
February 15, 2013: Registration deadline
February 25, 2013: Main task test set release
March 4, 2013: Main task submissions due
March 5, 2013: Pilot task test set release
March 12, 2013: Pilot task submissions due
March 15, 2013: Results to participants
April 9, 2013: Paper submission deadline [TBC]
April 23, 2013: Reviews due [TBC]
May 4, 2013: Camera ready due [TBC]
June 13-14, 2013: SemEval workshop, co-located with NAACL & *Sem, Atlanta, Georgia, USA
Participating in the Challenge:
More details on the Main and Pilot tasks are available at the SemEval 2013 website: http://bit.ly/10nQTGJ.
Future updates will be distributed through the Joint Challenge Google Discussion Group; please join at http://bit.ly/XwcwCR to ensure that you receive announcements and status updates.
To help us with forward planning, please register now at the SemEval 2013 website (http://bit.ly/XUcQ0v) if you intend to enter the challenge. All teams are required to register by February 15, 2013.
|Linguistic Subfield:||Computational Linguistics|