LINGUIST List 25.2326
Tue May 27 2014
Calls: German, Computational Linguistics/Germany
Editor for this issue: Anna White
From: Sebastian Pado <pado@ims.uni-stuttgart.de>
Subject: GermEval 2014 Named Entity Recognition Shared Task
Full Title: GermEval 2014 Named Entity Recognition Shared Task
Date: 08-Oct-2014 - 10-Oct-2014
Location: Hildesheim, Germany
Contact Person: Sebastian Pado
Web Site: https://sites.google.com/site/germeval2014ner/
Linguistic Field(s): Computational Linguistics
Subject Language(s): German
Call Deadline: 15-Aug-2014
GermEval 2014 Named Entity Recognition Shared Task for German
Co-located with KONVENS 2014, October 8-10, Hildesheim, Germany
Named Entity Recognition (NER) has been shown to be useful for a wide range of NLP tasks.
Even though German is a relatively well-resourced language, NER for German remains challenging, both because capitalization is a less informative feature than in other languages and because existing training data sets are encumbered by licensing problems. As a result, there is no publicly available German NER tagger that is both free of usage restrictions and highly accurate.
The GermEval 2014 NER Shared Task makes CC-licensed German data with NER annotation available, with the twin goals of significantly advancing the state of the art in German NER and of pushing the field towards nested representations of named entities.
GermEval 2014 NER is associated with KONVENS 2014 and will take place as a KONVENS workshop in Hildesheim in October 2014.
Organizers:
- Language Technology, Technische Universität Darmstadt
- IMS, Stuttgart University

Reference:
D. Benikova, C. Biemann, M. Reznicek: NoSta-D Named Entity Annotation for German: Guidelines and Dataset. To be presented at LREC 2014, Reykjavik.
2nd Call for Participation:
We invite all researchers and industry professionals to participate in the challenge and to demonstrate their ability to build a NER system for German. There are no restrictions on the type of NER system submitted, and none on the use of external data, background corpora, lexical resources, etc.
The GermEval 2014 NER Shared Task builds on a new dataset with German NE annotation with the following properties:
- The data was sampled from German Wikipedia and News Corpora as a collection of citations
- The dataset covers over 31,000 sentences corresponding to over 590,000 tokens
- The NER annotation uses the NoSta-D guidelines, which extend the Tübingen Treebank guidelines, using four main NER categories with sub-structure, and annotating embeddings among NEs such as [ORG FC Kickers [LOC Darmstadt]]
Data and Guidelines are available for download at https://sites.google.com/site/germeval2014ner/
We split the dataset into training, development and test sets and provide the datasets in a tab-separated (TSV) format.
- Training Set
- Development Set
- Test Set (Available Aug 1, 2014 in unannotated form, from Sep 1, 2014 in annotated form)
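To illustrate how a nested annotation such as [ORG FC Kickers [LOC Darmstadt]] can be represented in token-per-line TSV data, here is a minimal sketch. The column layout (token, first-level BIO tag, second-level BIO tag) is an assumption for illustration; the downloadable guidelines are the authoritative format description.

```python
# Hypothetical reader for two-level BIO-tagged data, illustrating the
# nested annotation [ORG FC Kickers [LOC Darmstadt]].
# NOTE: the column layout (token, outer tag, inner tag) is an assumption,
# not the official GermEval specification.

def extract_spans(tags):
    """Collect (label, start, end) entity spans from one BIO tag sequence."""
    spans, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        if tag.startswith("B-") or tag == "O":
            if label is not None:
                spans.append((label, start, i))
                label = None
            if tag.startswith("B-"):
                start, label = i, tag[2:]
    return spans

rows = [
    ("FC",        "B-ORG", "O"),
    ("Kickers",   "I-ORG", "O"),
    ("Darmstadt", "I-ORG", "B-LOC"),
]
outer = extract_spans([r[1] for r in rows])
inner = extract_spans([r[2] for r in rows])
print(outer)  # [('ORG', 0, 3)]
print(inner)  # [('LOC', 2, 3)]
```

Keeping the two annotation levels in separate columns lets each level be processed as an ordinary flat BIO sequence.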
Further, we provide an evaluation script (adapted from the CoNLL competitions) that assesses a given TSV file against a gold standard. The evaluation script and its manual are also available for download at https://sites.google.com/site/germeval2014ner/.
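The usual CoNLL-style metric scores exact matches of labeled entity spans. The sketch below shows that idea under the assumption of exact-match scoring over (label, start, end) triples; the official GermEval evaluation script remains the authoritative reference.

```python
# Sketch of CoNLL-style entity-level scoring: an entity counts as correct
# only if label and span boundaries match the gold standard exactly.
# This is the standard metric definition, not the official GermEval script.

def entity_f1(gold, pred):
    """Precision, recall, and F1 over exact (label, start, end) matches."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)                       # true positives: exact matches
    prec = tp / len(pred) if pred else 0.0
    rec = tp / len(gold) if gold else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1

gold = [("ORG", 0, 3), ("LOC", 2, 3)]
pred = [("ORG", 0, 3), ("PER", 5, 6)]
print(entity_f1(gold, pred))  # (0.5, 0.5, 0.5)
```

For nested annotation, the same scorer can simply be applied to the union of both annotation levels.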
There is just one track. Participants may submit up to three runs.
Submissions consist of a TSV file providing predictions for the test data and a paper of up to 4 pages (including references) describing the chosen approach and analyzing the performance. Papers should follow the KONVENS 2014 style files. The papers will be published online. We expect authors to present summaries of their systems at the KONVENS workshop.
Aug 1-15, 2014: Availability of test data and submission of model results
Aug 15, 2014: Deadline for Shared Task description submissions
Sep 1, 2014: Notification of Acceptance and Shared Task Results
Sep 15, 2014: Deadline for camera-ready papers
Oct 7, 2014: GermEval NER workshop at KONVENS
Oct 8-10, 2014: KONVENS main conference
Page Updated: 27-May-2014
While the LINGUIST List makes every effort to ensure the linguistic relevance of sites listed
on its pages, it cannot vouch for their contents.