LINGUIST List 26.5226
Sun Nov 22 2015
Calls: Computational Ling/USA
Editor for this issue: Ashley Parker <ashley@linguistlist.org>
Sandra Kuebler
Workshop on Discontinuous Structures in Natural Language Processing
Full Title: Workshop on Discontinuous Structures in Natural Language Processing
Date: 16-Jun-2016 - 17-Jun-2016
Location: San Diego, CA, USA
Contact Person: Sandra Kuebler
Meeting Email: discows2016
Web Site: http://rgcl.wlv.ac.uk/disco/
Linguistic Field(s): Computational Linguistics
Call Deadline: 25-Feb-2016
This workshop is concerned with modeling discontinuous structures across different disciplines in NLP. The modeling of certain structures in natural language requires a mechanism for discontinuity, in the sense that we must account for two or more parts of a structure that are not adjacent. This is true across many languages and on different levels of description. For instance, on the lexical level, this concerns discontinuous morphological phenomena such as transfixation (templatic morphology), as well as phrasal verbs and noncontiguous multiword expressions.
On the syntactic level, discontinuity is caused by phenomena such as extraposition and topicalization, or argument scrambling. Morphologically rich languages (MRLs) are particularly likely to exhibit such phenomena. Other examples include disfluency and anaphora/coreference resolution with discontinuous antecedents; modeling in both of the latter areas requires an extended domain of locality. On a higher level, discontinuity is a relevant factor in machine translation, as well as in complex question answering and in topic structure modeling.
Discontinuity has been studied intensively in a range of different areas, including but not limited to grammar development, syntactic and semantic parsing, morphological analysis, machine translation, anaphora resolution, discourse modeling, automatic summarization and complex question answering. This workshop is intended to bring people from such fields together.
Program Committee:
Anne Abeille, University Paris 7
Laura Alonso Alemany, Universidad Nacional de Córdoba
Marianna Apidianaki, LIMSI
Eric de la Clergerie, INRIA
Andreas van Cranenburgh, Huygens Institute for Netherlands History
Corina Forascu, University ''Al. I. Cuza'' Iaşi
Carlos Gomez Rodriguez, University of A Coruña
Eva Hasler, University of Cambridge
Mijail Kabadjov, University of Essex
Laura Kallmeyer, University of Düsseldorf
Philipp Koehn, University of Edinburgh
Johannes Leveling, Elsevier
Timm Lichte, University of Düsseldorf
Georgiana Marsic, University of Wolverhampton
Detmar Meurers, University of Tübingen
Jean-Luc Minel, Université Paris Ouest Nanterre La Défense
Sara Moze, University of Wolverhampton
Philippe Muller, University of Toulouse/IRIT
Preslav Nakov, Qatar Computing Research Institute
Mark-Jan Nederhof, University of St. Andrews
Yannick Parmentier, University of Orléans
Ted Pedersen, University of Minnesota
Irene Renau, Pontificia Universidad Católica de Valparaíso, Chile
Lonneke van der Plas, University of Malta
Djamé Seddah, University Paris 4
Khalil Sima'an, University of Amsterdam
Yannick Versley, University of Heidelberg
Suzan Verberne, University of Nijmegen
Andy Way, Dublin City University
Call for Papers:
Workshop on Discontinuous Structures in Natural Language Processing http://rgcl.wlv.ac.uk/disco/
co-located with NAACL 2016 (San Diego, CA), June 16-17, 2016
Submission deadline: February 25, 2016
Discontinuity has been studied intensively in a range of different areas, including but not limited to grammar development, syntactic and semantic parsing, morphological analysis, machine translation, anaphora resolution, discourse modeling, automatic summarization, and complex question answering. Nevertheless, the treatment of discontinuous structures remains a challenge: recovering nonlocal information is generally associated with a high computational cost, and discontinuities are inherently a low-frequency phenomenon, i.e., statistical approaches tend to analyze them incorrectly as local phenomena.
The goal of this workshop is to bring together researchers from the different areas to exchange ideas and problem solutions, create synergy effects, and enable more powerful solutions. This encompasses linguistic analyses and work on analyzing or recovering the corresponding structures, but also studies on ''use cases'', which show how information about discontinuity can be used to enhance NLP tasks.
Areas of interest include, but are not limited to, the following topics:
- Theoretical and empirical analyses of nonlocal/discontinuous phenomena.
- Comparisons of different descriptions of the same type of nonlocal information.
- Use, development, and comparison of techniques for handling nonlocal/discontinuous phenomena within NLP tasks; examples of NLP tasks that benefit from handling discontinuous phenomena include machine translation, complex question answering, discourse modeling, automatic summarization, and coreference resolution.
- ''Use cases'' that show how information about discontinuity can enhance an NLP task.
- Annotation of information about nonlocality.
We invite papers presenting completed research, including new experimental results, resources, and/or techniques. The maximum length of papers is 8 pages plus references. Submissions must be in PDF format and follow the NAACL 2016 formatting requirements (available at http://naacl.org/naacl).
Reviewing will be double blind, and thus no author information should be included in the papers; self-reference should be avoided as well. Papers that do not conform to these requirements will be rejected without review. Accepted papers will appear in the workshop proceedings.
The submission site will be published on the workshop webpage as soon as it is available.
Important Dates:
February 25, 2016: Workshop paper submission deadline
March 20, 2016: Notification of acceptance
March 30, 2016: Camera-ready papers due
June 16/17, 2016: Workshop Date
Organizers:
Wolfgang Maier (University of Düsseldorf, Germany)
Sandra Kübler (Indiana University, USA)
Constantin Orasan (University of Wolverhampton, GB)
Page Updated: 22-Nov-2015