

LINGUIST List 25.1057

Mon Mar 03 2014

Calls: Cognitive Science, Computational Linguistics/USA

Editor for this issue: Anna White <awhite@linguistlist.org>

Date: 03-Mar-2014
From: Ekaterina Shutova <shutova.e@gmail.com>
Subject: 2nd Workshop on Metaphor in NLP

Full Title: 2nd Workshop on Metaphor in NLP
Short Title: Metaphor 2014

Date: 26-Jun-2014 - 26-Jun-2014
Location: Baltimore, MD, USA
Contact Person: Ekaterina Shutova
Meeting Email: shutova.e@gmail.com
Web Site: https://sites.google.com/site/workshoponmetaphorinnlp/

Linguistic Field(s): Cognitive Science; Computational Linguistics

Call Deadline: 25-Mar-2014

Meeting Description:

Metaphor processing is a rapidly growing area in NLP. The ubiquity of
metaphor in language has been established in a number of corpus studies and
the role it plays in human reasoning has been confirmed in psychological
experiments. This makes metaphor an important research area for
computational and cognitive linguistics, and its automatic identification
and interpretation indispensable for any semantics-oriented NLP application.

Work on metaphor in NLP and AI started in the 1980s, providing a wealth of
ideas on the structure and mechanisms of the phenomenon. The last decade
witnessed a technological leap in natural language computation, as manually
crafted rules gradually gave way to more robust corpus-based statistical
methods. This is also the case for metaphor research. In recent years, the
problem of metaphor modeling has been steadily gaining interest within the
NLP community, with a growing number of approaches exploiting statistical
techniques. Compared to more traditional approaches based on hand-coded
knowledge, these recent methods tend to have wider coverage and to be more
efficient, accurate, and robust. However, even statistical metaphor
processing approaches have so far often focused on a limited domain or a
subset of phenomena. At the same time, recent work on computational lexical
semantics and lexical acquisition techniques, as well as a wide range of
NLP methods applying machine learning to open-domain semantic tasks, opens
many new avenues for the creation of large-scale, robust tools for the
recognition and interpretation of metaphor.

The main focus of the workshop will be on computational modeling of
metaphor using state-of-the-art NLP techniques.

Final Call for Papers:

The Second Workshop on Metaphor in NLP
(co-located with ACL 2014)
Baltimore, MD, USA – June 26, 2014

https://sites.google.com/site/workshoponmetaphorinnlp/

Submission deadline: March 25, 2014

Submission website: https://www.softconf.com/acl2014/Metaphor/

The main focus of the workshop will be on computational modeling of metaphor using state-of-the-art NLP techniques. However, papers on cognitive, linguistic, and applied aspects of metaphor are also of interest, provided that they are presented within a computational, formal, or quantitative framework. We also encourage descriptions of proposals and data sets for shared tasks on metaphor processing. In comparison to last year's workshop, the Second Workshop on Metaphor in NLP will broaden its scope by encouraging submissions on the special themes of computational processing of emotions and affect in metaphor, as well as processing of metaphorical language in social media.

The workshop will solicit both full papers and short papers for either oral or poster presentation.


