LINGUIST List 20.2540

Sat Jul 18 2009

Calls: Computational Linguistics, Phonetics, Phonology/USA

Editor for this issue: Amy Brunett <brunett@linguistlist.org>


LINGUIST is pleased to announce the launch of an exciting new feature: Easy Abstracts! Easy Abs is a free abstract submission and review facility designed to help conference organizers and reviewers accept and process abstracts online. Just go to: http://www.linguistlist.org/confcustom, and begin your conference customization process today! With Easy Abstracts, submission and review will be as easy as 1-2-3!
Directory
        1.    Mathieu Avanzi, Prosodic Prominence: Perceptual and Automatic Identification

Message 1: Prosodic Prominence: Perceptual and Automatic Identification
Date: 18-Jul-2009
From: Mathieu Avanzi <mathieu.avanzi@unine.ch>
Subject: Prosodic Prominence: Perceptual and Automatic Identification

Full Title: Prosodic Prominence: Perceptual and Automatic Identification
Short Title: Prom-2010

Date: 10-May-2010 - 10-May-2010
Location: Chicago, USA
Contact Person: Mathieu Avanzi
Meeting Email: mathieu.avanzi@unine.ch
Web Site: http://www2.unine.ch/speechprosody-prominence/page28592.html

Linguistic Field(s): Computational Linguistics; Phonetics; Phonology

Call Deadline: 25-Nov-2009

Meeting Description:

Speech Prosody 2010 Satellite Workshop:

Efficient tools for (semi-)automatic prosodic annotation are becoming increasingly important for the speech community, as most systems of prosodic annotation rely on the identification of syllabic prominence in spoken corpora (whether or not they lead to a phonological interpretation). The use of automatic and semi-automatic annotation has also facilitated multilingual research: many experiments on prosodic prominence identification have been conducted on European and non-European languages, and protocols have been developed for building large prosodically annotated databases of spoken languages around the world.

The aim of this workshop is to bring together specialists in automatic prosodic annotation who are interested in the development of robust algorithms for prominence detection, and linguists who have developed perceptually based methodologies for the identification of prosodic prominence in natural languages. The conference will include oral and poster sessions and a final round table.
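As an informal illustration of the kind of task the workshop addresses (and not part of the call itself), the short Python sketch below flags prominent syllables by z-scoring per-syllable acoustic correlates (mean F0, duration, intensity) within an utterance and marking syllables whose averaged score exceeds a threshold. The Syllable structure, the feature set, and the threshold are illustrative assumptions only, not a description of any system presented at the workshop.

from dataclasses import dataclass
from statistics import mean, stdev

# Hypothetical per-syllable acoustic measurements; the feature set and the
# threshold below are illustrative assumptions, not part of the call.
@dataclass
class Syllable:
    label: str
    f0_hz: float         # mean fundamental frequency
    duration_ms: float    # syllable duration
    intensity_db: float   # mean intensity

def z_scores(values):
    """Standardize one acoustic feature across the utterance."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s if s else 0.0 for v in values]

def detect_prominence(syllables, threshold=1.0):
    """Mark a syllable as prominent when the average of its z-scored
    acoustic features exceeds the (assumed) threshold."""
    zf0 = z_scores([s.f0_hz for s in syllables])
    zdur = z_scores([s.duration_ms for s in syllables])
    zint = z_scores([s.intensity_db for s in syllables])
    scores = [(a + b + c) / 3 for a, b, c in zip(zf0, zdur, zint)]
    return [score > threshold for score in scores]

if __name__ == "__main__":
    utterance = [
        Syllable("ba", 180.0, 140.0, 62.0),
        Syllable("NA", 240.0, 220.0, 70.0),  # higher, longer, louder
        Syllable("na", 175.0, 130.0, 61.0),
    ]
    for syl, prom in zip(utterance, detect_prominence(utterance)):
        print(f"{syl.label}: {'prominent' if prom else 'non-prominent'}")

Real systems typically combine such acoustic cues with syntactic or pragmatic information and learn the decision function from annotated corpora rather than relying on a fixed threshold.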

Call for Papers:

Extended Deadline - 4-page papers: November 25, 2009

The following topics will be considered:

-Annotation of prominences
-Perceptual processing of prominence: the background of Gestalt theories
-Acoustic correlates of prominence
-Prominence and its relations with prosodic structure
-Prominence and its relations with accent, stress, tone, and boundary
-The use of syntactic/pragmatic information in prominence identification
-Perception of prominence by naive listeners
-Statistical methods for prominence detection
-Number of relevant prominence degrees: categorical or continuous scale
-Prosodic prominence and visual perception

Paper Submission:

Anonymous four-page papers (including figures and references) must be written in English and uploaded as PDF files through http://www2.unine.ch/speechprosody-prominence/page28603.html. All papers will be reviewed by at least three members of the scientific committee. Accepted four-page papers will be included in the online proceedings of the workshop, published on the workshop website. Publication of extended versions of selected papers in a special issue of a journal after the workshop is under consideration.


