LINGUIST List 19.1514

Thu May 08 2008

Diss: Comp Ling: Eisenstein: 'Gesture in Automatic Discourse Proces...'

Editor for this issue: Evelyn Richter <evelyn@linguistlist.org>


To post to LINGUIST, use our convenient web form at http://linguistlist.org/LL/posttolinguist.html.
Directory
        1.    Jacob Eisenstein, Gesture in Automatic Discourse Processing


Message 1: Gesture in Automatic Discourse Processing
Date: 07-May-2008
From: Jacob Eisenstein <jacobe@csail.mit.edu>
Subject: Gesture in Automatic Discourse Processing

Institution: Massachusetts Institute of Technology
Program: Computer Science and Artificial Intelligence Laboratory
Dissertation Status: Completed
Degree Date: 2008

Author: Jacob Eisenstein

Dissertation Title: Gesture in Automatic Discourse Processing

Dissertation URL: http://people.csail.mit.edu/jacobe/diss.html

Linguistic Field(s): Computational Linguistics

Dissertation Directors:
Randall Davis
Regina Barzilay

Dissertation Abstract:

Computers cannot fully understand spoken language without access to the
wide range of modalities that accompany speech. This thesis addresses the
particularly expressive modality of hand gesture, and focuses on building
structured statistical models at the intersection of speech, vision, and
meaning.

My approach is distinguished in two key respects. First, gestural patterns
are leveraged to discover parallel structures in the meaning of the
associated speech. This differs from prior work that attempted to interpret
individual gestures directly, an approach that often failed to generalize
across speakers. Second, I present novel, structured statistical
models for multimodal language processing, which enable learning about
gesture in its linguistic context, rather than in the abstract.

These ideas find successful application in a variety of language processing
tasks: resolving ambiguous noun phrases, segmenting speech into topics, and
producing keyframe summaries of spoken language. In all three cases, the
addition of gestural features -- extracted automatically from video --
yields significantly improved performance over a state-of-the-art text-only
alternative. This marks the first demonstration that hand gesture improves
automatic discourse processing.
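
[Editor's illustrative note] The abstract describes adding automatically extracted
gestural features to text-only features for tasks such as noun phrase resolution.
The short Python sketch below is purely illustrative and is not the dissertation's
actual model: all feature names, values, and the use of scikit-learn's
LogisticRegression (standing in for the structured statistical models of the thesis)
are assumptions made for the sake of a concrete example.

    # Hypothetical sketch: augment text-based mention-pair features with a
    # gesture-similarity cue, then train a binary coreference classifier.
    from sklearn.linear_model import LogisticRegression
    import numpy as np

    def pair_features(text_feats, gesture_similarity):
        """Concatenate text-only features with a gesture cue for a mention pair.

        text_feats: e.g. [string_match, same_speaker_turn, token_distance]
        gesture_similarity: e.g. similarity of hand position/trajectory during
            the two mentions, extracted automatically from video (invented here).
        """
        return np.array(text_feats + [gesture_similarity])

    # Toy training data: each row is a mention pair; label 1 = coreferent.
    X = np.array([
        pair_features([1.0, 1.0, 0.1], 0.9),   # matching NPs, similar gesture
        pair_features([0.0, 1.0, 0.3], 0.8),   # ambiguous text, similar gesture
        pair_features([0.0, 0.0, 0.9], 0.1),   # different text, different gesture
        pair_features([1.0, 0.0, 0.5], 0.2),
    ])
    y = np.array([1, 1, 0, 0])

    clf = LogisticRegression().fit(X, y)

    # At test time, the gesture feature can help disambiguate pairs whose text
    # features alone are uninformative (e.g. pronouns such as "this one").
    test_pair = pair_features([0.0, 1.0, 0.2], 0.85)
    print(clf.predict_proba(test_pair.reshape(1, -1)))

In this toy setup the classifier can exploit gesture similarity exactly where the
text features are ambiguous, which mirrors the abstract's claim that gestural
features improve over a text-only baseline.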


