Review of  The Syntactic Process


Reviewer: Radu Daniliuc
Book Title: The Syntactic Process
Book Author: Mark Steedman
Publisher: MIT Press
Linguistic Field(s): Semantics, Syntax
Book Announcement: 11.2540

Review:

Mark Steedman: The Syntactic Process. A Bradford Book.
MIT Press, 2000. Language, Speech, and Communication Series.
330 pages.

Reviewed by Laura and Radu Daniliuc (The Australian National University)

At the end of a century that experienced a real boom of syntactic theories,
MIT Press offers its voracious readers a new book presenting a fresh
perspective on the theory of grammar. Its author is Mark Steedman, Professor
of Cognitive Science in the Division of Informatics at the University of
Edinburgh. He is also the author of "Surface Structure and Interpretation"
(MIT Press, 1996), which describes a new approach to the theory of natural
language grammar, namely the Combinatory Categorial Grammar. The present
book, published in the Language, Speech, and Communication Series, develops
and clarifies some of the ideas previously formulated there and is built on
the argument that the surface syntax of natural languages maps spoken and
written forms directly onto a compositional semantic representation,
including predicate-argument structure, quantification, and information
structure, without forming any intervening structural representation.
Steedman's theory of grammar belongs to the broader family of Categorial
Grammar, a group of theories of natural language in which the main
responsibility for defining syntactic form is borne by the lexicon.
Categorial Grammar is considered one of the oldest and purest examples of
the class of lexicalized theories of grammar, which also includes
Head-Driven Phrase Structure Grammar, Lexical Functional Grammar,
Tree-Adjoining Grammar, Montague Grammar, Relational Grammar, and certain
recent versions of Chomsky's theory. Since its beginnings in the 1930s
(Ajdukiewicz 1935),
Categorial Grammar has aimed at describing a language by assigning logical
types (syntactic or semantic expressions) to lexical atoms. Types of complex
expressions are derived from types of atomic expressions by a logical proof
that relies on fundamental logical rules interpreted in terms of type
theory. The most important variants of Categorial Grammar were proposed in
the eighties, the most representative being Steedman's Combinatory
Categorial Grammar, as expounded in "The Syntactic Process". Drawing on
ideas explored in other categorial grammars, his theory is not narrowly
focused: it bears on such areas as formal linguistics, intonational
phonology, computational linguistics, and experimental psycholinguistics,
and is oriented toward Chomsky's goal of formalizing an
explanatory theory of linguistic form.
Steedman tries to demystify the complexity of syntax by arguing that this
complexity is the result of the lexical specification of grammar and of the
small number of universal rule-types for combining predicates and arguments
of the appropriate type and position by rules of functional application
written as X/Y Y => X and Y X\Y => X. In order to associate
predicate-argument structures with syntactic categories, the functional
application rules must be expanded as X/Y:f Y:a => X:f a and Y:a X\Y:f =>
X:f a (p.37), rules that apply an identical operation in both syntax and
semantics. Steedman considers that all combinatory rules permitted in
Combinatory Categorial Grammar obey the Principle of Combinatory Type
Transparency, which states that "all syntactic combinatory rules are
type-transparent versions of one of a small number of simple semantic
operations over functions" (p.37), and that together they offer a common
mechanism for canonical word order, leftward extraction constructions and
right-node-raising constructions based on a single lexical entry for the
verb.
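To make the two application rules concrete, the following is a minimal
sketch in Python (this reviewer's own illustrative encoding, not Steedman's
notation or any published implementation; the category names NP and S and
the toy lexical entries are assumptions made purely for illustration):

```python
# A toy encoding of the two CCG functional application rules:
#   forward:  X/Y:f  Y:a   =>  X:f a
#   backward: Y:a    X\Y:f =>  X:f a
from dataclasses import dataclass

@dataclass(frozen=True)
class Cat:
    """A syntactic category: an atom like NP, or a function like (S\\NP)/NP."""
    result: object = None   # result Cat for functions, None for atoms
    arg: object = None      # argument Cat for functions, None for atoms
    slash: str = ""         # "/" seeks rightward, "\\" seeks leftward
    atom: str = ""          # atom name, e.g. "NP"

    def __repr__(self):
        return self.atom if self.atom else f"({self.result!r}{self.slash}{self.arg!r})"

NP = Cat(atom="NP")
S = Cat(atom="S")
VP = Cat(result=S, arg=NP, slash="\\")   # S\NP
TV = Cat(result=VP, arg=NP, slash="/")   # (S\NP)/NP, a transitive verb

def forward_apply(left, right):
    """X/Y:f  Y:a  =>  X:f(a) -- the function stands left, looking right."""
    fcat, f = left
    acat, a = right
    if fcat.slash == "/" and fcat.arg == acat:
        return (fcat.result, f(a))
    return None

def backward_apply(left, right):
    """Y:a  X\\Y:f  =>  X:f(a) -- the function stands right, looking left."""
    acat, a = left
    fcat, f = right
    if fcat.slash == "\\" and fcat.arg == acat:
        return (fcat.result, f(a))
    return None

# "Gilbert likes Harry": the single lexical entry for the verb drives
# the whole derivation, as the lexicalist view requires.
gilbert = (NP, "gilbert'")
harry = (NP, "harry'")
likes = (TV, lambda obj: lambda subj: f"likes'({obj}, {subj})")

vp = forward_apply(likes, harry)        # category S\NP
sentence = backward_apply(gilbert, vp)  # category S
print(sentence[0], sentence[1])
```

The point of the sketch is that syntax and semantics advance in lockstep:
each rule both cancels a category and applies the associated function.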
As can easily be deduced, the theory of Combinatory Categorial Grammar,
which tries to show a harmony between syntax, semantics, phonology, and
discourse information, is most closely related to Generalized Phrase
Structure Grammar (Gazdar 1981), Head Grammar (Pollard 1984) and
Tree-Adjoining Grammar (Joshi, Levy and Takahashi 1975). Its most basic
assumption has been formulated as the Principle of Lexical Head Government
which states that "both bounded and unbounded syntactic dependencies are
specified by the lexical syntactic type of their head" (p.32). This
lexicalized view of grammar is shaped by the Principle of Head Categorial
Uniqueness which minimizes the size of the lexicon involved by arguing that
"a single nondisjunctive lexical category for the head of a given
construction specifies both the bounded dependencies that arise when its
complements are in canonical position and the unbounded dependencies that
arise when these complements are displaced under relativization,
coordination, and the like" (p.33). The directionality specified in the
lexicon may not be contradicted by combinatory rules, which are
characterized in terms of three universal principles delimiting the space of
possible combinatory rules in all human languages. These principles are:
1. The Principle of Adjacency: Combinatory rules may only apply to finitely
many phonologically realized and string-adjacent entities.
2. The Principle of Consistency: All syntactic combinatory rules must be
consistent with the directionality of the principal function.
3. The Principle of Inheritance: If the category that results from the
application of a combinatory rule is a function category, then the slash
defining directionality for a given argument in that category will be the
same as the one(s) defining directionality for the corresponding argument(s)
in the input function(s). (p.54)
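The Principle of Consistency in particular can be made concrete in a small
sketch (again this reviewer's own schematic encoding, not Steedman's): a
candidate binary rule is summarized by the slash of its principal function
and the side on which that function stands relative to its argument.

```python
# The Principle of Consistency viewed as a filter on candidate rules.
def consistent(slash, function_side):
    """A rightward slash ("/") must consume its argument to the right,
    so the function must stand on the left; a leftward slash ("\\")
    consumes to the left, so the function must stand on the right."""
    return (slash == "/" and function_side == "left") or \
           (slash == "\\" and function_side == "right")

# Forward application X/Y Y => X: function on the left, slash "/".
print(consistent("/", "left"))    # allowed
# A hypothetical rule Y X/Y => X would put a rightward-looking function
# to the right of its argument, and the principle excludes it.
print(consistent("/", "right"))   # excluded
```

Under this reading, the three principles jointly delimit the space of
possible combinatory rules rather than stipulating rules one by one.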
These are the general ideas underlying Steedman's theory of grammar as they
are presented in his book. As for its structure, "The Syntactic Process"
comprises three parts, the first two ("Grammar and Information Structure"
and "Coordination and Word Order") entirely centered on competence and the
last one ("Computation and Performance") dealing with issues of performance
mechanisms and computational matters. One of the main assumptions of the
book is that "the Surface Syntax of natural language acts as a completely
transparent interface between the spoken form of the language, including
prosodic structure and intonational phrasing, and a compositional semantics"
(p.1). The syntactic and semantic components are related in Steedman's
theory by the Principle of Categorial Type Transparency according to which
"for a given language, the semantic type of the interpretation together with
a number of language-specific directional parameter settings uniquely
determines the syntactic category of a category" (p.36).
The book, considered by its author an attempt to reunite in "a single
framework and a uniform notation" the results of the project that he and his
colleagues worked on, begins by stating some uncontroversial assumptions in
the form of the rule-to-rule condition and the competence hypothesis, from
which it deduces the even more widely accepted Constituent Condition on
rules of competence grammar. The Introduction endorses the methodological
priority of
investigating complex syntax over performance mechanisms.
Part I, which advances an alternative combinatory view of competence
grammar, begins with a chapter on rules, constituents, and fragments and
offers a rethinking of the nature of Surface Structure, drawing on evidence
from coordination, parentheticalization, and intonation. The traditional
notion of Surface
Structure is entirely replaced by a freer notion ("the only necessary
derivational one") of "surface constituency" corresponding to Information
Structure. Syntactic structure is merely "the characterization of the
process of constructing a logical form, rather than a representational level
of structure that actually needs to be built". (p.xi) The next chapter
presents the intuitive basis of Combinatory Categorial Grammars with simple
examples motivating the individual rule types. Furthermore, Steedman
explains the constraints on Natural Grammar, i.e. constraints on bounded and
unbounded constructions. Chapter 5 focuses on structure and intonation,
revising and extending the author's former views (1991) on the matter.
Prosodic information is integrated with the standard grammatical categories
to more directly capture Intonation Structure, together with its
interpretation as Information Structure. The reader should be reminded that
in Combinatory Categorial Grammar Intonation Structure and Surface Structure
are subsumed under the notion of Information Structure and that Intonation
Structure and discourse Information Structure are integrated into the
grammar itself.
Part II can be perceived as a more technical approach that draws on two
related case studies: the "verb-raising" construction in Dutch (in chapter 6
"Cross-Serial Dependencies in Dutch") and gapping in English and Dutch (in
chapter 7 "Gapping and the Order of Constituents").
Part III discusses issues of computation and human grammatical performance.
Chapter 8 discusses "combinators", i.e. operations that map functions onto
other functions, and their relation to grammars, while Chapter 9 comments
on processing in context, analyzing a specific architecture for a parser.
Throughout the
book, the investigation is driven by questions on the language-processing
system as a computational whole. The last chapter, "The Syntactic Interface"
, represents a summary of the architecture of Steedman's theory as a whole,
its role in acquisition and performance, and its relation to other theories
of grammar.
The reader may wonder why a book describing a particular linguistic theory
is entitled "The Syntactic Process". The reason is that its author has in
view a theory of natural grammar more directly compatible with certain
syntactic phenomena flagrantly disrupting order and constituency and with
psychological and computational factors able to map such surface forms onto
interpretable meaning representations.
Steedman's work and theory can be perceived as an attempt to offer solutions
to many puzzling questions that human language has been raising for
centuries. The analyses he presents represent a real challenge for those
interested in linguistics, covering a wide range of works in the field and
opening new territories to be explored.


References
Gazdar, Gerald. 1981. "Unbounded Dependencies and Coordinate Structure".
Linguistic Inquiry 12, 155-184.
Joshi, Aravind, Leon Levy and M. Takahashi. 1975. "Tree Adjunct Grammars".
Journal of Computer and System Sciences 10, 136-163.
Pollard, Carl. 1984. Generalized Phrase Structure Grammars, Head Grammars,
and Natural Languages. Ph.D. thesis, Stanford University.
Wilson, Robert A. and Frank C. Keil (eds.). 1999. The MIT Encyclopedia of
the Cognitive Sciences. Cambridge, MA: MIT Press.


 