LINGUIST List 23.2548

Wed May 30 2012

Review: Syntax; Linguistic Theories; Language Acquisition: Borsley & Börjars (2011)

Editor for this issue: Rajiv Rao

Date: 30-May-2012
From: Yusuke Kubota
Subject: Non-Transformational Syntax

EDITORS: Robert D. Borsley and Kersti Börjars
TITLE: Non-Transformational Syntax
SUBTITLE: Formal and Explicit Models of Grammar
PUBLISHER: Wiley-Blackwell
YEAR: 2011

Yusuke Kubota, Department of Language and Information Sciences, University of Tokyo


This book presents a state-of-the-art overview of the three major variants of non-transformational syntactic theories: Head-Driven Phrase Structure Grammar (HPSG), Lexical-Functional Grammar (LFG) and Categorial Grammar (CG). The book is divided into two parts: the first consists of six chapters (two for each theory) that provide thorough overviews of these theories, and the other six chapters deal with related and somewhat broader topics such as sentence processing, language acquisition, and general theoretical issues such as the role of features in grammatical theory and the notion of lexicalism.

In the first chapter, ''Elementary Principles of Head-Driven Phrase Structure Grammar'', Georgia M. Green presents the basics of HPSG. Green begins with a discussion of general architectural considerations that have informed the formulation of HPSG, such as the notion of grammar as a set of constraints and the organization of constraints and grammar rules in terms of typed feature structures. The chapter then sketches a simple grammar of English, explaining how basic syntactic notions and phenomena such as subcategorization, agreement, binding and long-distance dependencies are treated in HPSG. In HPSG, recursive objects called feature structures, which encode feature-value pairs (where the values of certain features can themselves be feature-value pairs), play a central role in grammatical description. Green illustrates how identity conditions (called structure sharing) imposed on such complex feature structures enable explicit and precise analyses of linguistic phenomena without recourse to the notion of syntactic transformation.
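The mechanism Green describes can be sketched in a few lines of code. The following is a toy illustration only, not Green's actual formalism: feature structures are modeled as nested dictionaries, and unification recursively merges two structures, failing on conflicting atomic values (the feature names and values are invented for the example).

```python
# Toy feature-structure unification: nested dicts as feature structures.

def unify(fs1, fs2):
    """Return the unification of two feature structures, or None on failure."""
    if fs1 == fs2:
        return fs1
    if isinstance(fs1, dict) and isinstance(fs2, dict):
        result = dict(fs1)
        for feat, val in fs2.items():
            if feat in result:
                sub = unify(result[feat], val)
                if sub is None:
                    return None          # conflicting values: unification fails
                result[feat] = sub
            else:
                result[feat] = val
        return result
    return None                          # conflicting atomic values

# Subject-verb agreement as constraint satisfaction: the verb constrains its
# subject's AGR value; unification succeeds with a compatible noun and fails
# with an incompatible one.
verb_subj = {'AGR': {'NUM': 'sg', 'PER': '3'}}
noun_she  = {'AGR': {'NUM': 'sg', 'PER': '3'}, 'CASE': 'nom'}
noun_they = {'AGR': {'NUM': 'pl', 'PER': '3'}, 'CASE': 'nom'}

print(unify(verb_subj, noun_she))   # merged structure
print(unify(verb_subj, noun_they))  # None (NUM sg vs. pl)
```

Structure sharing, in the full formalism, additionally requires that two paths point to the *same* object (reentrance) rather than merely to equal values; the sketch above omits that refinement.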

Building on Green's introductory chapter, ''Advanced Topics in Head-Driven Phrase Structure Grammar'', by Andreas Kathol, Adam Przepiórkowski and Jesse Tseng, discusses broader and more advanced issues in HPSG. The chapter concisely covers a wide range of topics, including: a lexicalist treatment of complex predicates in terms of argument composition; the linearization-based approach to word order-related phenomena (an extension of HPSG that decouples linear order from hierarchical constituency); the Minimal Recursion Semantics framework of underspecified semantics; sophisticated treatments of morpho-syntactic issues such as clitics, case assignment and agreement; and an approach to integrating HPSG with ideas from Construction Grammar. The interconnections (and possible tensions) between different analytic techniques and lines of research---such as the opposition between argument composition-based and linearization-based analyses of complex predicates---are addressed carefully. The chapter is essentially a snapshot of cutting-edge HPSG research around the end of the 1990s, but many issues discussed here are still relevant and have wider implications cross-theoretically.

The next two chapters deal with LFG. In ''Lexical-Functional Grammar: Interactions between Morphology and Syntax'', Rachel Nordlinger and Joan Bresnan describe the morpho-syntactic component of LFG. The idea behind LFG is that making phrase structure representations (called c-structure) maximally simple and representing notions such as grammatical relations in a separate component (called f-structure) leads to an overall simplification of the grammar. The authors demonstrate how this multi-component architecture enables a simple characterization of the difference between configurational and non-configurational languages, where different parts of the grammar (syntactic rules vs. the lexicon) are made primarily responsible for building up (nearly identical) f-structures in the two language types. This is followed by an analysis of a somewhat more complicated case, in which multiple parts of c-structure add up to specify one unitary component in f-structure, as found in Welsh verb order and in multiple tense marking in Australian languages. Here, the notion of 'co-head' plays a central role in dispensing with the notion of head movement in transformational approaches.

In ''Lexical-Functional Grammar: Functional Structure'', Helge Lødrup explains the role of f-structure in LFG. The first half of the chapter is an exposition of Lexical Mapping Theory (LMT), a theory of argument realization developed in LFG. Here, Lødrup first explains how the mapping between semantic roles and grammatical relations is generally mediated by a small set of principles making reference to two binary features, +/-o(bjective) and +/-r(estricted). Then, the author illustrates how this general theory of argument realization is employed in the analyses of argument alternation phenomena such as passive and locative inversion. The second half consists of analyses of syntactic phenomena such as raising, control, long-distance dependencies and binding. In the analyses of these phenomena, imposing various kinds of identity conditions between different parts of f-structure plays a crucial role. Lødrup explains how the key notions of functional control, anaphoric control and functional uncertainty are formulated in LFG and employed in the analyses of these phenomena.
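The general shape of such a mapping theory can be sketched as follows. This is a deliberately simplified toy model, not Lødrup's actual formulation: the intrinsic feature assignments and the mapping principles below are assumptions made purely for illustration.

```python
# Toy LMT-style mapping: thematic roles carry an intrinsic binary feature,
# and simple principles map them to grammatical functions. Passive is
# modeled as suppression of the agent argument.

# Assumed intrinsic classifications: agents are [-o], themes/patients [-r].
INTRINSIC = {'agent': '-o', 'patient': '-r', 'theme': '-r'}

def map_arguments(roles, passive=False):
    """Map thematic roles (ordered by prominence) to grammatical functions."""
    if passive:
        roles = [r for r in roles if INTRINSIC[r] != '-o']  # suppress the agent
    gfs = {}
    for i, role in enumerate(roles):
        if i == 0:
            gfs[role] = 'SUBJ'   # highest remaining role maps to subject
        elif INTRINSIC[role] == '-r':
            gfs[role] = 'OBJ'    # lower [-r] roles surface as objects
        else:
            gfs[role] = 'OBL'    # remaining roles surface as obliques
    return gfs

print(map_arguments(['agent', 'patient']))                # active voice
print(map_arguments(['agent', 'patient'], passive=True))  # passive voice
```

The point of the sketch is the division of labor: lexical entries contribute partial feature specifications, and a small set of general principles, rather than construction-specific rules, derives alternations like the passive.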

The last two chapters in the theory part deal with CG. Unlike the chapters for HPSG and LFG, the two chapters for CG each independently introduce different variants of CG. ''Combinatory Categorial Grammar'', by Mark Steedman and Jason Baldridge, presents the theory of Combinatory Categorial Grammar (CCG). The chapter starts with a simple CG grammar, consisting of function application alone, and motivates an extension to CCG with more flexible rules such as type-raising and function composition. The rules introduced are explained alongside relevant linguistic examples. This is followed by analyses of major syntactic phenomena, including binding, control, raising, long-distance dependencies and coordination. The notion of modal control, a major theoretical revision to CCG introduced in Baldridge (2002), is explained along the way. This innovation, building on a technique originally developed in Type-Logical Grammar (TLG), enables CCG to maintain a fully universal rule component cross-linguistically. The chapter ends by briefly touching on implications for human sentence processing and computational implementation.
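The rule types just mentioned are easy to make concrete. The following toy sketch (illustrative only, not the chapter's notation) encodes atomic categories as strings and complex categories as (slash, result, argument) tuples, and implements forward application, forward composition and forward type-raising.

```python
# Toy CCG combinators. Atomic categories are strings ('S', 'NP'); complex
# categories are tuples (slash, result, argument), e.g. ('\\', 'S', 'NP')
# for the intransitive verb category S\NP.

def apply_fwd(fn, arg):
    """Forward application: X/Y  Y  =>  X."""
    if isinstance(fn, tuple) and fn[0] == '/' and fn[2] == arg:
        return fn[1]
    return None

def compose_fwd(f, g):
    """Forward composition: X/Y  Y/Z  =>  X/Z."""
    if (isinstance(f, tuple) and f[0] == '/' and
            isinstance(g, tuple) and g[0] == '/' and f[2] == g[1]):
        return ('/', f[1], g[2])
    return None

def raise_fwd(x, t='S'):
    """Forward type-raising: X  =>  T/(T\\X)."""
    return ('/', t, ('\\', t, x))

# A type-raised subject composed with a transitive verb yields the S/NP
# constituent that figures in CCG analyses of extraction and coordination.
likes = ('/', ('\\', 'S', 'NP'), 'NP')   # (S\NP)/NP, a transitive verb
john_raised = raise_fwd('NP')            # S/(S\NP)
print(compose_fwd(john_raised, likes))   # ('/', 'S', 'NP'), i.e. S/NP
```

The derivation of the non-standard constituent 'John likes' of category S/NP is exactly the flexibility that type-raising and composition add over application alone.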

''Multi-Modal Type-Logical Grammar'', by Richard T. Oehrle, presents an overview of TLG. TLG differs from CCG in that it literally identifies the grammar of natural language as a kind of logic. Thus, in TLG, operations such as type-raising and function composition are not recognized as primitive rules but rather as derived theorems. Oehrle starts by laying out the basic theoretical setup of TLG, which is followed by a couple of linguistic applications. Among these, the interaction between raising and quantifier scope illustrates the flexibility of the theory (especially its syntax-semantics interface). The fragment of Dutch provided illustrates another important aspect of Multi-Modal TLG, namely, the notion of modal control. Here, the mismatch between surface word order and predicate-argument structure exhibited by cross-serial dependencies is mediated by rules called 'structural rules', which govern the way in which syntactic proofs are conducted. This, in effect, allows for modeling the notion of verb raising in transformational grammar in a logically precise setup. An extensive appendix at the end situates TLG in the larger context of logic-based approaches to linguistic theory and provides pointers to original sources and further linguistic applications.

The rest of the book deals with somewhat broader issues. In ''Alternative Minimalist Visions of Language'', Ray Jackendoff compares current minimalist theory with the Simpler Syntax approach that he endorses, which is closely related to HPSG and LFG. The discussion centers on the fact that mainstream generative syntax has so far relied on an unwarranted distinction between 'core' and 'peripheral' phenomena, and has failed to attain descriptive adequacy by simply ignoring the latter. Jackendoff takes up some representative cases of such 'peripheral' phenomena, and demonstrates that they exhibit properties that are strikingly similar to 'core' phenomena. In an approach that draws a categorical distinction between the 'core' and the 'periphery', such similarities cannot be anything other than a pure accident. Jackendoff argues that such a treatment misses an important generalization and concludes that certain constraint-based approaches to syntax, including Simpler Syntax, where the commonality between the 'core' and the 'periphery' can be seamlessly captured by the notion of constructions, embody a more adequate architecture of grammar.

In ''Feature-Based Grammar'', James P. Blevins discusses the problem of syncretism in the context of feature-based theories such as HPSG and LFG. In these theories, agreement is typically handled via unification. That is, the governing verb and the subcategorized element each contribute their own specifications for agreement features such as case and gender, and agreement is enforced by unifying the (often partial) information contributed by each element to yield a complete description. If no coherent description is obtained via unification, agreement fails. Syncretic forms are problematic for this type of approach, since, cross-linguistically, such forms can often simultaneously satisfy conflicting morphological requirements (typically, as a shared argument of coordinated functors) by virtue of the fact that they happen to have identical phonological forms for the conflicting specifications. A simple unification-based approach incorrectly predicts that such cases lead to agreement failure. Blevins suggests that this problem can be avoided by replacing the notion of unification with the notion of subsumption, which merely checks whether the specifications of subcategorizing and subcategorized elements are consistent. The chapter ends by briefly discussing whether such a change can be readily implemented in HPSG and LFG, and concludes that the way subcategorization is handled in HPSG, in terms of cancellation of list-valued specifications of subcategorized elements, poses a problem for a straightforward implementation of the subsumption-based approach.
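The contrast at issue can be illustrated with a toy model (my own construction, not Blevins's formalism): represent a syncretic form's CASE value as the set of cases it can realize, and compare a unification-style check, which must merge all governors' requirements into one consistent value, with a subsumption-style check, which only asks that each requirement be individually compatible with the form.

```python
# Toy contrast between unification-style and subsumption-style agreement
# checks on a syncretic form shared by two coordinated governors.

def unify_requirements(form_cases, *required):
    """Unification-style: all governors' case requirements are merged into a
    single value set; an empty intersection means agreement failure."""
    merged = set(form_cases)
    for req in required:
        merged &= {req}          # each governor pins CASE to its own value
    return merged or None

def subsumption_check(form_cases, *required):
    """Subsumption-style: each governor's requirement need only be
    individually compatible with the form."""
    return all(req in form_cases for req in required)

# A hypothetical noun form syncretic between accusative and dative, shared
# by coordinated verbs that govern ACC and DAT respectively.
syncretic = {'acc', 'dat'}
print(unify_requirements(syncretic, 'acc', 'dat'))  # None: predicted failure
print(subsumption_check(syncretic, 'acc', 'dat'))   # True: correctly accepted
```

The unification check wrongly rules the shared argument out, which is exactly the incorrect prediction the chapter attributes to simple unification-based approaches; the weaker check lets the syncretic form satisfy both governors at once.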

In ''Lexicalism, Periphrasis, and Implicative Morphology'', Farrell Ackerman, Gregory T. Stump and Gert Webelhuth provide a detailed review of the notion of lexicalism. They identify four principles which may plausibly be taken to constitute the notion of lexicalism. A widely adopted approach to complex predicates in HPSG and LFG, known as argument composition, violates one of these principles, which states that syntactic operations cannot alter lexical properties encoded in words, where argument structure is taken to be part of lexical properties. The authors then suggest an alternative possibility in which a different principle is abandoned: one which dictates that lexemes be syntactically realized as a single word (expressed as a continuous string). This, in effect, introduces discontinuous constituency; accordingly, the authors illustrate an approach to the morphology-syntax interface, building on the realizational model of morphology, that implements this analytic option. The framework is illustrated with analyses of two phenomena exhibiting (potentially) discontinuously expressed complex morphological words: compound tense in Slavic languages in the inflectional domain, and phrasal predicates in Hungarian in the derivational domain.

''Performance-Compatible Competence Grammar'', by Ivan A. Sag and Thomas Wasow, discusses how the surface-oriented and constraint-based architecture that many non-transformational theories share bears on the question of constructing a realistic model of human sentence processing. The chapter discusses some recent experimental results showing that human sentence processing is incremental and parallel, and exploits different levels of grammatical information as soon as they become available. Constraint-based grammars, the authors argue, provide a more natural fit to these experimental results, since the grammar is free from the notion of 'syntactic derivation', which, without a highly abstract characterization of the relationship between competence grammar and performance, is inconsistent with such experimental results. The authors provide a brief comparison between their model and Phillips's (1996) strictly incremental model based on minimalist syntax, speculating that once Phillips's model is completely formalized, it might result in a constraint-based reformulation of the minimalist theory. They reject Phillips's proposal in the end, however, commenting that too much detail is left unresolved in his proposal.

The final two chapters deal with language acquisition, addressing the question from entirely different perspectives. In ''Modeling Grammar Growth: Universal Grammar without Innate Principles or Parameters'', Georgia M. Green sketches an outline of a theory of language acquisition in which knowledge of grammar is acquired in an incremental manner, without presupposing any innate language acquisition faculty. The key idea that Green puts forward is that many (or most) aspects of language acquisition can be thought of as instances of more general cognitive capacities that the infant is developing at the same time as (s)he is learning language. Green sketches how the development from the one-word utterance stage to the multi-word utterance stage, and the subsequent acquisition of polar and constituent questions, can be modeled as incremental grammar development. The discussion touches on several fundamental issues in language acquisition that are simply shielded from scrutiny in approaches that start from the innateness premise.

In contrast to Green's emergentist view, in ''Language Acquisition with Feature-Based Theories'', Aline Villavicencio assumes the innateness view and justifies this choice by pointing out the lack of any adequate and explicit model of language acquisition without an innate component. Villavicencio then lists five elements that need to be specified in detail in any explicit model of language acquisition: the object being learned; the learning data or environment; the hypothesis space; what counts as successful learning; and the procedure that updates the learner's hypothesis. The chapter reviews previous research addressing each of these issues, focusing on work that is consistent with the assumptions of constraint-based and feature-based grammatical frameworks. As part of this literature review, a relatively detailed sketch of a word order acquisition model is provided. In this model, Universal Grammar is formalized as a set of grammar rules in a unification-based CG organized in a typed inheritance hierarchy. The problem of word order acquisition is modeled as a problem of parameter setting, where the interdependence between the parameters is captured by means of default inheritance. Villavicencio argues that this use of default inheritance leads to a plausible model of language acquisition, since the organization of information in terms of default inheritance hierarchies reduces the amount of information that a learner needs to be exposed to before (s)he arrives at the target grammar.


This book is of great value to researchers and students in syntax and related fields such as psycholinguistics, computational linguistics and formal semantics. The first six chapters explain the general theoretical motivations of each theory succinctly, illustrate their linguistic applications clearly, and provide pointers to relevant literature. The other six chapters are also useful in situating these theories within a larger context. I am thoroughly impressed by the breadth and depth covered in this volume. The book is packed with useful information and thought-provoking ideas, crystallizing the insights resulting from research on non-transformational syntax in the past 30 years or so. The chapter by Kathol et al. on advanced topics in HPSG and the one by Oehrle on TLG are especially valuable. The former illuminates the open-ended and dynamic nature of inquiry in theoretical linguistics, where linguistic theories develop through a communal effort by researchers who propose competing hypotheses on the basis of a shared set of explicitly formulated assumptions. The latter chapter is important in that it provides a highly accessible introduction to TLG, which, despite its potential for linguistic application, has been largely ignored in the linguistics community due to the highly technical nature of its underlying mathematical formalisms.

I would nevertheless like to point out two ways in which the book could have been made even better. The first concerns the treatment of the syntax-semantics interface. In many non-transformational syntactic theories, providing an explicit syntax-semantics interface has always been of central concern, and there are some important recent developments in this domain in each of the three theories: in LFG, the development of glue semantics (e.g. Dalrymple 2001) has changed the landscape of the syntax-semantics interface radically; in HPSG, a new approach called Lexical Resource Semantics (Richter and Sailer 2004) is currently being developed as the first serious theory of the syntax-semantics interface grounded in explicit model-theoretic semantics; and in CG, two recent proposals are attracting attention as promising approaches to the syntax-semantics interface, one of them facilitating the modeling of both the semantic and phonological components in terms of the lambda calculus (de Groote 2001, Muskens 2003), and the other employing the notion of 'continuations' from computer science in characterizing the syntax-semantics interface (Shan and Barker 2006). The architecture of the syntax-semantics interface bears directly on several important issues that recurrently come up in the present volume, such as the plausibility of a parallel architecture of grammar, in which syntactic and semantic representations are built in tandem. In view of these considerations, a somewhat more detailed treatment of the syntax-semantics interface would have been desirable.

Another area where more extensive discussion would have been useful is the comparison of the three theories. The chapters in this book are more or less stand-alone readings, and cross-references among chapters are scarce. This is somewhat disappointing since, given the nature of the present book, there are many connections and points of contrast that are worth mentioning or elaborating on. To take just one example, LFG treats auxiliaries as purely inflectional elements occupying the head of a functional projection (in a way more in line with the GB/minimalist literature), whereas in HPSG and CG (where such functional heads are dispensed with), they are simply treated as a kind of raising verb. Does such a difference have any empirical consequences? To what extent do such differences reflect the built-in architectures of the respective theories? Blevins's chapter is exceptional in touching on these sorts of issues, but one or two additional chapters focusing solely on such questions and exploring them in detail with respect to some major grammatical phenomenon would have been interesting to include. This is important, since considerations of such issues are likely to be of central concern in research on non-transformational syntax, and syntax in general, in the next era.

Notwithstanding the above desiderata, the book is very readable, and represents an excellent introduction to the major variants of non-transformational syntax. It is highly recommended as an essential source of reference for both working syntacticians and researchers in related (sub)fields.


Baldridge, Jason. 2002. Lexically Specified Derivational Control in Combinatory Categorial Grammar. Ph.D. thesis, University of Edinburgh.

Dalrymple, Mary. 2001. Lexical Functional Grammar. New York: Academic Press.

de Groote, Philippe. 2001. Towards abstract categorial grammars. In Proceedings of ACL 39, 148-155.

Muskens, Reinhard. 2003. Language, lambdas, and logic. In G.-J. Kruijff and R. Oehrle, eds., Resource Sensitivity in Binding and Anaphora, 23-54. Kluwer.

Phillips, Colin. 1996. Order and Structure. Ph.D. thesis, MIT.

Richter, Frank and Manfred Sailer. 2004. Basic concepts of lexical resource semantics. In A. Beckmann and N. Preining, eds., ESSLLI 2003 -- Course Material I, vol. 5 of Collegium Logicum, 87-143. Kurt Gödel Society, Vienna.

Shan, Chung-chieh and Chris Barker. 2006. Explaining crossover and superiority as left-to-right evaluation. Linguistics and Philosophy 29. 91-134.


Yusuke Kubota is a postdoctoral fellow of the Japan Society for the Promotion of Science at the University of Tokyo. He received his PhD in Linguistics at the Ohio State University. His recent work focuses on developing a linguistically adequate model of the syntax-semantics interface based on categorial grammar, exploring phenomena such as coordination and complex predicates.

Page Updated: 30-May-2012