Review of Language as a Complex Adaptive System

Reviewer: Julie Bruch
Book Title: Language as a Complex Adaptive System
Book Author: Nick C. Ellis, Diane Larsen-Freeman
Publisher: Wiley
Linguistic Field(s): Applied Linguistics; Language Acquisition
Issue Number: 21.4928

EDITORS: Nick C. Ellis and Diane Larsen-Freeman
TITLE: Language as a Complex Adaptive System
SERIES TITLE: Best of Language Learning Series
PUBLISHER: Wiley-Blackwell
YEAR: 2010

Julie Bruch, Dept. of Languages, Mesa State College


This special 60th anniversary issue of the journal ''Language Learning'' contains
ten papers from the 2008 conference on Language as a Complex Adaptive System.
These papers will be of interest to researchers in the fields of Complex
Adaptive Systems, Connectionism, Emergentism, Cognitive Linguistics, and Dynamic
System Theory. They represent a unified and substantive set of innovative
perspectives that explore and more fully explain language learning,
processing, and variability. The research concepts and methodology in this
collection, while wide-ranging, are successfully presented as an accessible and
cohesive thread that ties scientific and humanistic approaches into a well
integrated whole. Together, these papers reveal an exciting paradigm shift that
is in stark contrast to the traditional Generativist, Universal Grammar, and
Minimalist agendas, which largely ignored the influence of context and the
innate variability of language. This collection takes a substantial step toward
moving the Emergentist agenda into the mainstream.

The basic assumption of the book is that both social interaction and general
human cognitive processing abilities are fundamental to language acquisition and
to the dynamics of ongoing language development and change over time. The
position paper labels language in this view as a complex adaptive system (CAS)
and claims that the CAS perspective unifies recent theoretical perspectives from
cognitive and sociolinguistics which challenge the traditional view of language
as a fixed system of symbols.

The papers here are grounded in usage-based theory, which sees language as
fundamentally involving form-meaning constructions. The usage-based perspective
also informs quantitative modeling of language processing and emerging
language change within both individual users and whole linguistic communities.
The position paper presents seven characteristics of language as a CAS, many of
which debunk formerly held generativist characterizations. These
characteristics emphasize language as inherently: 1) socially formed,
represented, and constrained, 2) largely variable, mutable, and emergent, and 3)
dependent on human cognitive mechanisms.


''A Usage-Based Account of Constituency and Reanalysis'' by Clay Beckner and Joan
Bybee: This paper describes how corpus data (from COCA) reveals constituent
structure as a natural outflow of general cognitive categorization, chunking,
and memory processes rather than as part of any innate or universal linguistic
competence. It also points out how usage trends can result in constituent
reanalysis and changes in distribution over time. This contradicts the
generativist view that reanalysis of structural forms in language happens
predominantly through incomplete first language acquisition and therefore occurs
between successive generations. The study focuses on changing constituent
boundaries of the complex phrasal preposition ''in spite of'' and on an analysis
of its syntacto-semantic functions. It details the reanalysis of ''in spite of''
from its original lexical function to its current grammaticalized constituent
function with its own peculiar meanings, structural behaviors, and
distributions. The paper ends with a well-supported defense of this analysis of
the change and a rebuttal of possible generativist-based objections to it. The
data provided support the modern usage of ''in spite of'' as a reanalyzed,
formulaic, single-constituent chunk. Finally, the paper underlines the
perspective that such constituent structure is emergent and adaptive, and
therefore, is a prime example of complex interacting factors.

''The Speech Community in Evolutionary Language Dynamics'' by Richard A. Blythe
and William A. Croft: This paper reports on the use of mathematically-based
language analysis for modeling language change as a result of speaker
interaction over time. This work is grounded in the notion that all past and
current interaction of multiple language users, together with sociocultural,
cognitive, and environmental factors, are directly related to the evolution of
language forms. One key goal of the model developed by these authors is to
describe and explain how language variants are selected and replicated by
speakers. They used modeling from the field of statistical physics to formulate
a statistically-based ''utterance selection model'' based on probabilities of
occurrence of variants in interaction as speakers engage in linguistic
communication. It is clear from the examples the authors provide that such a
model is capable of explaining and predicting the extinction, selection and
replication, and variability of specific features of language over time. One
interesting application of this model discussed in the paper is the work done on
the New Zealand dialect of English, which emerged from the various dialects of
immigrants in the mid 1800s. This paper points to intriguing possibilities in
the use of mathematical modeling for explaining language as a CAS.
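The core dynamic of such a model can be sketched in a few lines of Python. The following is a toy illustration under invented parameters, not Blythe and Croft's actual utterance selection model: each speaker holds a probability of producing one of two competing variants, and each interaction nudges the hearer's probability toward the mix of tokens just heard.

```python
import random

def utterance_selection(n_speakers=10, n_rounds=5000, n_tokens=5, seed=1):
    """Toy utterance-selection dynamics (hypothetical parameters).

    Each speaker stores a probability x of producing variant A (vs. B).
    In each round two speakers interact; each produces n_tokens utterances
    sampled from their own probability, and each shifts their probability
    toward the frequency of variant A they just heard.
    """
    rng = random.Random(seed)
    x = [0.5] * n_speakers        # all speakers start with A and B equally weighted
    learning_rate = 0.02
    for _ in range(n_rounds):
        i, j = rng.sample(range(n_speakers), 2)   # two speakers interact
        for speaker, hearer in ((i, j), (j, i)):
            tokens_a = sum(rng.random() < x[speaker] for _ in range(n_tokens))
            heard = tokens_a / n_tokens
            # convex update keeps each probability in [0, 1]
            x[hearer] = (1 - learning_rate) * x[hearer] + learning_rate * heard
    return x

community = utterance_selection()
print(community)
```

Run over many rounds, such a process lets variant frequencies drift, and with added selection biases it can model the extinction or propagation of specific variants across the community.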

''Linking Rule Acquisition in Novel Phrasal Constructions'' by Jeremy K. Boyd,
Erin A. Gottschalk, and Adele E. Goldberg: This third paper discusses two
psycholinguistic experiments designed to assess adults' ability to learn novel
syntactic-semantic pairings (linking rules) after simple exposure to input. The
authors point out that one of the greatest weaknesses of the constructionist
model is the paucity of empirical testing, a gap these experiments attempted to
fill. In the first experiment, adults were exposed to novel
form-meaning linked constructions (through voiced-over movie clips) and then
tested for both learning and retention. The authors report that the novel
linking rules were learned, but that the rules were not retained to a
significant degree after a lapse of one week. The second experiment required
participants to describe the novel events they saw in the movies using the
paired mapping constructions they learned, a test of production as well as of
comprehension. Here too, the authors report that learning was significantly
evident, and they conclude that the ability of adults to quickly and easily
learn novel linking rules involving unknown syntax-semantics constructions is
evidence for the constructionist model of language acquisition which favors
cognitive-based learning processes over innateness.

''Constructing a Second Language: Analyses and Computational Simulations of the
Emergence of Linguistic Constructions from Usage'' by Nick C. Ellis with Diane
Larsen-Freeman: This fourth paper looks at second language acquisition of three
types of English verb-argument constructions (verb plus locative,
verb-object-locative, and ditransitive verbs). The research here examines
inductive learning based on usage, corpus-based analyses of second
language learner usage, frequency effects of construction islands, and
computational modeling of these factors. The authors show in this paper that
social co-adaptation and cognitive processing actively follow from and then feed
into language usage in a manner that facilitates language learning, and at the
same time, reinforces the mapping of human socio-cognition onto the structure of
language. The common thread of the findings here is that form, function,
meaning, context, and usage are all inextricably linked. The paper reports on
corpus data work done by Ellis and Ferreira-Junior which shows that order of
acquisition of verb-argument constructions for second language learners is
affected by usage factors such as input frequency distributions, semantic
prototypicality of verbs, and strength of association within verb islands, much
like the order of acquisition for children. The authors conclude that learner
input is biased toward frequently occurring prototypical forms, thereby
increasing learnability. Later in the paper, the authors report on two
connectionist computer simulations that used learning algorithms to help
describe how verb-argument constructions emerge in language. For all three
verb-argument construction types, the model was able to learn higher frequency,
higher prototypicality verbs first, and then used that learning to bolster
learning of successive verbs. It was also able to use cues from the surrounding
words. The second simulation removed the layer of semantic bootstrapping, and
results showed similar but slower learning. The authors point out that this
work empirically supports the idea that verb-argument constructions, and by
extension, other linguistic constructions, emerge based on category learning and
social co-adaptation through usage. They suggest that natural language systems
themselves evolve through co-adaptive usage into easily acquired patterns.
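The frequency effect at the heart of these findings can be illustrated with a toy simulation. This sketch is entirely hypothetical (invented verbs, thresholds, and distributions; not the authors' connectionist model): verb tokens in the input follow a Zipfian distribution, a verb counts as ''acquired'' once its token count crosses a fixed threshold, and high-frequency verbs therefore tend to be acquired first.

```python
import random

def acquisition_order(n_exposures=2000, threshold=25, seed=7):
    """Toy sketch of frequency-driven acquisition order (hypothetical data).

    Input tokens are drawn from a Zipfian frequency distribution over a
    small invented 'verb island'; a verb is recorded as acquired when its
    cumulative token count first reaches the threshold.
    """
    rng = random.Random(seed)
    verbs = ["put", "give", "send", "place", "hand", "shove"]     # hypothetical island
    weights = [1.0 / rank for rank in range(1, len(verbs) + 1)]   # Zipfian frequencies
    counts = {v: 0 for v in verbs}
    order = []
    for _ in range(n_exposures):
        v = rng.choices(verbs, weights=weights)[0]
        counts[v] += 1
        if counts[v] == threshold:
            order.append(v)
    return order

print(acquisition_order())
```

Even this crude threshold model reproduces the qualitative pattern the paper reports: input skewed toward frequent, prototypical forms yields an earlier acquisition of exactly those forms.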

''A Usage-Based Approach to Recursion in Sentence Processing'' by Morten H.
Christiansen and Maryellen C. MacDonald: This paper theorizes that recursivity
in linguistic constructions is learned and finite rather than part of
potentially infinitely nested grammar structures as suggested in traditional
models. The paper reports on results from simulation modeling and on four
behavioral experiments. These authors simulated acquisition of linguistic
constructions by training a ''Simple Recurrent Network'' to recognize and process
constituents, in the form of various types of right, left, and center-branching
recursion. This indicated to the authors that there is a close interaction
between ''intrinsic architectural constraints'' and input received. The four
human experiments asked participants to judge and rate grammaticality of
sentences containing varying degrees of right, left, and center-branching
recursion. The previous simulations predicted that with increased depth of
recursion, human ratings of acceptability would decrease. Actual results
reported here confirmed that modeling predictions and human processing matched.
The authors conclude that the close match between the connectionist
computational models and human grammaticality judgments challenges the
Minimalist view of recursivity as being bound only by external memory
limitations. They close by observing that this work strongly supports the view
that the ability to use recursion in language is acquired gradually over time
through usage experiences, and therefore, that it is variable across users and
languages, making it an unlikely candidate for innateness or language evolution.

''Evolution of Brain and Language'' by P. Thomas Schoenemann: This paper assesses
the evidence for changes in human cognition and language being co-evolutionary
and argues that such changes are characteristic of complex adaptive systems.
The author points out some important correlates of increasing brain size and the
development of language, namely that brains developed for learning ability,
interactive sociability and social complexity, conscious awareness, and areas of
specialized conceptual processing. The author compares key areas of the human
brain with their homologous areas in other primates and suggests that as higher
level language processing (together with other types of higher level cognition)
evolved in humans, these areas became more developed and interconnected (''biased
evolutionary changes''). He indicates that this points to the inevitable
evolution of semantic processing and processing of complex syntax. The author
concludes that overall increases in brain size allowed for elaboration of
specialized language processing areas of the human brain and for corresponding
increases in cognitive ability.

''Complex Adaptive Systems and the Origins of Adaptive Structure: What
Experiments Can Tell Us'' by Hannah Cornish, Monica Tamariz, and Simon Kirby:
This paper reports on how an unstructured artificial language became
increasingly regularized and learnable through the very process of being learned
by various ''generations.'' This iterative learning showed that language evolves
as an adaptive system. In the experiments outlined here, participants learned
an artificial language as it was paired with pictures. They were then asked to
describe pictures using the language they had learned. Their descriptions were
used as the training material for the next set of learners, whose descriptions
continued the iterative learning in a chain carried on for ten ''generations'' of
learners. The authors indicate that the results coincided closely with results
of computational models of iterated learning in which languages themselves
adapt, through repeated transmission, becoming increasingly learnable and stable
with progressively fewer variants and more regularization. Each generation of
learners unknowingly contributed to the understanding of the succeeding
generation by adapting, regularizing, and selecting certain forms that would be
more likely to continue to be transmitted. The authors state that this showed
that language systems themselves generate emerging structure as they are
learned, used, and transmitted. Finally, the authors suggest that this work has
explanatory relevance for the original evolution of language as well as for the
dynamics of continued transmission of language and language change.
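The iterated-learning chain described above can be caricatured in a short simulation. This sketch uses invented forms and parameters and a deliberately crude generalization bias, not the authors' experimental design: each ''generation'' sees only a bottleneck subset of meaning-form pairs and fills the gaps by reusing the most frequent form it saw, so variants are lost and the language regularizes over generations.

```python
import random

def iterated_learning(n_generations=10, n_meanings=8, bottleneck=5, seed=3):
    """Toy iterated-learning chain (hypothetical language and parameters).

    The language maps meanings 0..n_meanings-1 to forms. Each generation
    observes only `bottleneck` randomly sampled pairs; unseen meanings are
    filled in with the commonest observed form. Returns the number of
    distinct forms surviving after each generation.
    """
    rng = random.Random(seed)
    forms = ["ka", "po", "ti", "mu", "re", "zo", "na", "li"]
    language = {m: forms[m] for m in range(n_meanings)}   # start fully irregular
    distinct_forms = []
    for _ in range(n_generations):
        seen = dict(rng.sample(sorted(language.items()), bottleneck))
        values = list(seen.values())
        commonest = max(set(values), key=values.count)    # crude generalization bias
        language = {m: seen.get(m, commonest) for m in range(n_meanings)}
        distinct_forms.append(len(set(language.values())))
    return distinct_forms

print(iterated_learning())   # the distinct-form count never rises across generations
```

Because each generation can only transmit forms it actually observed, the count of distinct forms is non-increasing, mirroring the loss of variants and growth of regularity reported in the paper.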

''Meaning in the Making: Meaning Potential Emerging From Acts of Meaning'' by
Christian M. I. M. Matthiessen: The paper focuses on the construction of
meaning occurring over the lifetime of individual language users. Similar to
the other papers in this volume, the author invokes the idea of language as a
continuous loop which simultaneously expresses, potentiates, and evolves
meaning, leading to constant learning. The author provides quantified examples
of various types of discourse, illustrating how certain aspects of the language
system contribute to overall probability of learning and continued usage of
those aspects. He shows how these types of ''probabilistic profiles'' interact
with varying registers in various contexts leading to further adaptive variation
in the system itself. Significantly, the author suggests that these
characteristics of adaptability and variability are precisely the conditions
requisite to language learnability, especially as learning occurs over time. He
details the human phases of ''learning how to mean,'' emphasizing that acquisition
of additional register repertoires is a key component of continued learning.
The author reiterates the central point that individuals construct meanings
throughout life by expanding their understanding of registers and that these
personalized meaning potentials collectively create the larger language system.

''Individual Differences: Interplay of Learner Characteristics and Learning
Environment'' by Zoltán Dörnyei: This paper attempts a firmer conceptualization
of the dynamic interplay between learner, language, and environment,
specifically in second language acquisition, focusing on the highly variable
nature of each factor. The author criticizes traditional conceptualizations of
second language acquisition that monolithically distinguish single aspects such
as motivation or aptitude, arguing that it is essential to consider the highly
fluctuating nature of these aspects as they relate to both temporal and
contextual variables. He insists that motivation, aptitude, learning styles,
and other aspects are in constant interplay with each other and with the
changing contexts and environments of language learning. He discusses the
importance of carrying out quantifiable research on the dynamic
interrelationships among language, cognition, social interaction, and
environment. The author maintains that unless all these factors are studied as
an integrated and dynamic complex, analyses will fall into over-simplistic or
even false characterizations. In the last section of the paper, the author
tries to provide direction for overcoming the challenges posed by empirically
studying language acquisition as a dynamic enterprise. Many of the papers in
this book are already attempting to meet the challenge of innovating more
adequate experimental design in the face of the adaptive systems paradigm shift
as suggested by this author.

''If Language Is a Complex Adaptive System, What Is Language Assessment?'' by
Robert J. Mislevy and Chengbin Yin: The last paper encourages a
reconceptualization of language assessment based on the emerging view of
language as a complex adaptive system. The authors, using a connectionist
paradigm, suggest that rather than testing discrete language traits, assessments
should consider interactions among people and situations. They demonstrate the
application of evidentiary argument structure to formulations of language
assessment structure and consider how various assessment configurations might be
reconceptualized in the interactionist approach. The authors criticize the
behavioral approach of many traditional decontextualized test configurations,
claiming that such tests do not adequately predict performance in real usage
situations. They encourage a more realistic approach to testing, that of using
communicative situations to gain more valid observations of learner features
such as appropriateness and effectiveness of language use within a context as
well as fluency and accuracy of use. The authors point out that factors such as
individual profiles of exposure to the language items being tested and previous
experience with task demands can affect test results as much as actual language
proficiency. They argue that tests that target specific structures are bound to
under-represent dimensions of language that are required in real situations and
suggest that tests that simulate real usage situations are more fully able to
provide evidence about students' real-world interactive capabilities in language.


Readers of this book are certain to come away with a markedly increased
understanding, not only of the workings of language but also of current research
innovations within the Emergentist paradigm. All ten of the papers are clearly
written so that those with little previous exposure to this type of work will be
easily engaged and be able to follow the evidence and arguments presented. At
the same time, there is enough technical detail to provoke thought among those
who are already immersed in connectionist endeavors. The research described
here is careful and convincing, yet many of the papers also point out ways in
which future research can be refined, so there is a wealth of ideas for those
hoping to carry out similar studies.

The goal of the editors of this book, to present a ''path-breaking'' perspective
on how to understand language processing and learning, has been amply met. The
chapter contributors do not neglect to emphasize the problems inherent in
dealing with the complexity of linguistic forms, language usage, cognition, and
change and variability. Yet, they clearly point out multiple avenues for
approaching such complex systems in ways that are both valid and viable.

One strength of the book is the continuity of perspective present throughout the
various papers. The papers are unified in their recognition of the same
problems, such as accounting for variability and change, but they represent a
wide range of fields (from sociolinguistics to anthropology to psycholinguistics
to evolution) so that together they are able to build a solid front composed of
mutually supporting research. Before going into the specifics of their
particular project, the authors of each paper provide a short discussion of the
problems and questions involved in re-imagining language from the connectionist
viewpoint; however, this reiteration is done from such a variety of perspectives
that it does not become redundant.

The only minor inconsistency in the book is that the first nine papers are
largely research-based and address the larger problem of conceptualizing
language processing and acquisition while the last paper jumps to a more
practical question, that of language testing. While this topic is certainly
relevant to the paradigm shift suggested in the book, it might have been a
better fit to place the last paper in a collection of work that is more praxis-oriented.

While the emergentist paradigm has been developing for at least the last twenty
years, it has yet to gain full acceptance in the mainstream, so this book is
bound to spark controversy with its clear rejection of traditional paradigms.
Even so, or perhaps because of this, the book is an insightful contribution to
those who are interested in exploring the connections between cognition,
language, and society. It has certainly broken ground for research to come.

Julie Bruch received her Ph.D. in Linguistics from the University of Kansas. She currently teaches Linguistic Diversity, History of English, Structure of English, and Beginning Japanese at Mesa State College in Colorado. Her principal research interests are language and culture, and language change and diversity.

Format: Paperback
ISBN: 144433400X
ISBN-13: 9781444334005
Pages: 200
Prices: U.S. $ 39.95