LINGUIST List 2.75

Saturday, 16 Mar 1991

Disc: Cognitive Linguistics

Editor for this issue: <>


Directory

  1. , more on the cognitive issue
  2. Robert Goldman, cognitive linguistics; Linguist list
  3. "Michael Kac", Syntax
  4. George Berg, Poser (was: cognitive linguistics)
  5. Margaret Fleck, modularity?
  6. , cognitive linguistics
  7. Susan Newman, re: Cognitive Linguistics
  8. Fred, Cognitive Linguistics

Message 1: more on the cognitive issue

Date: 13 Mar 91 11:06:52 EST
From: <JASKEbat.bates.edu>
Subject: more on the cognitive issue
 In view of the several lengthy and unanswered dissertations
that have been posted to this list about the merits or demerits
of the appropriation of the name Cognitive Linguistics, I would
like to put in my little grain of sand, for whatever it's worth.

 First of all I should say that I must in part agree with
those who think this appropriation is self-righteous and not
likely to help cooperation or understanding among the different
`schools' that claim to be doing cognitive linguistics. To this
I must add that I see this action in the unfortunate context of
arrogance and self-righteousness on the part of everybody in the
field, and certainly on the part of many of those who say they're
doing autonomous or formal linguistics.

 Anyway, what I wanted to respond to, though, were some scholars who have
been writing on this subject here but who seem to have a very mistaken idea of
what non-formalist linguists are doing when they take time off from trying to
undermine (non-hyphenated?) linguistics. They say things like "Well,
'functionalists' can get away with saying that language is such and such
because they haven't started looking at the real stuff of language, like
syntax and morphology and phonology; they just look at certain semantic and
pragmatic phenomena and conclude that language is such and such." I don't know
how widespread this impression is among formal linguists - though probably
quite widespread - but those who hold it should really start waking up from
such a silly and nonsensical fantasy.

 I find it extremely ironic that it is people who
concentrate on a very narrow spectrum of linguistic phenomena,
what they call core linguistics, who say things like
that. Don't they think that the other 99% of language that they
don't deem autonomous enough to bother with is represented in
cognition too? Don't they realize that it is because they look
at 1% of language outside the context of the other 99% that they
think that language is as strange and bizarre as they claim it
is, all cold, encapsulated, and meaningless?

 Let's get serious. As somebody else said, let's stop
bickering and get back to work. The problem is that when I
get this kind of mail I get all worked up! Let's hope that it
subsides soon.

 Jon Aske
 Linguistics Dept.
 UCBerkeley

Message 2: cognitive linguistics; Linguist list

Date: Wed, 13 Mar 91 09:46:09 -0600
From: Robert Goldman <rpgrex.cs.tulane.edu>
Subject: cognitive linguistics; Linguist list
As an AI person, I agree with you that the questions that AI NLP
handles are ENGINEERING questions, rather than LINGUISTICS questions,
per se, and I concede that this distinction is overlooked far too
often.

However, I have no choice but to squawk at the ridiculous straw man
you've made of Schank's views (this is straw man (a) in your note, I
believe). I'm by no means convinced by them, but even *I* wouldn't
claim that a Schank-style semantic (or expectation-driven) parser
would IGNORE the linear order of words in its input.

In fact, when a semantic parser sees `Mary' it tries to retrieve a
corresponding memory concept. When it reads `saw', it retrieves a
corresponding conceptual representation which looks for a subject, and
sets up an EXPECTATION for an object. When `John' comes along, the
appropriate conceptual representation is identified as the patient of
the seeing.

So such a parser IS capable of distinguishing the two sentences

 (1) Mary saw John.

 (2) John saw Mary.

One must at least appeal to the passive construction to point out
problems with this approach! Off-hand, I've forgotten how this
construction is handled.
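To make the mechanism concrete, here is a minimal sketch of the
expectation-driven idea in Python. It is purely illustrative -- the toy
lexicon, the slot names, and the parse() function are my own inventions for
this note, not Schank's actual Conceptual Dependency code:

 # Toy lexicon: nouns map to memory concepts, verbs to conceptual frames
 # listing the slots they expect to fill.
 NOUNS = {"Mary": "MARY", "John": "JOHN"}
 VERBS = {"saw": {"predicate": "SEE", "slots": ["actor", "patient"]}}

 def parse(sentence):
     """Process words left to right, filling expectations as they arise."""
     frame = None          # conceptual frame built so far
     pending = []          # slots still awaiting a filler (the expectations)
     last_concept = None   # most recently retrieved memory concept
     for word in sentence.split():
         word = word.strip(".")
         if word in NOUNS:
             concept = NOUNS[word]
             if frame is not None and pending:
                 # An expectation is outstanding: this concept fills it.
                 frame[pending.pop(0)] = concept
             else:
                 # No frame yet: hold the concept for the coming verb.
                 last_concept = concept
         elif word in VERBS:
             entry = VERBS[word]
             frame = {"predicate": entry["predicate"]}
             pending = list(entry["slots"])
             if last_concept is not None:
                 # The concept already seen fills the first slot (the actor).
                 frame[pending.pop(0)] = last_concept
     return frame

 print(parse("Mary saw John."))  # {'predicate': 'SEE', 'actor': 'MARY', 'patient': 'JOHN'}
 print(parse("John saw Mary."))  # {'predicate': 'SEE', 'actor': 'JOHN', 'patient': 'MARY'}

Even this toy version distinguishes (1) from (2), because the linear order of
the words determines which concept is waiting when the verb's expectations are
set up.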

For a less dismissive review of semantic parsing, there's an entry on
"Parsing, expectation-driven" in the Encyclopedia of AI, Shapiro ed.,
Wiley, 1990.

Robert Goldman

Message 3: Syntax

Date: Wed, 13 Mar 91 20:06:30 -0600
From: "Michael Kac" <kaccs.umn.edu>
Subject: Syntax
William Poser in his recent comment refers to a certain well-known AI
researcher who denies the relevance of syntax to natural language
understanding. (Presumably he means Roger Schank -- why be coy about it?)
Anyway, it seems to me that one of the most interesting consequences of David
Caplan's work on disordered syntactic comprehension is the extent to which it
shows just how important syntax actually is in human language understanding.
However you choose to formalize it, the distinction between 'raising' and
'control' predicates, for example, turns out to be significant in that there
are aphasics who have problems with the former but not the latter and ones for
whom the reverse is true.

There are other kinds of evidence that can be adduced as well, from common
experience. One kind involves ambiguous sentences in which, even though there
is substantial semantic/pragmatic bias that would lead you to expect
otherwise, there is a strong preference for the anomalous interpretation. My
favorite comes from a New Yorker newsbreak consisting of an article about
Princess Anne which ends with the sentence *The daughter of Queen Elizabeth
and her horse finished third in the competition*. There seem to be a lot of
people out there -- linguists included, I might add -- who subscribe to the
view that ambiguities are (a) always resolvable if there's sufficient context,
and (b) resolved in the direction of what the contextual bias suggests. This
is clearly not so.

There are actually two issues addressed in the preceding paragraph. One has
to do with the view that context has a certain kind of power, which it
sometimes does and sometimes doesn't. The second has to do with the role of
syntax in understanding. There's clearly a component to that process which
involves grouping smaller expressions into larger ones, which evidences itself
rather dramatically in examples like the one cited above.

Michael Kac

Message 4: Poser (was: cognitive linguistics)

Date: Thu, 14 Mar 91 12:37:41 EST
From: George Berg <bergcs.albany.edu>
Subject: Poser (was: cognitive linguistics)
 In his recent contribution to the "cognitive linguistics" debate I think
that Bill Poser is inadvertently misrepresenting the amount of linguistic
sophistication among researchers in artificial intelligence (AI) and
artificial neural networks ("connectionism"). 

 Although he doesn't say who his "well-known figure in Artificial
Intelligence" is, I assume that Poser is talking about the work of Roger
Schank in the early-to-mid 1970's - his Conceptual Dependency research. I
think it's a red herring when he says that "in its pure form" CD is incapable
of distinguishing between the subject and object of a sentence. In its
*actual* form, it is quite capable of doing so. Take a look in the literature,
esp. Schank and
Riesbeck's "Inside Computer Understanding". Illustrated there is the actual
code for programs which, using Conceptual Dependency, correctly process (with
admittedly limited coverage) sentences into his semantic representation.
Although not syntactically sophisticated, it is adequate to do what he wanted.

 Also, Poser seems to misunderstand the importance of what Schank was saying
and doing. He was reminding us that there is more to language than syntax.
Especially if you are interested in building models of natural language
processing for either AI or cognitive science purposes, you *must* address
semantic and pragmatic issues. Even if you choose to disregard Conceptual
Dependency, you cannot dismiss the positive influence that Schank has had on
AI, computational linguistics and cognitive science. His work may have had
little impact on theoretical linguists, but that's because they're not
currently examining the same issues. 

 By the way, unless I am mistaken, Schank got his degree in linguistics. I
doubt that he is "rather naive about language and unacquainted with
theoretical work". Also, in a scientific discussion, I think it is important
to cite work by name. If you feel you shouldn't, your case is probably weak
enough that you shouldn't state it at all.

 I can't identify Poser's "well known figure in work on neural networks"
offhand (maybe McClelland?). It may well be that his/her talk abstract said
that the network "learned syntax" when all it could do was recognize the
distinction between transitive and intransitive verbs, but most credible
researchers doing connectionist-based work in natural language processing do
have a reasonable knowledge of linguistics. They might not subscribe to the
orthodox views in theoretical linguistics, but they know the issues
(especially that there is more to syntax than the difference between
transitive and intransitive verbs). If Poser doubts this, he should talk to
some of the researchers who will be at the AAAI Spring Symposium on
Connectionist Natural Language Processing, which will be at Stanford 3/26-28.

 Poser goes on to say:
 To most linguists such claims are so far out as to look
 like the work of cranks, but their proponents are not
 regarded as cranks by people in AI and psychology. The
 gulf in knowledge of language between linguists and
 non-linguists is huge.

 Linguists should give the poor "non-linguists" some credit for intelligence.
The reason that people such as Schank and others are not viewed as cranks is
that they are not. It is easy to misrepresent people's views (any of us can be
made to look like a crank if misrepresented). Schank points out the
relationship between syntax and the rest of linguistic knowledge and
abilities. Looking into the AI literature on natural language processing, you
find intelligent discussion of the linguistic issues (even a GB-based approach
on occasion). Researchers using artificial neural networks are currently
wrestling with representing and processing structured information, of which
syntactic relationships are a prime example (cf Pollack, Van Gelder,
Chalmers). Those who deal with language are, by and large, knowledgeable in
the areas of linguistics relevant to their work. 

 In short, rather than setting up straw men, and in the process widening an
"us vs. them" gap between linguists and researchers in AI and artificial
neural networks, we should be examining the work of those people who cross those
gulfs (e.g. Berwick, Marcus) to see what we all can learn from them.

 George Berg

Message 5: modularity?

Date: Thu, 14 Mar 91 16:20:45 GMT
From: Margaret Fleck <fleckrobots.oxford.ac.uk>
Subject: modularity?
I think that Bill Poser has gotten a rather strange picture of what
goes on in computer vision. Perhaps, like all of us, he has to
construct his models of other areas of Cog Sci/AI from a handful of
lab-mates plus a few media personalities (like Chomsky). Computer
vision is roughly analogous to computational linguistics, plus speech
and the language bits of AI. We do have a slightly larger supply of
engineers (though speech has a fair number), but they are by no means
the whole field. Many people in vision would be surprised and
offended at his claim that they don't care about human processing,
and slightly miffed to be regarded as part of "AI." To make an exact
parallel to the membership of this list, though, you might want a mix
of computer vision people and the visual psychophysicists with whom
they collaborate.

I have two reasons for viewing visual and linguistic processing as
potentially related. First, the low-level parts of both fields
perform analogous interpretation of similar types of sensory input:

 speech                           low-level visual processing: edge finding,
                                  stereo matching, texture analysis

 computational phonology,         high-level visual processing: shape
 morphology, and syntax           representation, segmentation, object
                                  recognition

Historically, speech and low-level vision have occasionally shared
algorithms, e.g. for removing noise. There has been little contact at
the higher levels, except that both provide toy examples to the neural
nets people. However, I think it is an open question whether
segmenting an image into objects is done in a similar way to
segmenting a speech waveform into morphemes. Answering this question
would seem to require communication between those who understand the
details of the two types of low-level processing.

Second, there needs to be some interface between (high-level) vision
and (mid-level) linguistic processing. For example, to learn the
meaning of the word "pear," a child must isolate the right string of
phonemes, also isolate a description of the shape, color, texture of
the object involved, and associate the two. Although this might be
done by feeding the output of early visual and linguistic processing
through some elaborate mechanism of high-level cognition, the simplest
hypothesis would be that the two sensory processing modules
communicate directly at the level of (low-level) semantics.

If this were true, the output of high-level vision and the
visual/shape parts of low-level semantic representations would have to
be written in a common language. This might potentially be a big
source of constraint for constructing theories in the two fields.
Perhaps not all visual information is "linguistically relevant," but
numeral classifiers illustrate that at least some shape information
is. Unfortunately, there is no good formal model for describing shape.
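
Just to make the hypothesis concrete (and not as a claim about any existing
system), here is one way that interface could be written down; every name in
this sketch -- ShapeDescription, LexicalEntry, associate -- is invented for
illustration:

 from dataclasses import dataclass

 @dataclass
 class ShapeDescription:
     """Output of (hypothetical) high-level vision for one object."""
     outline: str   # e.g. "round body narrowing to a stem"
     color: str
     texture: str

 @dataclass
 class LexicalEntry:
     """A (hypothetical) low-level semantic record pairing sound and shape."""
     phonemes: str               # isolated from the speech stream
     referent: ShapeDescription  # isolated from the visual scene

 def associate(phonemes: str, percept: ShapeDescription) -> LexicalEntry:
     # The point of the hypothesis: this pairing happens directly between
     # the two sensory modules, without a detour through general cognition.
     return LexicalEntry(phonemes, percept)

 pear = associate("pear", ShapeDescription("round body narrowing to a stem",
                                           "green", "smooth"))
 print(pear.phonemes, "->", pear.referent.outline)

The interesting constraint is then what has to go into the shape description
for it to serve both sides, which is exactly where the lack of a good formal
model of shape hurts.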

If you really want to get me going, I could also point out that two
common methods of linguistic input (reading and sign language) are
done via the visual system. And that, although vision is more heavily
general-purpose than speech, there do exist conventionalized systems
of visual symbols (e.g. airport signs, road signs, conventions for
presenting graphs and mathematical diagrams, facial expressions) and
some general-purpose sorts of auditory processing. If there are going
to be similarities between language and some other sort of processing,
this seems as good a candidate as any.

On a different note, I would generalize Poser's claims about ignorance
to a general statement that most researchers in Cog Sci/AI are
painfully clumsy outside their home field. It is hard to watch a
linguist building a mathematical model or a computer scientist
building a linguistic one without feeling like taking the pen or
keyboard away from them. Optimistic planners in Cog Sci/AI seem to
have grossly underestimated how difficult it is for researchers in
different areas to communicate with one another.

Margaret Fleck

Message 6: cognitive linguistics

Date: Thu, 14 Mar 91 15:55:52 EST
From: <Alexis_Manaster_RamerMTS.cc.Wayne.edu>
Subject: cognitive linguistics
If I may weigh in, I think that everybody has the right to call
their theory or approach whatever they want, provided the term
has not been preempted by someone else. For example, most people
use the term 'formal linguistics' to refer to work which is not,
strictly speaking, formal, 'generative' is often used to name
work which is not, strictly speaking, generative, and so on.

Alexis

Message 7: re: Cognitive Linguistics

Date: Thu, 14 Mar 1991 11:58:54 PST
From: Susan Newman <snewmanparc.xerox.com>
Subject: re: Cognitive Linguistics
Re: Daniel Everett's remarks on cognitive linguistics (Date: 13 Feb
91):

I am intrigued by your comment that work such as that represented in
WFDT may not be as widely represented in LSA as other work because its
argumentation style doesn't lend itself to falsification a la Popper.
Could you say a bit more about what you mean here? I am interested
because I think similar mismatches in argumentation are at stake in
other areas of cognitive science, as well as because I study
argumentation as a key (social and cognitive) tool for human knowledge
construction. By the way, as a linguistically oriented cognitive
scientist (though not a linguist) I would by no means agree to being a
working Popperian, nor to the implication that the only possible
alternative is Feyerabend.

Susan Newman

Message 8: Cognitive Linguistics

Date: Fri, 15 Mar 1991 10:43:01 -0500
From: Fred <YOUNGHEEvm.epas.utoronto.ca>
Subject: Cognitive Linguistics
I have a pet beagle named Fred who thinks he's a linguist and who
sometimes reads my e-mail. After sniffing around for a while
yesterday, he left me the following note:

---------
Dear Colleague:

It is clear from the tenor of the discussions surrounding the use
of the term 'cognitive linguistics' by the members of one of the
research communities of linguists to describe what they do that a
political nerve has been touched. Those, like Fromkin and
Pesetsky, who might describe themselves as doing 'autonomous
linguistics' claim that they have strong empirical support for
their position, and indeed they do. But it is incorrect to
infer, as they seem to, that the hypothesis of autonomy is an
empirical hypothesis. In the first place that hypothesis has
never been precisely formulated, and where it has been invoked
it's been so vague as to be untestable. The reason of course is
that it has never been meant to be tested, nor can one even
imagine a single crucial experiment which, for any committed
autonomist, could falsify the hypothesis. Where theoretical
predictions made by autonomists have been shown not to accord
with the facts, the theory has been revised or the facts have
been ignored (or said to be outside the domain of grammar, or to
be covered by ceteris paribus clauses). I do not mean to suggest
that such behavior speaks to the poverty of autonomous
linguistics; as Kuhn and Lakatos have argued, that kind of
behavior may be pretty much the *very best* that can be expected
in science.
 For Poser to claim that there have been no attempts at
nonautonomist explanation for locality principles no doubt says
more about what he has read and what he is prepared to
accept than about what has been done. This is not to say that there is
an equally convincing nonautonomist analysis for every autonomist
one (or vice versa), but there surely have been enough unanswered
successes in both camps to make each hypothesis independently
plausible, given our current state of knowledge. (Since the
autonomy question has been central in linguistics now for about
twenty-five years, this says something not terribly encouraging
about the rate of progress we can expect in our field.)
 The fact is that no one knows - Fromkin doesn't know,
Pesetsky doesn't know, Lakoff doesn't know, and you and I don't
know - whether an autonomous solution to the distributional facts
of language is required or viable, or indeed what in the end
autonomy will even mean. It is good that there are now two
research programs with at least partially conflicting goals; if
the Pesetskys and Fromkins pay attention to what the Lakoffs and
Fillmores and Langackers are doing, and vice versa, so much the
better for the prospects for our science. One reason I doubt
such attention has been or will be paid is that there's all this
political tooting going on around the various analyses. And it's
when we confuse the tooting for the real results of our work that
we're in trouble.
 As to the name 'Cognitive Linguistics', I suppose I'd be
more sympathetic to the point of view expressed by Fromkin if she
complained equally about the usurpation of the terms
'generative', 'transformational', and so on by others. Sure,
it's a political move by the Cognitive Linguists, but not without
precedent. It was, after all, not so long ago that Chomsky
referred to his theory as the 'Extended Standard Theory' (a term
coined by Haj Ross, I believe). I'd recommend that you not get
involved in this dispute.
 Yours,
 Fred