| EDITOR: Antia, Bassey Edem
TITLE: Indeterminacy in Terminology and LSP
SUBTITLE: Studies in Honour of Herbert Picht
PUBLISHER: John Benjamins, Amsterdam
Pius ten Hacken, Swansea University
The book under review is a Festschrift which was apparently intended to be
published on the occasion of Herbert Picht's retirement from the Copenhagen
Business School (xvii), an event which took place in 2005 (xi). As a typical
Festschrift, the volume has a photo of the dedicatee at the start and a
''bibliovita'' compiled by Carolina Popp (217-230) at the end. It starts with a
short foreword by Christer Laurén and an introduction by the editor. Then there
are thirteen articles by distinguished scholars all related in some way to
indeterminacy in language for special purposes (LSP). I will summarize these
articles separately, adding some evaluative remarks (preceded by [PtH]), before
giving a general evaluation of the volume.
SUMMARY AND DISCUSSION OF INDIVIDUAL CHAPTERS
The introduction by Bassey Edem Antia (xiii-xxii) links the role of
indeterminacy in communicative systems to the emergence of relativity and
quantum mechanics in physics and to postmodernism. It also introduces the
Festschrift dedicatee and gives an overview of the chapters.
1. Øivin Andersen's ''Indeterminacy, context, economy and well-formedness in
specialist communication'' (3-14) argues that there is a continuum between nouns
and verbs. Ten prototypical properties each of nouns and of verbs are given,
together with some examples of items that correspond to them to varying degrees.
[PtH] What this article fails to do is to show why the fact that there is some
kind of continuum between nouns and verbs is interesting. It does not specify in
which theoretical framework it operates. The twenty properties used as criteria
are listed without any supporting argument. If ''noun'' and ''verb'' are important
concepts in a theory, one would expect the theory to identify the defining
criteria. If they are epiphenomena, there is no point in discussing their
delimitation.
2. Margaret Rodgers' ''Lexical chains in technical translation: A case study in
indeterminacy'' (15-35) analyzes the expressions used to refer to a particular
medical device, a noise suppressor, in the German, English and French versions
of the user manual for this device. She concludes that there is no one-to-one
correspondence between term and concept, because in the text different
expressions are used, e.g. hypernyms. English has the highest degree of
uniformity and French the strongest tendency to vary the expression.
[PtH] Indeterminacy can be seen in different ways. Without touching on the
question of the indeterminacy of concepts and names for them, this article makes
the valid point that as soon as terms are used, they behave as linguistic
units.
3. Sergej Griniewicz argues in ''Eliminating indeterminacy: Towards linguistic
aspects of anthropogenesis'' (37-47) that in the course of the history of
humanity, vocabulary has continually expanded and semantic syncretism reduced.
He estimates the vocabulary of early humans at 4000-4500 words, much smaller
than that of contemporary humans. Many examples from Old English and Old Russian
illustrate this development.
[PtH] Much of the article is devoted to examples which illustrate individual
cases of the purported phenomenon but cannot provide quantitative support for
it. Convinced by Pustejovsky's (1995) argument against the possibility of
enumerating and listing senses, I see the entire enterprise of counting the
number of senses for a particular word as rather dubious. Indeed, I doubt that
Griniewicz's argument can be supported with quantitative data even in principle.
4. Klaus-Dirk Schmitz's ''Indeterminacy of terms and icons in software
localization'' gives an overview of the problems involved in the choice of terms
and icons in software. In order to localize (translate and adapt) software, a
first condition is that any text that is displayed is separated from the program
code and does not occur in images. In this way it can be translated and adapted
without delving into the program code or producing new images. Transparent and
consistent use of terms and icons is necessary for the empowerment of software
users. Long nominal compounds and culture-specific metaphors are the most
obvious problems with terms. For icons, culture-specific symbols and motivation
by homophony are the most problematic.
[PtH] This article gives an excellent overview of the issues of software
localization. It should be compulsory reading for software engineers and their
managers. I would also use it as background reading in any course on software
localization for translators.
5. Gerhard Budin's ''Epistemological aspects of indeterminacy in postmodernist
science'' (61-71) discusses the role of indeterminacy in the opposition between
modernist and post-modernist science. Post-modernism emerged in reaction to the
exaggerated trust in rationalism. Indeterminacy is not limited to post-modernist
approaches to science, however, because it is also a crucial element of quantum
physics. Indeterminacy and determinacy can both be presented as positive
(openness, perspicuity) and negative (vagueness, bias).
[PtH] It is tempting to compare the analysis in this article with the one in the
introduction. They seem to make contradictory claims. It is somewhat surprising
to read that ''In the 1960s, Chomsky's universalistic, rule-based language theory
was appreciated in this modernist sense. Today, the same Chomsky is one of the
most ardent post-modernist political observers and writers.'' (65). Ten Hacken
(2007) shows that Chomsky's universalism is a constant feature of his linguistic
research program. As for his political position, Chomsky's opposition to the
Vietnam war in the 1960s seems to represent the same attitude as his opposition
to the Iraq war now.
6. Johan Myking argues in ''No fixed boundaries'' (73-91) that it is impossible to
establish a clear dichotomy between specialized communication and other forms of
communication. This dichotomy is central to Wüster's (1985) classical approach
to terminology. Various modern alternatives in the study of terminology have
been proposed as solutions to this problem. They have in common that more
generally applicable methods of linguistics are used together with traditional
tools in the description of terminology.
[PtH] This article is lacking in structural signposts for the reader, making it
difficult to see the coherence of the general argument. For linguists interested
in the opposition between traditional and modern approaches to terminology, it
contains useful insights and further references.
7. Vladimir Leitchnik and Serguey Shelov argue in ''Commensurability of
scientific theories and indeterminacy of terminological concepts'' (93-106) that
theories of science as advanced by Kuhn (1970) and by Paul Feyerabend are wrong
when they claim that different scientific theories are incommensurable. They
claim instead that the polymorphism of natural language semantics ensures
commensurability through fuzziness. Normalized terminology seeks to avoid
polymorphism, but for the top layers of the conceptual hierarchy this is not
possible.
[PtH] The argument for the central claim of this article does not go beyond the
statement that fuzziness solves incommensurability. This claim seems to me
rather dubious. As demonstrated in ten Hacken (2007), incommensurability is the
result of differences in scientific framework. Such a framework (paradigm or
research program) indicates, for instance, what is a good research question and
what is acceptable as an explanation. The fact that terms obtain incompatible
meanings is a consequence of the difference in models. An explanation does not
become more acceptable by making the terms it uses fuzzier. Instead, one of the
main tasks of the framework is to eliminate the fuzziness of terms by creating a
common understanding among researchers.
8. Birthe Toft's ''Concept formation and indeterminacy in the LSP of economics''
(107-117) discusses the term ''equilibrium'' in classical economics. It is part of
a network of terms that can be seen as a root analogy. In a metaphor based on
Newtonian physics, equilibrium can be interpreted in relation to the forces of
supply and demand. Indeterminacy can arise when different people use common
words as terms with conflicting definitions. These definitions may reflect
ideological bias. The use of equilibrium suggests that laissez-faire policies
result in a kind of harmony.
[PtH] This article contains some interesting suggestions, but they are not
elaborated in depth and there is no clear line of argument.
9. Ingrid Simonnæs' ''Vague legal concepts: A contradiction in adjecto?''
(119-134) discusses the nature of vagueness, the nature of concept, and the
nature of legal concepts. The examples of German ''Mensch'' ('human being') and
''Gewalt'' ('force, violence') illustrate
the problems of imposing boundaries on concepts. The vagueness that results can
only be resolved in court. (This article was translated by Benjamin Tyrybon.)
[PtH] The point of this article is fairly simple, but the examples are well-chosen.
10. Reiner Arntz & Peter Sandrini show in ''Präzision versus Vagheit: Das Dilemma
der Rechtssprache im Lichte von Rechtsvergleich und Sprachvergleich'' ('Precision
versus vagueness: The dilemma of legal language in the light of comparative law
and comparative linguistics') (135-153)
that there are two problems for the terminological correspondence in law,
differences in language and differences in legal system. The question of
precision and vagueness is particularly interesting in the case of parallel
texts, e.g. in Switzerland. In the EU, the European Court tends to consider all
versions of a text rather than taking one of them as binding. In the translation
of international legal texts, the use of existing terms in national law for new
concepts created by international organizations should be avoided.
[PtH] This article contains various interesting details but does not make a
clear point. Unlike the rest of the volume, it is in German.
11. Sue Ellen Wright's ''Coping with indeterminacy: Terminology and knowledge
representation in digital environments'' (156-179) gives an overview of knowledge
representation resources. It starts with a mindmap, which divides knowledge
representation resources into prototypical knowledge organization systems such
as termbases and ontologies, and other knowledge representation resources. This
is followed by a brief description of the individual types.
[PtH] This article provides a useful systematic overview. Individual resource
types are discussed in varying degrees of detail, but many useful references make
up for the lack of detail in some sections.
12. Bodil Nistrup Madsen's ''Ontologies and indeterminacy'' (180-198) presents
CAOS, a Computer-Aided Ontology Structuring system. A terminologist using CAOS
can enter statements about the hierarchy and subdivision of concepts, which the
system combines into a coherent ontology. In order to reduce indeterminacy, CAOS
imposes a number of formal constraints on the use of criteria for subdividing a
concept in the hierarchy. It has sometimes been claimed, e.g. by Temmerman
(2000), that such an approach is not appropriate for natural sciences, because
of the indeterminacy of concepts. However, this apparent indeterminacy arises
only because there exist different, incompatible ontologies, each of which can
be described according to the organizing principles of CAOS.
[PtH] The constraints in the system described in this article constitute a good
background for the exploration of indeterminacy. The claim about the source of
apparent indeterminacy is prima facie plausible and it would therefore be
interesting to see to what extent it can be substantiated by encoding the
competing systems in CAOS.
13. Anita Nuopponen's ''Terminological modelling of processes'' (199-213) applies
a hierarchical model for the analysis of processes, developed in her earlier
work, to a particular version of the Japanese tea ceremony. It turns out that not all
branches of the model, given in an appendix, are needed in the description and
that the model has to be expanded by additional labels at the lower levels.
[PtH] The main weaknesses of this article are the definitions of fundamental
concepts and the degree of embedding in general research. The definitions of
''process'', ''procedure'', and ''ceremony'' are taken uncritically from Wikipedia
and online general language dictionaries. Some of the distinctions in the
author's system are not described in sufficient detail to understand them
without consulting the author's earlier presentations. Conversely, no mention of
alternative systems (e.g. FrameNet, Fontenelle 2003) is made.
The genre of the Festschrift is unusual in the domain of scientific publication
because the contributions are chosen on the basis of their authors rather than
their content. The main criterion is to collect pieces of work from people
related to the scholar the Festschrift is dedicated to. In many cases, including
the book under review, a theme is chosen to increase the overall coherence of
the volume. This puts the contributors under pressure to produce something that
relates to the theme as well as to the work of the dedicatee of the Festschrift.
This situation produces a number of factors contributing to the variety of the
articles. Whereas there is a theme as indicated in the title of the volume,
contributors take various aspects of the theme as a basis and develop them in
different directions. In some contributions, indeterminacy is the central topic
whereas in others it seems to be a kind of add-on. Some contributions are of a
good standard of quality, whereas others would probably not have been selected
in other contexts.
It must have been the editor's intention to increase the coherence of the volume
by dividing the different contributions into three parts. Part 1 (chapters 1-4)
is headed ''Indeterminacy: Lexical perspectives'', part 2 (chapters 5-10)
''Indeterminacy: Epistemological perspectives'', and part 3 (chapters 11-13)
''Indeterminacy: Modelling perspectives''. Part 4 is the bibliovita of Herbert
Picht. In my perception, the titles of the parts do not apply equally well to
each of the contributions within them and it is hard to make any generalizations
about the contributions in a particular part.
For a Festschrift, one category of buyers is not swayed by any review of its
contents: they buy the volume because of the person it is dedicated to.
Despite the impressive line-up of contributors, other people are unlikely to be
interested in the full set of papers. Also in view of the price (EUR 99, USD 134),
the most likely buyers are libraries. It is therefore convenient that references
are at the end of each contribution rather than collected at the end of the volume.
Fontenelle, Thierry (ed.). (2003) Special Issue: FrameNet and Frame Semantics.
_International Journal of Lexicography_ 16:231-366.
ten Hacken, Pius. (2007) _Chomskyan Linguistics and its Competitors_. London:
Equinox.
Kuhn, Thomas S. (1970) _The Structure of Scientific Revolutions, Second Edition,
Enlarged_. Chicago: University of Chicago Press (orig. 1962).
Pustejovsky, James. (1995) _The Generative Lexicon_. Cambridge (Mass.): MIT Press.
Temmerman, Rita. (2000) _Towards New Ways of Terminology Description: The
Sociocognitive Approach_. Amsterdam: Benjamins.
Wüster, Eugen. (1985) _Einführung in die allgemeine Terminologielehre und
terminologische Lexikographie_. København: Handelshøjskolen. [This book has been
published in various editions]
ABOUT THE REVIEWER
Pius ten Hacken is senior lecturer in linguistics and translation at the
Department of Modern Languages of Swansea University. He is the author of
_Chomskyan Linguistics and its Competitors_ (London: Equinox, 2007) and the
editor of _Terminology, Computing and Translation_ (Tübingen: Narr, 2006).