"In this book, Richard Kern explores how technology matters to language and the ways in which we use it. Kern reveals how material, social and individual resources interact in the design of textual meaning, and how that interaction plays out across contexts of communication, different situations of technological mediation, and different moments in time."
The lexicon is now a major focus of research in computational linguistics and natural language processing (NLP), as more linguistic theories concentrate on the lexicon and as the acquisition of an adequate vocabulary has become the chief bottleneck in developing practical NLP systems. This collection describes techniques of lexical representation within a unification-based framework and their linguistic application, concentrating on the highly topical issue of structuring the lexicon using inheritance and defaults. Topics covered include typed feature structures, default unification, lexical rules, multiple inheritance and non-monotonic reasoning. The contributions describe both theoretical results and implemented languages and systems, including DATR, the Stuttgart TFS and ISSCO's ELU.

This book arose out of a workshop on default inheritance in the lexicon organized as part of the Esprit ACQUILEX project on computational lexicography. Besides the contributed papers mentioned above, it contains a detailed description of the ACQUILEX lexical knowledge base (LKB) system and its use in the representation of lexicons extracted semi-automatically from machine-readable dictionaries.
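The collection's central idea, structuring the lexicon so that entries inherit default information from more general classes and override it only where they are idiosyncratic, can be sketched outside any particular formalism. The short Python sketch below is purely illustrative: it is not the DATR, TFS, ELU or LKB machinery described in the book, and the class and feature names are invented for the example.

    # Illustrative sketch of default inheritance in a lexicon (invented names,
    # not the formalisms described in this book).
    from typing import Optional

    class LexicalClass:
        """A node in the inheritance hierarchy carrying default feature values."""

        def __init__(self, name: str, parent: Optional["LexicalClass"] = None, **defaults):
            self.name = name
            self.parent = parent
            self.defaults = defaults

        def lookup(self, feature: str):
            """Return the most specific value for a feature, walking up the hierarchy."""
            if feature in self.defaults:
                return self.defaults[feature]
            if self.parent is not None:
                return self.parent.lookup(feature)
            raise KeyError(feature)

    # A toy hierarchy: verbs take an '-ed' past tense by default; a specific
    # irregular entry overrides that default with its own idiosyncratic form.
    verb = LexicalClass("verb", cat="V", past_suffix="ed")
    take = LexicalClass("take", parent=verb, past_form="took")  # overrides the default
    walk = LexicalClass("walk", parent=verb)                    # inherits the default

    def past_form(entry: LexicalClass, stem: str) -> str:
        """Use an idiosyncratic past form if the entry specifies one, else the default rule."""
        try:
            return entry.lookup("past_form")
        except KeyError:
            return stem + entry.lookup("past_suffix")

    if __name__ == "__main__":
        print(past_form(walk, "walk"))  # -> 'walked' (inherited default)
        print(past_form(take, "take"))  # -> 'took'   (override wins over the default)

Run as a script, the regular entry picks up the default '-ed' past-tense rule from its parent class, while the irregular entry's own past form takes precedence over the inherited default, the non-monotonic behaviour that motivates default inheritance in the lexicon.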
'A very valuable description of the current work in this field ... should be available in all centres of artificial intelligence.' --Artificial Intelligence Review