Linguistics has become an empirical science again, after several decades during which it was preoccupied with speakers' hazy 'intuitions' about language structure. Through a mixture of English-language case studies and more theoretical analyses, Geoffrey Sampson gives an overview of the new findings and insights into the nature of language that are emerging from investigations of real-life speech and writing, often (though not always) carried out with computers and electronic language samples ('corpora'). Concrete evidence is brought to bear on long-standing questions such as 'Is there one English language or many Englishes?' and 'Do different social groups use characteristically elaborated or restricted language codes?' Sampson shows readers how to use some of the new techniques for themselves, giving a step-by-step 'recipe-book' method for applying a quantitative technique invented by Alan Turing during the World War II code-breaking work at Bletchley Park, and rediscovered and widely applied in linguistics fifty years later.
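The Turing technique alluded to here is Good-Turing frequency estimation, which reserves probability mass for word types not yet seen in a sample. The following is a minimal illustrative sketch, not the book's full 'recipe-book' procedure (which additionally smooths the frequency-of-frequency counts); the function name and the toy sentence are the author's of this sketch, chosen for illustration only.

```python
from collections import Counter

def good_turing(tokens):
    """Minimal (unsmoothed) Good-Turing estimator.

    Returns (p0, probs): p0 is the estimated total probability of
    unseen word types; probs maps each seen type to its adjusted
    probability, using Turing's r* = (r + 1) * N_{r+1} / N_r.
    """
    counts = Counter(tokens)
    n = sum(counts.values())                 # total tokens in the sample
    freq_of_freq = Counter(counts.values())  # N_r: number of types seen r times
    # Probability mass reserved for unseen types: N_1 / N
    p0 = freq_of_freq[1] / n
    probs = {}
    for word, r in counts.items():
        n_r = freq_of_freq[r]
        n_r1 = freq_of_freq.get(r + 1, 0)
        # Fall back to the raw count when N_{r+1} is zero (sparse tail) --
        # the full method smooths the N_r sequence to avoid this.
        r_star = (r + 1) * n_r1 / n_r if n_r1 else r
        probs[word] = r_star / n
    return p0, probs

tokens = "the cat sat on the mat the cat purred".split()
p0, probs = good_turing(tokens)
```

With this toy sample, four of the six word types occur exactly once, so nearly half the probability mass (N_1/N = 4/9) is set aside for words the sample has not yet shown us, which is precisely the kind of inference about unseen language data that the estimator was designed for.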
Sampson asks why the discipline lost its way in the closing decades of the twentieth century, showing how the reliance on 'speaker intuitions' grew out of misunderstandings about the nature of science, reinforced by accidents of publication history. Finally, he discusses the distinction between those aspects of human language which can be investigated scientifically and those which cannot: describing the meanings of words is a different kind of enterprise from grammatical analysis. Taking the empirical scientific method seriously means being serious about its limitations too.