Lexicology

Wörterbücher zur Sprach- und Kommunikationswissenschaft (WSK)
Edited by Schierholz, Stefan J. / Wiegand, Herbert Ernst
Theories and Methods in Linguistics. Editor: Bernd Kortmann. 2012
DOI: 10.1515/wsk.35.0.lexicology

lexicology
Hans-Jörg Schmid

linguistic discipline dealing with the synchronic and diachronic analysis and description of the structure of the lexicon as well as of individual lexical items

Lexikologie [German headword]: the linguistic discipline concerned with the synchronic and diachronic analysis and description of the structure of the lexicon as well as of individual lexical items

1. Lexicology as a linguistic branch

Lexicology has been introduced more recently, and is still accepted less widely as a linguistic discipline, than the other core-linguistic branches, presumably because it cuts across phonology, morphology, semantics and even syntax. However, the very place of the lexicon at the intersection of these central components of grammar, and the growing awareness of the importance of the lexicon for theories of grammar as a whole, have given rise to the need to establish a linguistic discipline specialising in the investigation of lexical structures.

The field of lexicology is often seen as encompassing three areas: a) the study of lexical structures and relations (e.g. word fields, synonymy, collocations); b) the study of word meanings (i.e. lexical semantics); c) the study of the morphological structure of words and of the regularities underlying the coining of new words (i.e. word-formation). Monographs on lexicology include Tournier (1988), Hansen, Hansen, Neubert and Schentke (1990), Schippan (1992), Lutzeier (1995), Blank (2001), Lipka (2002) and Pöll (2002). Geeraerts (2010) provides a survey of theories of lexical semantics.

By and large, theories and methods in lexicology can be subsumed under the labels structuralist, pragmatic, cognitive and psycholinguistic. The major difference between the first three types of approaches arises from assumptions concerning the nature and locus of meaning: structuralist theories see meaning as residing in the differences between related lexical items or in the distinctive semantic features that are responsible for and emerge from these differences; pragmatic theories look for the meanings of words in their use; and cognitive approaches reject the separation between linguistic and non-linguistic conceptual knowledge and argue that word meanings reflect cognitive categories and relations. Psycholinguistic approaches focus on that part of the cognitive system which is metaphorically referred to as the mental lexicon and model its structure and the cognitive mechanisms related to lexical processing. The focus of this article will be on content-related rather than form-related issues in lexicology, on functional rather than formal approaches, and on the synchronic rather than the diachronic perspective.

2. Structuralist approaches

The foundation of structuralist approaches in lexicology is the insight, usually attributed to Ferdinand de Saussure, that the lexicon is not a haphazard list of words but a structured system of signs. The value or significance of these signs is not seen to be inherent in them but considered to be constituted by their differences from other signs. In short, signs are of a differentiating rather than substantive nature. This theoretical axiom also lies at the heart of structuralist methods in lexicology, which consequently rely very much on the comparison of lexical items as a heuristic tool.
The main practical method in most structuralist approaches is the introspective comparison of semantically related lexical items with the goal of identifying contrasts. Specifically, lexical items from more or less closed sets are selected and compared with regard to identifiable formal and/or semantic differences. It is characteristic of structuralist approaches in lexical semantics that they are explicitly language-immanent and insist on separating sense from (the act of) reference and linguistic meaning from encyclopaedic knowledge.

2.1. Word-field theory

One of the earliest lexicological theories heavily influenced by structuralist thinking is word-field (or lexical field) theory. Protagonists in this paradigm include Jost Trier, Walter Porzig, Leo Weisgerber, Eugenio Coseriu, Horst Geckeler, Peter Rolf Lutzeier and Leonhard Lipka. In addition to the basic structuralist tenets outlined above, early word-field theory claims that the lexicon consists of mosaic-like structures which do not tolerate gaps. As a consequence, a change in the sense of one lexeme has consequences for other lexemes, too, as the whole word-field must be re-arranged. In contrast to other structuralist approaches such as componential analysis, word-field theory emphasizes the existence of larger units in the lexicon and focuses on the nature of their structure. Word-fields are assumed to reach their boundaries when, metaphorically speaking, lexemes run out of oppositions to other lexemes.

2.2. Sense relations

Semantic relations like oppositions are the focus of theories of paradigmatic (or sense) relations proposed by semanticists such as John Lyons, Geoffrey Leech and D. Alan Cruse. Lyons (1977: 206), for example, explicitly defines the sense of a lexeme as a relationship between the words of a language, independently of its denotation or potential use in referring expressions. He identifies a range of sense relations, including synonymy, hyponymy and various types of oppositions (e.g. polar opposition, converseness, complementarity and non-binary contrasts), which have been developed further by Leech (1981), Cruse (2000) and others.

2.3. Syntagmatic relations

Lexicological approaches focusing on paradigmatic relations are complemented by those which emphasize the role of syntagmatic relations. The insight – already formulated by de Saussure – common to all these approaches is that the recurrent co-presence of lexical items in discourse creates stored associations which then form a part of the meanings of the individual lexical items. Theoretical notions proposed to capture this insight include Walter Porzig's (1934) wesenhafte Bedeutungsbeziehungen ['essential meaning relations'], Eugenio Coseriu's (1967) lexical solidarities and the notion of collocation associated with J.R. Firth (1957). It is the latter notion which has gained the widest currency and has been refined most methodologically in corpus-based investigations (cf. Stubbs 1995, Mukherjee 2009: ch. 4.2).
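By way of illustration, the following is a minimal sketch of how collocates might be identified in a corpus; it is not taken from the studies cited above. It assumes a pre-tokenized corpus, a symmetric window of two words around the node word, and pointwise mutual information (PMI) as the association measure; the function name, parameter settings and toy corpus are invented for this example.

```python
import math
from collections import Counter

def collocations(tokens, node, window=2, min_freq=2):
    """Rank collocates of `node` by PMI within a +/- `window` word span."""
    word_freq = Counter(tokens)
    pair_freq = Counter()
    for i, w in enumerate(tokens):
        if w != node:
            continue
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pair_freq[tokens[j]] += 1
    n = len(tokens)
    scores = {}
    for w, f in pair_freq.items():
        if f < min_freq:
            continue
        # PMI compares observed co-occurrence with co-occurrence expected by chance.
        scores[w] = math.log2((f / n) / ((word_freq[node] / n) * (word_freq[w] / n)))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy usage: collocates of "tea" in a tiny invented corpus.
corpus = ("she drank strong tea he drank strong tea too "
          "they drank weak coffee strong tea was served").split()
print(collocations(corpus, "tea"))
```

In actual corpus work, much larger corpora, stop-word or part-of-speech filtering and alternative association measures such as the t-score or the log-likelihood ratio are commonly employed.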
2.4. Feature semantics

Feature semantics is less interested in structures involving several lexemes and the relations between them; rather, it seeks to decompose the meanings of individual lexical items into atomic semantic elements referred to as distinctive semantic features or semes. These elements are held to constitute the meanings of lexemes. The method of comparing similar lexical items, which was inspired by Louis Hjelmslev's glossematic approach and Roman Jakobson's Prague School phonology, is more a heuristic tool than an end in itself. Just as comparing the features of phonemes yields distinctive phonological features such as [voiceless] and [bilabial], the juxtaposition of lexical items produces distinctive semantic features, e.g. [human] or [animate]. A third source of inspiration for feature semantics was the componential analysis of, for example, kinship terms in anthropology by Floyd Lounsbury. Early proponents of this approach include Bernard Pottier, Algirdas Greimas and Eugene Nida. Leonhard Lipka (2002: 127-132) refines feature semantics by introducing several types of features, among them inferential features, which denote optional semantic aspects that can be inferred from use and can account for the variability of word meanings. The feature-semantic approach promoted by Anna Wierzbicka in many publications (e.g. 1996) is marked by her explicit aim to establish an exhaustive list of atomic semantic features or semantic primitives, which are both necessary and sufficient to account for the meanings of words in the world's languages.

2.5. Structuralist approaches to word-formation

In the field of word-formation, structuralist ideas play a key role – together with American structuralism and the transformational-generative paradigm – in the Tübingen school initiated by Hans Marchand (1969) and further developed by Ernst Brekle, Klaus Hansen, Dieter Kastovsky, Leonhard Lipka, Gabriele Stein and others (see Kastovsky 2005 for a summary). Marchand stressed the systematic nature of word-formation types, on the levels of both form and meaning. Many of the cornerstones of Marchand's approach, e.g. the notion of the syntagma and the focus on the motivation behind products of word-formation, were derived from the work of Saussure and his followers. Structuralist ideas also lie at the heart of many approaches to German word-formation, cf. e.g. Fleischer and Barz (2007).

3. Pragmatic approaches

Structuralist approaches are reductionist in that they abstract from individual usage events to reveal the systematic nature of lexical structures and meanings. Pragmatic approaches, by contrast, are united in (a) their explicit rejection of the postulate of the language-immanence of linguistic meaning and the autonomy of linguistic semantics, and (b) their preference for a position which sees linguistic meaning as emerging from, or even being constituted by, language use. Major arguments in favour of this view are the recognition that word meanings seem to be vague, variable and highly context-dependent, and the resulting claim that the idealization required for the structuralist position does not do justice to their versatility.

A key representative of such a meaning-is-use theory is the philosopher Ludwig Wittgenstein, who in his later work (Wittgenstein 1953) emphasized that the meaning of a word is nothing other than its use in the language. As a consequence, Wittgenstein insisted that the best method for identifying the meanings of words was not to ponder on their intension but to actually 'look', as he emphasizes, at their extension in actual usage events. One of his examples, the concept of game, has been influential in spreading not only the meaning-is-use theory of meaning but also the notion of family resemblances, which was taken up by cognitive approaches in the framework of prototype theory (see section 4 below).
A second, less well-known proponent of a pragmatic approach in lexical semantics is the Swiss linguist Ernst Leisi (1985), who defines meaning as being constituted by the set of conditions under which a lexical item can be used appropriately.

Pragmatic approaches in the area of word-formation stress the context-dependency, instability and semantic unpredictability of word-formation patterns and products (cf. Bauer 1979). Pamela Downing (1977), for example, discusses so-called deictic compounds such as apple-juice seat, whose main or even exclusive function is a one-off act of pointing to a certain referent in a given situation. Eve and Herbert Clark (1979) introduce the notion of contextuals to account for the context-dependent nature of innovative denominal verbal conversions in English. With regard to methodology, unlike structuralist approaches, pragmatic approaches rely on the empirical observation of actual usage events as a basis for lexicological analysis and description.

4. Cognitive approaches

Cognitive approaches in lexicology emerged in the late 70s and early 80s of the 20th century as a response to the then dominant structuralist and generativist paradigms. The most fundamental objection was mounted against the claims that linguistic meaning is to be separated from encyclopaedic knowledge and that lexical meaning is distinct from conceptual content. Meaning construction is not regarded as a specifically linguistic capacity but as being related to other general cognitive abilities such as memory, attention allocation and especially categorization. Cornerstones of cognitive-lexicological theorizing are the prototype theory of categorization, frame theory, and the theory of idealized cognitive models.

4.1. Prototype theory of categorization

According to the prototype theory of categorization (for summaries see, e.g., Kleiber 1998, Ungerer and Schmid 2006), word meanings reflect the structures of cognitive categories. These, in turn, show so-called prototype effects: firstly, membership in a cognitive category is not determined by a list of necessary and sufficient criteria (as suggested by the feature-semantic approach and the so-called classical or Aristotelian theory of categorization), but is a matter of degree, ranging from prototypical representatives to less good ones and fairly poor ones. Secondly, cognitive categories have fuzzy boundaries to neighbouring categories. In more semantic terms, meanings of lexemes may overlap in such a way that one referent can be referred to by two or even more lexical items. Thirdly, the gradient category structure ranging from very good to rather peripheral members can be accounted for in terms of attributes associated with category members. Prototypical members of categories are associated with a larger number of attributes which are characteristic of the whole category than are more peripheral representatives. And fourthly, the principle underlying the internal coherence of cognitive categories varies with the level of abstraction: typically, categories on the so-called basic level of categorization, where we find categories such as 'dog', 'car' and 'table' rather than 'animal', 'vehicle' and 'furniture', are united by the principle of prototype structure and shared gestalt, while superordinate categories cohere by virtue of the principle of family resemblances. This accounts for the fact that it is often impossible to find category-wide attributes which are shared by all members subsumed under superordinate categories (Rosch et al. 1976, Ungerer and Schmid 2006).

Experimental methods used in prototype semantics, which are known as goodness-of-example ratings and attribute-listing tasks (Schmid 2000), originally come from the work of Eleanor Rosch (cf. Rosch and Mervis 1975, Rosch et al. 1976). In goodness-of-example ratings, test subjects are confronted with members, or sub-categories, of cognitive categories and are asked to rate these with regard to their degree of typicality of the category. For example, 'sofa', 'chair' and 'table', but also 'curtain' and 'telephone', were used as stimuli to be rated with regard to the category 'furniture'. In attribute-listing tasks, informants are presented with the name of a cognitive category, for example bicycle, and asked to write down all the attributes that they find characteristic of the objects which can be referred to by this term. The weight of a given attribute in the structure of cognitive categories is assessed essentially by calculating the relative proportion of informants who named this attribute.
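The weight calculation just described can be illustrated with a minimal sketch; the function name, the attribute lists and the figures are invented for this example and are not drawn from the studies cited above. The weight of an attribute is simply the proportion of informants who listed it for the category.

```python
from collections import Counter

def attribute_weights(responses):
    """`responses` is a list of attribute sets, one per informant."""
    n = len(responses)
    counts = Counter(attr for resp in responses for attr in set(resp))
    return {attr: c / n for attr, c in counts.most_common()}

# Toy attribute-listing data for the category 'bicycle' (invented figures).
listings = [
    {"two wheels", "pedals", "handlebars", "saddle"},
    {"two wheels", "pedals", "chain"},
    {"two wheels", "handlebars", "bell"},
    {"two wheels", "pedals", "saddle"},
]
print(attribute_weights(listings))
# 'two wheels' is named by all informants (weight 1.0) and would count as
# highly characteristic of the category; 'bell' (0.25) as marginal.
```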
A third classic design developed by William Labov (1973) to test the fuzzy nature of category boundaries is the picture-naming task. Here, informants are given pictures or line-drawings of objects and asked to name these. By manipulating the proportions, shapes and other features of vessels such as cups, mugs and bowls, Labov managed to show that there is high inter-subjective agreement on the names for prototypical objects, but considerable inter-subjective variation on the names for objects which feature characteristics of neighbouring categories, such as 'cup' and 'bowl'.

The basic tenets of prototype theory have also been tested by means of quantitative corpus studies (cf. Schmid 2010a). The interpretation of corpus data in terms of prototypicality relies on the assumption that prototypicality correlates with frequency of usage, which in turn rests on the ideas that repetition increases the degree of entrenchment and that prototypicality effects are at least one indicator for entrenchment (Schmid 2007).

4.2. Frames and other knowledge structures

Cognitive approaches to lexicology model the lexicon as structured networks of cognitive categories representing both semantic information, in a narrow sense, and conceptual, i.e. encyclopaedic, information. Work in Artificial Intelligence and linguistics suggests that these lexico-conceptual networks consist of larger, intricately intertwined knowledge structures which are referred to as frames (Fillmore 1977), scripts (Schank and Abelson 1977) or idealized cognitive models (Lakoff 1987). As argued, for example, by Fillmore (1977), the meanings of lexemes such as buy and sell, bachelor, boycott and many, or probably indeed all, other lexemes can only be described with reference to the larger cognitive structures in which they are embedded and with reference to the tacit knowledge about background assumptions and cultural values assumed to be stored in them. Cognitive models are adduced to explain the radial structure of cognitive categories (Lakoff 1987), and they are also seen to