Is the mind a cauliflower or an onion? British insights into cognitive organization from the study of abnormal function

Rosaleen A. McCarthy*
Department of Experimental Psychology, Cambridge, UK

British Journal of Psychology (2001), 92, 171–192. Printed in Great Britain. © 2001 The British Psychological Society.

*Requests for reprints should be made to: Rosaleen A. McCarthy, Department of Experimental Psychology, Downing Street, Cambridge CB2 3EB, UK. E-mail:

Clinical and normal psychology have had a long tradition of close interaction in British psychology. The roots of this interplay may predate the development of the British Psychological Society, but the Society has encouraged and supported this line of research since its inception. One fundamental British insight has been to consider the evidence from pathology as a potential constraint on theories of normal function. In turn, theories of normal function have been used to understand and illuminate cognitive pathology. This review discusses some of the areas in which clinical contributions to cognitive theory have been most substantial. As with other contributions to this volume, attempts are also made to read the runes and anticipate future developments.

The past 20 years have seen genuine and rapid developments in the fields of cognitive psychology, computational psychology, neuroscience and neuroimaging. We are now at the threshold of a potentially exciting phase in which the insights of these various disciplines can be brought to bear on the understanding of abnormal cognition. In turn, there has been an increasing fuzziness in the demarcation between the theories of ‘normal’ and ‘abnormal’ function. Most clinical practitioners expect theories of normal cognition to have a very direct bearing on the problems experienced by their client groups. At a somewhat slower rate, the broader population of non-clinical psychologists has become more generally aware of the relevance of cognitive abnormality as a means of ‘dissecting’ and analysing the operation of the normal system.

Clinical researchers/practitioners were among the founding parents of the BPS and made some of the most enduring contributions to the emerging discipline of psychology. Since the earliest days of the BPS, clinical and academic research traditions have complemented each other in British psychology (an intermingling that was less apparent in the developing American Psychological Association and has taken much longer to achieve academic recognition). Other contributions to this special issue attest to the continuing fecundity of the clinical/theoretical interplay from the perspective of theoretical psychology. My brief is to survey the field from a clinical standpoint. Beatrice Edgell’s overview attests to the central role in the development of the BPS played by such luminaries as Henry Head and W. H. R. Rivers. For the modern reader, many of the talks from the early discussion meetings have a familiar ring to their titles: for example, ‘The function of the frontal lobes’; ‘Bilateral cortical lesions causing deafness and aphasia’; and ‘Disorders in symbolic thinking due to local lesions in the brain’.

In this short article I draw together some of the strands of evidence from the clinical literature that bear most directly on normal function. As with other papers in this centenary issue, the present overview is centred on British contributions. I attempt to illustrate the British perspective with reference to a small number of key research areas. The topics that I have chosen to discuss—namely literacy, attention, perception and memory—necessarily reflect some of my own strongest research interests as well as being topics that have strong historical and current relevance. I acknowledge at the outset that my coverage is necessarily superficial, and admit with regret that it has not been possible to give consideration to a wider range of issues within the scope of this review. Hopefully, the other contributions to this special issue will fill at least some of the inevitable gaps.

Psychological aspects of clinical neurology were always strongly represented within the Society. The first honorary member was John Hughlings-Jackson, consultant neurologist at Queen Square. Hughlings-Jackson reacted against the fashionable ‘modular’ emphasis of the German school of neurology. He called into question the simple view that the brain had a cauliflower-like organization of discrete processing modules that were susceptible to local and independent damage. Instead, he proposed that cognition was the product of an interactive hierarchical system. Rather than individual abilities having discrete neural substrata, functional localization could vary according to the level of control that was involved: it would vary according to whether reflexive, automatic or voluntary processing was implicated. Thus, patients with massive damage to the left (language) hemisphere often preserved the automatic or reflexive aspects of speech (being able to swear fluently when frustrated or produce lyrics when singing). The same people were unable to produce comparable responses under voluntary control (such as the patient asked to repeat the word ‘No’ who responded ‘No Doctor, I cannot say ‘‘No’’’). In Hughlings-Jackson’s view, the mind/brain had a complex layered structure, somewhat closer to that of an onion than to a cauliflower. This tension between the cauliflower and onion theories persists into 21st century neuropsychology.

Taking cognition apart

The study of abnormal cognition was and is Janus-faced. On the practical side, cognitive abnormalities present us with real problems that require urgent practical solutions. On the theoretical side, these disorders provide us with a very special set of perspectives into the organization and contents of cognition. Over the past 50 years, the interplay between theories of normal and abnormal function has had some of its widest impact within the neurology/neuropsychology ‘border zone’. However, the study of structural brain lesions is by no means the only route towards the fractionation and understanding of cognition. Other types of pathology also offer relevant and distinct sources of evidence. In recent years, the focus has broadened beyond neurological disease to also include neuroses and psychoses. With the advent of neuroimaging, and the evidence for consistent patterns of cerebral abnormality in psychiatric disorders, some cognitive researchers may now feel more comfortable in drawing evidence from psychiatry as well as neurology. At the very least the traditional distinction between ‘organic’ and ‘psychiatric’ has become constructively blurred. For example, studies of clinical anxiety states, phobias and depression have now been brought to bear in understanding the interplay between emotion and cognition (Mogg & Bradley, 1998; Oatley & Johnson-Laird, 1987; Teasdale, 1999).

Case studies

One very distinctive British contribution to the clinical/theoretical interface has been the emphasis given to single case studies rather than aggregate data based on the mean or median scores of large groups. The use of single-case methodology presents some very real advantages and in recent years has even led to the establishment of a journal specifically devoted to single case-based analysis (Neurocase).
Case-based research is no global panacea: it has possibly been at its most successful in analysing the fractionation of cognitive skills rather than in localizing neural substrates. In the context of cognitive research, case-study methods are based on the strong assumption that variation between individuals reflects relatively fine-grain differences in the contents of cognition, rather than individual differences in cognitive architecture. Since the same assumption also underpins most cognitive psychology, case-based and experimental approaches are typically complementary.

Dissociation/association

In the limiting case, a single instance of clinical dissociation can challenge a theoretical model of normal cognitive architecture. The recent history of cognitive neuropsychology is replete with examples of such challenges. At their simplest, single dissociations of function suggest that different types or levels of resource are required in order to perform two tasks. Double dissociations arise when a second case shows a reciprocal profile, indicating that the two sets of task-demand are not on a monotonic continuum (e.g. Shallice, 1988). However, various patterns of dissociation are by no means a guarantee of interesting or fundamental cognitive fractionations: a deaf person and a blind one would show a double dissociation between the interpretation of pictures and the understanding of spoken words. The resources required for the two tasks clearly differ, but not in a manner that necessarily bears directly on cognitive accounts of speech processing or object recognition. Any claim of dissociation therefore needs to be backed up by the exclusion of alternative explanations and a valid set of tasks for instantiating the cognitive process of interest. When these objectives are realized, the case with the most limited or ‘pure’ and selective pattern of deficit may stand as the gold standard against which other patients are evaluated (e.g. Ellis & Young, 1988; McCarthy & Warrington, 1990b; Shallice, 1988).

However, a continuing problem for case-based analysis lies in the relative weight given to associated as compared to dissociated deficits: how pure is pure? When is a pattern of associated deficit causal, and when is it merely contingent? The significance that is given to co-occurring deficits is at least partly dependent on theoretical disposition. In a theoretical world where different cognitive functions are mediated by independent subsystems, associated profiles of deficit are either quasi-accidental nuisances (possibly arising from damage to adjacent neural areas) or the consequence of some serial dependency between subsystems (the cauliflower theoretical stance). The advent of interactive computational models of cognitive function has led to a change in emphasis: dependency between subsystems is often viewed as far more common and necessary for normal function. Associations between deficits are viewed as a prediction rather than a nuisance (e.g. Patterson, Graham, & Hodges, 1994). Their absence may be explained in terms of inadequate testing by ‘other’ laboratories, the varying task demands imposed by unmatched materials, or perhaps the personal history of the patient/participant (e.g. Funnell & Sheridan, 1992; Ralph et al., 1998; Stewart, Parkin, & Hunkin, 1992). The debate between the fractionators and associators remains a live issue in many domains, hinging ultimately on the precise dynamics of the cognitive system and the validity of experimental tasks as a means of instantiating the process in question.

Reading and dyslexia

The investigation of acquired disorders of reading was at the heart of the ‘cognitive explosion’ in neuropsychology that took place in the latter decades of the 20th century. In 1966, Marshall and Newcombe produced a seminal paper describing ‘patterns of paralexia’ in cases of missile injury.
The paper represented a break with the century-old neurological tradition of analysing disorders of literacy in terms of the relationship between impairments of reading and writing. Rather than classifying their patients as cases of ‘dyslexia with dysgraphia’ or ‘dyslexia without dysgraphia’, Marshall and Newcombe drew attention to the different types of errors that their dyslexic patients made. They distinguished between three syndromes. The least controversial was visual dyslexia, in which patients seemed to be impaired in forming an adequate percept of the letter string. The other two syndromes, surface dyslexia and deep dyslexia, were identified with very specific psycholinguistic processes. In surface dyslexia, participants made errors that were similar in sound to the target word. They were unable to address the meaning of words directly from print and instead relied on inefficient spelling-to-sound translation as an attempt to solve the problem of reading. Deep dyslexia was something of a mirror image: these patients made reading errors that had a similar meaning to the target, and were unable to use spelling–sound translation. In a subsequent paper Marshall and Newcombe (1974) went on to discuss the ‘patterns of paralexia’ made by their patients in terms of a model of normal cognitive function. The initial response to Marshall and Newcombe’s contribution was muted, but their paper was a slow-starter and was destined to become one of the most influential in cognitive analyses of neurological deficit.

Theories of normal reading were in a state of flux at the end of the 1960s. An important area of debate centred on the respective roles of direct (print → meaning) and indirect (print → sound → meaning) routines in the processing of text. The American tradition had been typically competitive, with one view gaining temporary ascendancy over the other only to be displaced by the next set of data. However, at least one English theory had attempted to incorporate a parallel architecture, with separate print–speech and print–meaning subsystems: John Morton’s ‘Logogen’ model (Morton, 1969). Morton’s model lent itself quite naturally to the analysis of acquired dyslexia, a perspective that was made explicit by Marshall and Newcombe (1974) and developed by Shallice and Warrington (1975) in their paper reporting a replication of Marshall and Newcombe’s syndrome of semantically based reading or ‘phonemic dyslexia’. By 1980, when Coltheart, Patterson and Marshall published their seminal edited collection of papers on deep dyslexia, Morton’s model provided the modal framework for understanding reading and its disorders.

Subtypes of dyslexia

Breakdown at peripheral levels of the reading system may result in neglect dyslexia, where the patient misreads one side of a word (e.g. peach → teach or telescope → periscope). Damage to the Logogen system or word-form itself may result in spelling dyslexia, where the patient proceeds by identifying individual letters of a word, rather than by recognizing the word as a whole unit (reading cat as ‘C,A,T, cat’ or have as ‘H,A,V,E, have’). At central levels, the syndrome that Marshall and Newcombe described as ‘surface dyslexia’ has been interpreted as an inability to read for meaning and an over-reliance on reading by sound. Individual cases may be able to pronounce invented non-words (e.g. blint, grame, and fud) and read English words with frequent pronunciation rules (so-called ‘regular’ words such as teach, wean, teen, queen). However, words with unusual spelling–sound correspondences may be read as if they were regular (yacht, ache, or love might be read as ‘yakked’, ‘aychie’ and ‘lowve’).
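The parallel architecture at issue here, a direct whole-word route alongside an indirect spelling-to-sound route, can be made concrete with a toy sketch. This is purely my own illustration, not Morton's model or anything from the article: the mini-lexicon, the invented 'pronunciations', and the letter-to-sound rules are all assumptions chosen for demonstration. Disabling the direct route forces rule-based sounding-out, which 'regularizes' words with unusual spelling-sound correspondences in the manner of surface dyslexia.

```python
# Toy dual-route reading sketch (illustrative assumptions only; the
# lexicon, pronunciations and rules below are invented for this demo).

# Direct route: whole-word recognition, print -> stored pronunciation.
LEXICON = {
    "yacht": "yot",    # irregular spelling
    "have": "hav",     # irregular (silent final e)
    "teach": "teech",  # regular
}

# Indirect route: crude, context-free spelling -> sound correspondences.
# Multi-letter graphemes are listed (and so tried) before single letters.
GP_RULES = [
    ("ch", "ch"), ("ea", "ee"),
    ("a", "a"), ("c", "k"), ("e", "e"), ("h", "h"),
    ("t", "t"), ("v", "v"), ("y", "y"),
]

def sound_out(word: str) -> str:
    """Indirect route: translate spelling to sound, rule by rule."""
    out, i = [], 0
    while i < len(word):
        for grapheme, phoneme in GP_RULES:
            if word.startswith(grapheme, i):
                out.append(phoneme)
                i += len(grapheme)
                break
        else:  # no rule matched: pass the letter through unchanged
            out.append(word[i])
            i += 1
    return "".join(out)

def read_aloud(word: str, direct_route_intact: bool = True) -> str:
    """Prefer the direct lexical route; fall back on sounding out.
    With the direct route 'lesioned', irregular words come out
    regularized, as in surface dyslexia."""
    if direct_route_intact and word in LEXICON:
        return LEXICON[word]
    return sound_out(word)
```

With the direct route intact, read_aloud("yacht") returns the stored "yot"; with direct_route_intact=False the model falls back on the rules and produces a regularization error, while a regular word such as "teach" reads the same through either route.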
Marshall and Newcombe’s original cases of surface dyslexia were apparently reliant on the use of letter-by-letter sound translation rules and had lost the ability to use context-sensitive multi-letter correspondences. However, subsequent reports of patients have shown a graded severity of the disorder, ranging from preservation of many ‘irregular’ words through various intermediate levels to failure on anything but the most common spelling → sound rules. Such findings appear consistent with interactive activation models of spelling-to-sound translation and with the graceful degradation of the system (perhaps according to the frequency of print–sound correspondences experienced in everyday life). However, the cognitive insufficiency leading to imperfect reading by sound may be grounded in visual analysis, in the procedures of translation, or indeed in production for output, depending on the individual case.

Marshall and Newcombe described a complex set of deficits in deep dyslexia, and speculated that reading via the semantic system might intrinsically be error prone. However, the syndrome has since been fractionated and effective reading aloud via the semantic system appears possible, at least in the adult skilled reader. In the most selective cases, the only significant deficit may be an inability to read unfamiliar words or letter strings, a syndrome termed phonological dyslexia. If patients have additional problems of a linguistic or semantic nature, a variety of additional reading errors may be shown, resulting in deep dyslexia.

The reading errors seen in deep dyslexia have a characteristic form and include semantic approximations (e.g. strife → danger, act → play) and grammatical transformations (wise → wisdom; warmth → warm). Words with limited meaning outside a specific linguistic context are usually more difficult than others. Function words (and, if, for, by) are particularly vulnerable and are often interchanged. Within the class of nouns, abstract words (e.g. notion, industry, supply) are more likely to elicit errors than are nouns with sensory referents (e.g. person, radiator, barracks). The search for a single cognitive explanation for this recurrent cluster of deficits has continued without reaching a firm conclusion. Computational models of deficit (e.g. Patterson, 1990), explanations in terms of the relative ease or difficulty of assigning meaning to different semantic categories