Activation of the Right Inferior Frontal Cortex During Assessment
of Facial Emotion
Katsuki Nakamura,1 Ryuta Kawashima,2 Kengo Ito,3 Motoaki Sugiura,2 Takashi Kato,3 Akinori Nakamura,3 Kentaro Hatano,3 Sumiharu Nagumo,1 Kisou Kubota,4 Hiroshi Fukuda,2 and Shozo Kojima1
1Department of Behavioral and Brain Sciences, Primate Research Institute, Kyoto University, Inuyama, 484-8506; 2Department of Nuclear Medicine and Radiology, IDAC, Tohoku University, Sendai, 980-8575; 3Department of Biofunctional Research, National Institute for Longevity Sciences, Obu, 474; and 4Department of Social and Information Sciences, Nihon Fukushi University, Handa, 475, Japan
ABSTRACT
Nakamura, Katsuki, Ryuta Kawashima, Kengo Ito, Motoaki Sugiura, Takashi Kato, Akinori Nakamura, Kentaro Hatano, Sumiharu Nagumo, Kisou Kubota, Hiroshi Fukuda, and Shozo Kojima. Activation of the Right Inferior Frontal Cortex During Assessment of Facial Emotion. J. Neurophysiol. 82: 1610-1614, 1999.
We measured regional cerebral blood flow (rCBF) using positron emission tomography (PET) to determine which brain regions are involved in the assessment of facial emotion. We asked right-handed normal subjects to assess the signalers' emotional state based on facial gestures, to assess facial attractiveness, and to discriminate the background color of the facial stimuli, and we compared the activity produced by each condition. The right inferior frontal cortex showed significant activation during the assessment of facial emotion in comparison with the other two tests. The activated area was located within a triangular area of the inferior frontal cortex in the right cerebral hemisphere. These results, together with those of previous imaging and clinical studies, suggest that the right inferior frontal cortex processes emotional communicative signals, whether visual or auditory, and that there is a hemispheric asymmetry in the inferior frontal cortex in relation to the processing of emotional communicative signals.
INTRODUCTION
With the recent advent of brain-imaging techniques, the left hemisphere has been confirmed to be dominant in language processing, in terms of both perception and production, in right-handed normal subjects (Demonet et al. 1993; Petersen et al. 1989, 1990; Posner and Carr 1992). Even sign language is processed preferentially by the left hemisphere (Bellugi et al. 1983; Klima et al. 1988; McGuire et al. 1997); nonlinguistic visuospatial information, however, is considered to be processed mainly by the right hemisphere. All of these data indicate that language, whether spoken, written, or signed, is primarily controlled by the left hemisphere.
On the other hand, human communication includes a strong nonverbal, emotional component that does not invoke the use of language per se. Facial and body gestures and prosodic cues of voice can convey communicative information. Right-handed patients with lesions of the right hemisphere show difficulties in processing emotional communicative information, such as spontaneous prosody, prosodic comprehension, and comprehension of emotional gesturing (the aprosodias) (Heilman et al. 1975; Ross 1981; Ross and Mesulam 1979), although there is evidence that the left hemisphere also functions in some affective behaviors (e.g., Kolb and Taylor 1981). Among emotional communicative signals, some facial gestures are believed to be universal (Ekman and Friesen 1975). Facial gestures are of interest with respect to the evolution of communication because they also are found among animals (Darwin 1872; Hauser 1996). Facial gestures reflect the signalers' emotional state and are used to negotiate social interactions. Clinical studies have suggested that the right hemisphere plays a dominant role in both the processing and the execution of facial expressions (Ahern et al. 1991; Blonder et al. 1991; Bowers et al. 1987, 1991; DeKosky et al. 1980; Etcoff 1984; Kolb and Taylor 1981; Ley and Bryden 1979). However, the neural substrates for the assessment of facial emotion remain unclear. In the present study, we used positron emission tomography (PET) to examine which brain regions are involved in the assessment of facial emotion and whether there is a hemispheric asymmetry.
METHODS
Seven right-handed normal male volunteers (aged 19-25 yr) participated in this study. Written informed consent was obtained from each subject. The stimuli were colored frontal images of a female face against a uniform background of red, yellow, or blue. Before the experiments, seven male psychologists classified >200 faces into categories such as calm, happy, sad, angry, surprised, disgusted, or others. Only faces that all seven psychologists invariably classified as happy, calm, sad, or angry were used as stimuli. The face stimuli were presented on a head-mounted display at 0.5-s intervals. In the facial emotion (FE) test, each subject assessed each facial emotion as positive (i.e., happy), neutral (i.e., calm), or negative (i.e., sad or angry) and pressed a left, center, or right button with the thumb, index, or middle finger, respectively, of his right hand, within 2 s of the stimulus onset. The stimulus disappeared automatically when the subject pressed a button. We used two control tests. In the color discrimination (CD) test, each subject discriminated the background color; in the facial attractiveness (FA) test, each subject assessed each face stimulus as attractive, neutral, or unattractive. We preferred the FA test as a control because it is a nonlinguistic, emotional test, and facial attractiveness itself does not convey communicative information. Furthermore, the assessment of facial attractiveness, like that of facial emotion, requires the subjects to pay attention to physiognomic material (Perrett et al. 1994). The faces used in both control tests were categorized as calm faces. For each subject, a separate scan was performed for each test condition, and the order of the three tests was varied among the subjects. All the face stimuli were unfamiliar to the subjects, and each face was presented only once to each subject to avoid the confounding effects of memory on neuronal activity. All subjects were instructed to look at the center of the image during the PET scans. Eye movements were measured, and few, if any, saccadic eye movements were observed during each scan; there were no significant differences in the number of eye movements among the three test conditions.
Each subject was placed in a PET scanner (Siemens/CTI ECAT EXACT HR) (Wienhard et al. 1994) in a dark room (0.7 lux) during the experiment. Before the PET measurements, a transmission scan was performed using three rotating 68Ge/68Ga sources. This scan directly measured the attenuation coefficients, and the data were used to obtain corrected emission images. The emission scan was started immediately after a bolus injection of ~15 mCi (555 MBq) of H2(15)O, using the three-dimensional (3D) collection mode. Each experimental test and PET measurement started immediately after the bolus injection and continued for 120 s.
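As a small check on the dose figures quoted above, the two numbers are linked by the exact curie-to-becquerel conversion (1 mCi = 37 MBq, since 1 Ci = 3.7 × 10^10 Bq by definition); the sketch below simply verifies the arithmetic and is not part of the study's procedure.

```python
# Dose unit check: the curie-to-becquerel conversion links the ~15 mCi
# and 555 MBq figures quoted for the H2(15)O bolus.
MBQ_PER_MCI = 37  # exact, by definition of the curie (1 Ci = 3.7e10 Bq)

def mci_to_mbq(mci: float) -> float:
    """Convert an activity in millicuries to megabecquerels."""
    return mci * MBQ_PER_MCI

print(mci_to_mbq(15))  # -> 555, matching the dose reported in the text
```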
Standard anatomic structures as incorporated in a human brain atlas system (Roland et al. 1994) were fitted interactively to high-resolution magnetic resonance images (MRIs) of each subject using both linear and nonlinear parameters. These parameters were subsequently applied to transform the PET images of rCBF into the standard brain anatomy. Statistical parametric mapping software (SPM96, Wellcome Department of Cognitive Neurology, London) was used for smoothing and statistical analysis (Friston et al. 1995). A 3D Gaussian filter of 20 mm was used. Differences in global flow were covaried out using analysis of covariance. Comparisons across test conditions were made by means of t-statistics and thereafter transformed into normally distributed Z statistics. For each comparison, voxels with Z values >3.1 (P < 0.001, without correction for multiple comparisons) were considered to denote regions of significantly increased rCBF. Finally, each activation was superimposed onto the averaged transformed MRI of the seven subjects. Anatomic localization of the activated areas was made in relation to the mean reformatted MRI.
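Two steps of this pipeline can be illustrated with a minimal sketch (not the authors' SPM96 code): converting the 20-mm full width at half-maximum (FWHM) of the Gaussian filter to a kernel standard deviation, and thresholding a Z map at Z > 3.1, which corresponds to a one-tailed P just under 0.001. The voxel size below is an assumption for illustration; it is not stated in the text.

```python
# Sketch of Gaussian smoothing specified by FWHM, and of the Z > 3.1
# threshold used in the analysis. Voxel size is assumed, not from the paper.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import norm

FWHM_MM = 20.0
VOXEL_MM = 2.0  # assumed isotropic voxel size (illustrative only)

# FWHM -> standard deviation of the Gaussian kernel: FWHM = 2*sqrt(2*ln 2)*sigma
sigma_mm = FWHM_MM / (2.0 * np.sqrt(2.0 * np.log(2.0)))
sigma_vox = sigma_mm / VOXEL_MM

rng = np.random.default_rng(0)
z_map = rng.standard_normal((32, 32, 32))        # stand-in for a real Z map
smoothed = gaussian_filter(z_map, sigma=sigma_vox)

# Voxels exceeding the threshold reported in the paper
Z_THRESH = 3.1
suprathreshold = smoothed > Z_THRESH

# One-tailed P corresponding to Z = 3.1 is just under 0.001
print(round(float(norm.sf(Z_THRESH)), 5))
```

The FWHM-to-sigma conversion gives sigma ≈ 8.49 mm for a 20-mm filter; the uncorrected P < 0.001 criterion follows directly from the upper tail of the standard normal distribution.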
RESULTS
Behavioral data are summarized in Table 1. The subjects were administered a greater number of trials (visual stimulations) in the CD test than in the FE (Mann-Whitney U test, P < 0.01) and FA (P < 0.05) tests. The reaction time was shorter in the CD test than in the FE and FA tests (P < 0.01). There were no significant differences between the FE and FA tests. In the FE and CD tests, 81.4% (range 64-91%) and 96.4% (87-100%) of the responses were correct, respectively; the difference between the two was significant (P < 0.01). Because the assessment of facial attractiveness depends on each individual's personality, no response could be classified as "correct" in the FA test. All of these data indicate that the CD test was easier than the FE and FA tests, but that the numbers of visual stimulations and motor responses were well matched between the FE and FA tests.
The FE test elicited activation of two foci in the right inferior frontal cortex, the lateral occipital cortices of both sides, and the left orbitofrontal cortex in comparison with the CD test (Table 2, top). The activations in the right inferior frontal cortex (46, 32, 4) and the right lateral occipital cortex (34, -88, -6) were significant after correction for multiple comparisons (**). The activation in the right inferior frontal cortex (46, 32, 0) was still significant when the activity during the FE test was compared with the activity during the FA test (Table 2, bottom). The activated area (*) was located within a triangular area of the inferior frontal cortex. Because only the activation in the right triangular area (Fig. 1) was significant (P < 0.001) in both separate comparisons, we assume that the right triangular area is dominant in the assessment of facial emotion. In the present study, the left inferior frontal cortex did not exhibit any significant activation during the assessment of facial emotion.

Fig. 1. Positron emission tomography (PET) images of significant activation in the right inferior frontal cortex (triangular area) revealed by subtraction of the activity in the color discrimination test from that in the facial emotion test. The activated area is superimposed on the mean MRI image produced from all 7 subjects. A: sagittal; B: coronal; C: transverse images. D: mean adjusted regional cerebral blood flow plotted for the 3 test conditions (±SE). CD, color discrimination test; FA, facial attractiveness test; FE, facial emotion test.
DISCUSSION
The present study provided two major findings. The first is that the right hemisphere was more active than the left during the processing of facial emotion. This finding is consistent with previous clinical observations (Ahern et al. 1991; Blonder et al. 1991; Bowers et al. 1987, 1991; DeKosky et al. 1980; Etcoff 1984; Heilman et al. 1975; Kolb and Taylor 1981; Ley and Bryden 1979; Ross 1981). The second is that the right inferior frontal cortex may be involved in the processing of emotional communicative signals based on facial gestures.
Recent studies in animals and humans have proposed that the same mechanisms in the inferior frontal cortex might be activated during both the observation and the execution of actions (Gallese et al. 1996; Rizzolatti et al. 1996a,b), and such an "action observation/execution matching system" may be important for communication (Gallese and Goldman 1998; Rizzolatti and Arbib 1998). George et al. (1993) also reported inferior frontal cortex activation during matching of facial emotions. Some imaging studies have reported activations in the right inferior frontal cortex during the processing of prosodic cues of voice (Imaizumi et al. 1997; Kawashima et al. 1993; Zatorre et al. 1992). Hornak et al. (1996) reported that lesions of the ventral frontal cortex, which included both the orbitofrontal and inferior frontal cortices, affect the comprehension as well as the production of facial and vocal emotional expressions; in 10 of 12 patients, the right ventral frontal cortex was among the regions damaged. These data, together with the present observations, suggest that the right inferior frontal cortex processes emotional communicative information with either visual or auditory input. Interestingly, the right inferior frontal cortex is the mirror image of Broca's area in the left hemisphere. It has been proposed that the functional-anatomic organization of emotional communication in the right hemisphere mirrors that of propositional language in the left hemisphere (Ross 1981). There may thus be a hemispheric asymmetry in the inferior frontal cortex in relation to the processing of communicative information. Asymmetric activations in the frontal cortex for verbal and nonverbal information processing have also been reported for memory encoding and retrieval (Kelley et al. 1998; Wagner et al. 1998). There may be multiple functional subregions for verbal and nonverbal information processing in the frontal cortex, since the activated foci in these studies appear to be located posterior and dorsal to the area reported in the present study.
However, it is not our aim to claim that the right inferior frontal cortex processes only emotional communicative information, nor that this area is the only area involved in processing such information. There is evidence that cortical areas outside the frontal region are involved in the recognition of facial emotions (Adolphs et al. 1996). The right inferior frontal cortex is also activated in other language-related tasks, such as metaphor comprehension (Bottini et al. 1994) and phonological working memory (Paulesu et al. 1993), as well as in a face-matching task (Haxby et al. 1994). Given the activations in the right inferior frontal cortex during memory recall (Courtney et al. 1997; Fletcher et al. 1998; Haxby et al. 1996; Moscovitch et al. 1995; Wagner et al. 1998), the present results might indicate that the assessment of facial emotion involves matching the current facial gesture to templates or prototypes of facial emotions in the brain.
Human studies have suggested that the limbic system, in particular the amygdala, plays an important role in the recognition of facial emotions (Adolphs et al. 1994; Breiter et al. 1996; Morris et al. 1996, 1998; Phillips et al. 1997; Young et al. 1996). We did not find any specific increase in rCBF in the amygdala. Many previous studies have suggested the involvement of the limbic system in the processing of negative emotions, such as fear and disgust (Adolphs et al. 1994; Breiter et al. 1996; Morris et al. 1996, 1998; Phillips et al. 1997). The limbic structures may respond more to stimuli that directly induce strong emotional responses. In human social life, even slight changes in facial features can convey significant emotional communicative information. It is reasonable to assume that the human brain has developed areas for processing such communicative information.
ACKNOWLEDGMENTS
We thank Dr. K. Tanaka for helpful comments on the earlier version
of this manuscript.
This research was supported by Grants-in-Aid for Scientific Research on
Priority Areas from the Ministry of Education, Science, Sports, and
Culture of Japan (05206109, 08279203, and 09268215), JSPS-RFTF
(97L00202), and the Fund for Comprehensive Research on Aging and Health
from the Ministry of Welfare of Japan (96A1102).
FOOTNOTES
Address reprint requests to: K. Nakamura
The costs of publication of this article were defrayed in part by the
payment of page charges. The article must therefore be hereby marked
"advertisement" in accordance with 18 U.S.C. Section
1734 solely to indicate this fact.
Received 12 February 1999; accepted in final form 21 May 1999.
REFERENCES
- Adolphs, R., Damasio, H., Tranel, D., and Damasio, A. Cortical systems for the recognition of emotion in facial expressions. J. Neurosci. 16: 7678-7687, 1996.
- Adolphs, R., Tranel, D., Damasio, H., and Damasio, A. R. Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature 372: 669-672, 1994.
- Ahern, G. L., Schomer, D. L., Kleefield, J., Blume, H., Cosgrove, G. R., Weintraub, S., and Mesulam, M.-M. Right hemisphere advantage for evaluating emotional facial expressions. Cortex 27: 193-202, 1991.
- Bellugi, U., Poizner, H., and Klima, E. S. Brain organization for language: clues from sign aphasia. Hum. Neurobiol. 2: 155-170, 1983.
- Blonder, L. X., Bowers, D., and Heilman, K. M. The role of the right hemisphere in emotional communication. Brain 114: 1115-1127, 1991.
- Bottini, G., Corcoran, R., Sterzi, R., Paulesu, E., Schenone, P., Scarpa, P., Frackowiak, R.S.J., and Frith, C. D. The role of the right hemisphere in the interpretation of figurative aspects of language. A positron emission tomography activation study. Brain 117: 1241-1253, 1994.
- Bowers, D., Blonder, L. X., Feinberg, T., and Heilman, K. M. Different impact of right and left hemisphere lesions on facial emotion and object imagery. Brain 114: 2593-2609, 1991.
- Bowers, D., Coslett, H. B., Bauer, R. M., Speedie, L. J., and Heilman, K. M. Comprehension of emotional prosody following unilateral hemispheric lesions: processing defect versus distraction defect. Neuropsychologia 25: 317-328, 1987.
- Breiter, H. C., Etcoff, N. L., Whalen, P. J., Kennedy, W. A., Rauch, S. L., Buckner, R. L., Strauss, M. M., Hyman, S. E., and Rosen, B. R. Response and habituation of human amygdala during visual processing of facial expression. Neuron 17: 875-887, 1996.
- Courtney, S. M., Ungerleider, L. G., Keil, K., and Haxby, J. V. Transient and sustained activity in a distributed neural system for human working memory. Nature 386: 608-611, 1997.
- Darwin, C. The Expression of the Emotions in Man and Animals. London: John Murray, 1872.
- DeKosky, S. T., Heilman, K. M., Bowers, D., and Valenstein, E. Recognition and discrimination of emotional faces and pictures. Brain Lang. 9: 206-214, 1980.
- Demonet, J. F., Wise, R., and Frackowiak, R.S.J. Language functions explored in normal subjects by positron emission tomography: a critical review. Hum. Brain Map. 1: 39-47, 1993.
- Ekman, P., and Friesen, W. V. Unmasking the Face. Englewood Cliffs, NJ: Prentice-Hall, 1975.
- Etcoff, N. L. Perceptual and conceptual organization of facial emotions: hemispheric differences. Brain Cogn. 3: 385-412, 1984.
- Fletcher, P. C., Shallice, T., Frith, C. D., Frackowiak, R.S.J., and Dolan, R. J. The functional roles of prefrontal cortex in episodic memory. II. Retrieval. Brain 121: 1249-1256, 1998.
- Friston, K. J., Holmes, A. P., Worsley, K. J., Poline, J.-P., Frith, C. D., and Frackowiak, R.S.J. Statistical parametric maps in functional imaging: a general linear approach. Hum. Brain Map. 2: 189-210, 1995.
- Gallese, V., Fadiga, L., Fogassi, L., and Rizzolatti, G. Action recognition in the premotor cortex. Brain 119: 593-609, 1996.
- Gallese, V., and Goldman, A. Mirror neurons and the simulation theory of mind-reading. Trends Cogn. Sci. 2: 493-501, 1998.
- George, M. S., Ketter, T. A., Gill, D. S., Haxby, J. V., Ungerleider, L. G., Herscovitch, P., and Post, R. M. Brain regions involved in recognizing facial emotion or identity: an oxygen-15 PET study. J. Neuropsychiatry 5: 384-394, 1993.
- Hauser, M. D. The Evolution of Communication. Cambridge: The MIT Press, 1996.
- Haxby, J. V., Horwitz, B., Ungerleider, L. G., Maisog, J. M., Pietrini, P., and Grady, C. L. The functional organization of human extrastriate cortex: a PET-rCBF study of selective attention to faces and locations. J. Neurosci. 14: 6336-6353, 1994.
- Haxby, J. V., Ungerleider, L. G., Horwitz, B., Maisog, J. M., Rapoport, S. I., and Grady, C. L. Face encoding and recognition in the human brain. Proc. Natl. Acad. Sci. USA 93: 922-927, 1996.
- Heilman, K., Scholes, R., and Watson, R. Auditory affective agnosia: disturbed comprehension of affective speech. J. Neurol. Neurosurg. Psychiatry 38: 69-72, 1975.
- Hornak, J., Rolls, E. T., and Wade, D. Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia 34: 247-261, 1996.
- Imaizumi, S., Mori, K., Kiritani, S., Kawashima, R., Sugiura, M., Fukuda, H., Itoh, K., Kato, T., Nakamura, A., Hatano, K., Kojima, S., and Nakamura, K. Vocal identification of speaker and emotion activates different brain regions. Neuroreport 8: 2809-2812, 1997.
- Kawashima, R., Itoh, M., Miyazawa, H., Yamada, K., Matsuzawa, T., and Fukuda, H. Changes of regional cerebral blood flow during listening to an unfamiliar spoken language. Neurosci. Lett. 161: 69-72, 1993.
- Kelley, W. M., Miezin, F. M., McDermott, K. B., Buckner, R. L., Raichle, M. E., Cohen, N. J., Ollinger, J. M., Akbudak, E., Conturo, T. E., Snyder, A. Z., and Petersen, S. E. Hemispheric specialization in human dorsal frontal cortex and medial temporal lobe for verbal and nonverbal memory encoding. Neuron 20: 927-936, 1998.
- Klima, E. S., Bellugi, U., and Poizner, H. Grammar and space in sign aphasiology. Aphasiology 2: 319-327, 1988.
- Kolb, B., and Taylor, L. Affective behavior in patients with localized cortical excisions: role of lesion site and side. Science 214: 89-91, 1981.
- Ley, R. G., and Bryden, M. P. Hemispheric differences in processing emotions and faces. Brain Lang. 7: 127-138, 1979.
- McGuire, P. K., Robertson, D., Thacker, A., David, A. S., Kitson, N., Frackowiak, R.S.J., and Frith, C. D. Neural correlates of thinking in sign language. Neuroreport 8: 695-698, 1997.
- Morris, J. S., Friston, K. J., Buchel, C., Frith, C. D., Young, A. W., Calder, A. J., and Dolan, R. J. A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121: 47-57, 1998.
- Morris, J. S., Frith, C. D., Perrett, D. I., Rowland, D., Young, A. W., Calder, A. J., and Dolan, R. J. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383: 812-815, 1996.
- Moscovitch, M., Kapur, S., Kohler, S., and Houle, S. Distinct neural correlates of visual long-term memory for spatial location and object identity: a positron emission tomography (PET) study in humans. Proc. Natl. Acad. Sci. USA 92: 3721-3725, 1995.
- Paulesu, E., Frith, C. D., and Frackowiak, R.S.J. The neural correlates of the verbal component of working memory. Nature 362: 342-345, 1993.
- Perrett, D. I., May, K. A., and Yoshikawa, S. Facial shape and judgements of female attractiveness. Nature 368: 239-242, 1994.
- Petersen, S., Fox, P., Posner, M., Mintun, M., and Raichle, M. Positron emission tomographic studies of the cortical anatomy of single-word processing. Nature 331: 585-589, 1989.
- Petersen, S. E., Fox, P. T., Snyder, A., and Raichle, M. E. Activation of prestriate and frontal cortical activity by words and word-like stimuli. Science 249: 1041-1044, 1990.
- Phillips, M. L., Young, A. W., Senior, C., Brammer, M., Andrew, C., Calder, A. J., Bullmore, E. T., Perrett, D. I., Rowland, D., Williams, S. C., Gray, J. A., and David, A. S. A specific neural substrate for perceiving facial expressions of disgust. Nature 389: 495-498, 1997.
- Posner, M. E., and Carr, T. H. Lexical access and the brain: anatomical constraints on cognitive modes of word recognition. Am. J. Psychol. 105: 1-26, 1992.
- Rizzolatti, G., and Arbib, M. A. Language within our grasp. Trends Neurosci. 21: 188-194, 1998.
- Rizzolatti, G., Fadiga, L., Gallese, V., and Fogassi, L. Premotor cortex and the recognition of motor actions. Cogn. Brain Res. 3: 131-141, 1996a.
- Rizzolatti, G., Fadiga, L., Matelli, M., Bettinardi, V., Paulesu, E., Perani, D., and Fazio, F. Localization of grasp representations in humans by PET. I. Observation versus execution. Exp. Brain Res. 111: 246-252, 1996b.
- Roland, P. E., Graufelds, C. J., Wahlin, J., Ingelman, L., Andersson, M., Ledberg, A., Pedersen, J., Akerman, S., Dabringhaus, A., and Zilles, K. Human brain atlas: for high resolution functional and anatomical mapping. Hum. Brain Map. 1: 173-184, 1994.
- Ross, E. D. The aprosodias: functional-anatomic organization of the affective components of language in the right hemisphere. Arch. Neurol. 38: 561-589, 1981.
- Ross, E. D., and Mesulam, M.-M. Dominant language functions of the right hemisphere: prosody and emotional gesturing. Arch. Neurol. 36: 144-148, 1979.
- Talairach, J., and Tournoux, P. Co-Planar Stereotaxic Atlas of the Human Brain. Stuttgart: Georg Thieme Verlag, 1988.
- Wagner, A. D., Poldrack, R. A., Eldridge, L. L., Desmond, J. E., Glover, G. H., and Gabrieli, J.D.E. Material-specific lateralization of prefrontal activation during episodic encoding and retrieval. Neuroreport 9: 3711-3717, 1998.
- Wienhard, K., Dahlbom, M., Eriksson, L., Bruckbauer, T., Pietrzyk, U., and Heiss, W.-D. The ECAT EXACT HR: performance of a new high resolution positron scanner. J. Comput. Assist. Tomogr. 18: 110-118, 1994.
- Young, A. W., Hellawell, D. J., Van de Wal, C., and Johnson, M. Facial expression processing after amygdalotomy. Neuropsychologia 34: 31-39, 1996.
- Zatorre, R. J., Evans, A. C., Meyer, E., and Gjedde, A. Lateralization of phonetic and pitch discrimination in speech processing. Science 256: 846-849, 1992.