Activation of the Right Inferior Frontal Cortex During Assessment of Facial Emotion

Katsuki Nakamura,1 Ryuta Kawashima,2 Kengo Ito,3 Motoaki Sugiura,2 Takashi Kato,3 Akinori Nakamura,3 Kentaro Hatano,3 Sumiharu Nagumo,1 Kisou Kubota,4 Hiroshi Fukuda,2 and Shozo Kojima1

 1Department of Behavioral and Brain Sciences, Primate Research Institute, Kyoto University, Inuyama, 484-8506;  2Department of Nuclear Medicine and Radiology, IDAC, Tohoku University, Sendai, 980-8575;  3Department of Biofunctional Research, National Institute for Longevity Sciences, Obu, 474; and  4Department of Social and Information Sciences, Nihon Fukushi University, Handa, 475, Japan


    ABSTRACT

Nakamura, Katsuki, Ryuta Kawashima, Kengo Ito, Motoaki Sugiura, Takashi Kato, Akinori Nakamura, Kentaro Hatano, Sumiharu Nagumo, Kisou Kubota, Hiroshi Fukuda, and Shozo Kojima. Activation of the Right Inferior Frontal Cortex During Assessment of Facial Emotion. J. Neurophysiol. 82: 1610-1614, 1999. We measured regional cerebral blood flow (rCBF) using positron emission tomography (PET) to determine which brain regions are involved in the assessment of facial emotion. We asked right-handed normal subjects to assess the signalers' emotional state based on facial gestures and to assess the facial attractiveness, as well as to discriminate the background color of the facial stimuli, and compared the activity produced by each condition. The right inferior frontal cortex showed significant activation during the assessment of facial emotion in comparison with the other two tests. The activated area was located within a triangular area of the inferior frontal cortex in the right cerebral hemisphere. These results, together with those of previous imaging and clinical studies, suggest that the right inferior frontal cortex processes emotional communicative signals that could be visual or auditory and that there is a hemispheric asymmetry in the inferior frontal cortex in relation to the processing of emotional communicative signals.


    INTRODUCTION

With the recent advent of brain-imaging techniques, the left hemisphere has been confirmed to be dominant in relation to language processing, in terms of both perception and production, in right-handed normal subjects (Demonet et al. 1993; Petersen et al. 1989, 1990; Posner and Carr 1992). Even sign language is processed preferentially by the left hemisphere (Bellugi et al. 1983; Klima et al. 1988; McGuire et al. 1997); nonlinguistic visuospatial information is, however, considered to be processed mainly by the right hemisphere. All of these data indicate that language, whether spoken, written, or sign language, is primarily controlled by the left hemisphere.

On the other hand, human communication includes a strong nonverbal, emotional component that does not invoke the use of language per se. Facial and body gestures and prosodic cues of voice can convey communicative information. Right-handed patients with lesions of the right hemisphere show difficulties in processing emotional communicative information, such as spontaneous prosody, prosodic comprehension, and comprehension of emotional gesturing (aprosodias) (Heilman et al. 1975; Ross 1981; Ross and Mesulam 1979), while there is evidence that the left hemisphere also functions in some affective behaviors (e.g., Kolb and Taylor 1981). Among emotional communicative signals, some facial gestures are believed to be universal (Ekman and Friesen 1975). Facial gestures are of interest with respect to the evolution of communication because they also are found among animals (Darwin 1872; Hauser 1996). Facial gestures reflect the signalers' emotional state and are used to negotiate social interactions. Clinical studies have suggested that the right hemisphere plays a dominant role both in the processing and execution of facial expressions (Ahern et al. 1991; Blonder et al. 1991; Bowers et al. 1987, 1991; DeKosky et al. 1980; Etcoff 1984; Kolb and Taylor 1981; Ley and Bryden 1979). However, neural substrates for the assessment of facial emotion remain unclear. In the present study, we examined which brain regions are involved and whether there is a hemispheric asymmetry in the assessment of facial emotion, using positron emission tomography (PET).


    METHODS

Seven right-handed normal male volunteers (aged 19-25 yr) participated in this study. Written informed consent was obtained from each of the subjects. The stimuli used were colored frontal images of a female face against a uniform background of red, yellow, or blue. Before the experiments, seven male psychologists classified >200 faces into categories such as calm, happy, sad, angry, surprised, disgusted, or others. Only faces that all of the seven psychologists invariably classified as happy, calm, sad, or angry were used as stimuli. The face stimuli were presented on a head-mounted display at 0.5-s intervals. In the facial emotion (FE) test, each subject assessed each facial emotion as positive (i.e., happy), neutral (i.e., calm), or negative (i.e., sad or angry) and pressed a left, center, or right button with the thumb, index, or middle finger, respectively, of his right hand, within 2 s of the stimulus onset. The stimulus disappeared automatically when the subject pressed a button. We used two control tests. In the color discrimination (CD) test, each subject discriminated the background color; in the facial attractiveness (FA) test, each subject assessed each of the face stimuli as attractive, neutral, or unattractive. We preferred the FA test as the control test because this was a nonlinguistic and emotional test, and facial attractiveness itself does not convey communicative information. Furthermore, the assessment of facial attractiveness as well as facial emotion requires the subjects to pay attention to physiognomic material (Perrett et al. 1994). The faces used in both the control tests were categorized as calm faces. For each subject, a separate scan was performed for each test condition. The order of the three tests was varied among the subjects. All the face stimuli were unfamiliar to the subjects, and each face was presented only once to each subject to avoid the confounding effects of memory on neuronal activity. 
All subjects were instructed to look at the center of the image during the PET scans. Eye movements were measured and few, if any, saccadic eye movements were observed during each scan. There were no significant differences in the number of eye movements among the three test conditions.

Each subject was placed in a PET scanner (Siemens/CTI ECAT EXACT HR) (Wienhard et al. 1994) in a dark room (0.7 lux) during the experiment. Before the PET measurements, a transmission scan was performed using three rotating ⁶⁸Ge/⁶⁸Ga sources. This scan directly measured the attenuation coefficients, and the data were used to obtain corrected emission images. The emission scan was started immediately after the administration of a bolus injection of ~15 mCi (555 MBq) of H₂¹⁵O using the three-dimensional (3D) collection mode. Each experimental test and PET measurement started immediately after the bolus injection and continued for 120 s.
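The two activity units quoted for the tracer dose can be checked against each other; a one-line arithmetic check (an illustration added here, not part of the study), using the exact definition 1 mCi = 37 MBq:

```python
# Conversion between the two activity units quoted for the H2(15)O bolus.
# 1 Ci = 3.7e10 Bq exactly, so 1 mCi = 37 MBq.
MBQ_PER_MCI = 37.0
dose_mci = 15.0
dose_mbq = dose_mci * MBQ_PER_MCI
print(f"{dose_mci} mCi = {dose_mbq} MBq")  # 555.0 MBq, as quoted in the text
assert dose_mbq == 555.0
```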

Standard anatomic structures as incorporated in a human brain atlas system (Roland et al. 1994) were fitted interactively to high resolution magnetic resonance images (MRIs) of each subject using both linear and nonlinear parameters. These parameters were subsequently applied to transform PET images of rCBF into the standard brain anatomy. Statistical parametric mapping (SPM96, Wellcome Department of Cognitive Neurology, London) software was used for smoothing and statistical analysis (Friston et al. 1995). A 3D Gaussian filter of 20 mm was used. Differences in global flow were covaried out using analysis of covariance. Comparisons across test conditions were made by means of t-statistics, and thereafter transformed into normally distributed Z statistics. For each comparison, voxels with Z values >3.1 (P < 0.001, without correction for multiple comparisons) were considered to denote regions of significantly increased rCBF. Finally, each activation was superimposed onto the averaged transformed MRI of the seven subjects. Anatomic localization of the activated areas was made in relation to the mean reformatted MRI.
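The correspondence between the Z threshold of 3.1 and the uncorrected P < 0.001 criterion follows from the standard normal survival function; a minimal check (an illustration added here, assuming SciPy is available; not part of the original SPM96 analysis):

```python
from scipy.stats import norm

# One-tailed P value associated with the Z threshold used to flag
# voxels of significantly increased rCBF.
z_threshold = 3.1
p_uncorrected = norm.sf(z_threshold)  # survival function = 1 - CDF

print(f"P(Z > {z_threshold}) = {p_uncorrected:.6f}")
# P(Z > 3.1) is just under 0.001, matching the uncorrected threshold reported.
assert p_uncorrected < 0.001
```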


    RESULTS

Behavioral data are summarized in Table 1. The subjects were administered a greater number of trials (visual stimulations) in the CD test than in the FE (Mann-Whitney U test, P < 0.01) and FA (P < 0.05) tests. The reaction time was shorter in the CD test than in the FE and FA tests (P < 0.01). There were no significant differences between the results of the FE and FA tests. In the FE and CD tests, 81.4% (range 64-91%) and 96.4% (87-100%) of the responses were correct, respectively. The difference between the two was significant (P < 0.01). Because the assessment of facial attractiveness depends on each individual's personality, no response could be classified as "correct" in the FA test. All of these data indicate that the CD test was easier than the FE and FA tests, but that the number of visual stimulations and motor responses was well matched between the FE and FA tests.
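The behavioral comparisons above rest on the two-sample Mann-Whitney U test; a minimal sketch of such a comparison in Python, using SciPy and hypothetical per-subject mean reaction times (illustrative values only, not the data of Table 1):

```python
from scipy.stats import mannwhitneyu

# Hypothetical per-subject mean reaction times (ms) for 7 subjects;
# illustrative values only, chosen so that CD is clearly faster than FE.
rt_cd = [420, 450, 430, 410, 440, 425, 435]  # color discrimination test
rt_fe = [690, 720, 650, 700, 740, 680, 710]  # facial emotion test

# Two-sided Mann-Whitney U test, the nonparametric comparison used for
# the trial counts and reaction times in the behavioral analysis.
stat, p = mannwhitneyu(rt_cd, rt_fe, alternative="two-sided")
print(f"U = {stat}, P = {p:.4f}")
assert p < 0.01  # a difference at the level reported for CD vs. FE
```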


                              
Table 1. Behavioral data

The FE test elicited activation of two foci in the right inferior frontal cortex, the lateral occipital cortices of both sides, and the left orbitofrontal cortex in comparison with the CD test (Table 2, top). The activations in the right inferior frontal cortex (46, 32, -4) and the right lateral occipital cortex (34, -88, -6) remained significant after correction for multiple comparisons (**). The activation in the right inferior frontal cortex (46, 32, 0) was still significant when the activity during the FE test was compared with that during the FA test (Table 2, bottom). The activated area (*) was located within the triangular area of the inferior frontal cortex. Because only the activation in the right triangular area (Fig. 1) was significant (P < 0.001) in the two separate comparisons, we infer that the right triangular area plays a dominant role in the assessment of facial emotion. In the present study, the left inferior frontal cortex did not exhibit any significant activation during the assessment of facial emotion.


                              
Table 2. Regions of increased rCBF in the facial emotion test



Fig. 1. Positron emission tomography (PET) images of significant activation in the right inferior frontal cortex (triangular area) revealed by the subtraction of the activity in the color discrimination test from that in the facial emotion test. An activated area is superimposed on the mean MRI image produced from all 7 subjects. A: sagittal; B: coronal; C: transverse images. D: mean adjusted regional cerebral blood flow is plotted for the 3 test conditions (±SE). CD, color discrimination test; FA, facial attractiveness test; FE, facial emotion test.


    DISCUSSION

The present study provided two major findings. The first is that the right hemisphere was more active than the left during the processing of facial emotion. This finding is consistent with previous clinical observations (Ahern et al. 1991; Blonder et al. 1991; Bowers et al. 1987, 1991; DeKosky et al. 1980; Etcoff 1984; Heilman et al. 1975; Kolb and Taylor 1981; Ley and Bryden 1979; Ross 1981). The second is that the right inferior frontal cortex may be involved in the processing of emotional communicative signals based on facial gestures.

Recent studies in animals and humans have proposed the idea that the same mechanisms in the inferior frontal cortex might be activated during the observation and execution of actions (Gallese et al. 1996; Rizzolatti et al. 1996a,b), and such an "action observation/execution matching system" may be important for communication (Gallese and Goldman 1998; Rizzolatti and Arbib 1998). George et al. (1993) also reported inferior frontal cortex activation during matching of facial emotions. Some imaging studies have reported activations in the right inferior frontal cortex during the processing of prosodic cues of voice (Imaizumi et al. 1997; Kawashima et al. 1993; Zatorre et al. 1992). Hornak et al. (1996) reported that lesions of the ventral frontal cortex, which included both the orbitofrontal and inferior frontal cortices, affect the comprehension as well as the production of facial and vocal emotional expressions. In 10 of 12 patients, the right ventral frontal cortex was among the regions damaged. These data, together with the present observations, suggest that the right inferior frontal cortex processes emotional communicative information with either visual or auditory input. Interestingly, the right inferior frontal cortex is a mirror image of Broca's area in the left hemisphere. It has been proposed that the functional-anatomic organization of emotional communication in the right hemisphere mirrors that of propositional language in the left hemisphere (Ross 1981). There may be a hemispheric asymmetry in the inferior frontal cortex in relation to the processing of communicative information. Asymmetric activations of the frontal cortex for verbal and nonverbal information processing have also been reported during memory encoding and retrieval (Kelley et al. 1998; Wagner et al. 1998).
There may be multiple functional subregions for verbal and nonverbal information processing in the frontal cortex since the activated foci in these studies appear to be located posterior and dorsal to the area reported in the present study.

However, we do not claim that the right inferior frontal cortex processes only emotional communicative information, nor that this area is the only area involved in such processing. There is evidence that cortical areas outside the frontal region are involved in the recognition of facial emotions (Adolphs et al. 1996). The right inferior frontal cortex is also activated in other tasks related to language, such as metaphor comprehension (Bottini et al. 1994) and phonological working memory (Paulesu et al. 1993), as well as in a face matching task (Haxby et al. 1994). Given the reported activations of the right inferior frontal cortex during memory recall (Courtney et al. 1997; Fletcher et al. 1998; Haxby et al. 1996; Moscovitch et al. 1995; Wagner et al. 1998), the present results might indicate that the assessment of facial emotion involves matching the current facial gesture to templates or prototypes of facial emotions stored in the brain.

Human studies have suggested that the limbic system, in particular the amygdala, plays an important role in the recognition of facial emotions (Adolphs et al. 1994; Breiter et al. 1996; Morris et al. 1996, 1998; Phillips et al. 1997; Young et al. 1996). We did not find any specific increase in rCBF in the amygdala. Many previous studies have suggested the involvement of the limbic system in the processing of negative emotions, such as fear and disgust (Adolphs et al. 1994; Breiter et al. 1996; Morris et al. 1996, 1998; Phillips et al. 1997). The limbic structures may respond more to stimuli that directly induce strong emotional responses. In human social life, even slight changes in facial features can convey significant emotional communicative information. It is reasonable to assume that the human brain has developed areas for processing such communicative information.


    ACKNOWLEDGMENTS

We thank Dr. K. Tanaka for helpful comments on the earlier version of this manuscript.

This research was supported by Grants-in-Aid for Scientific Research on Priority Areas from the Ministry of Education, Science, Sports, and Culture of Japan (05206109, 08279203, and 09268215), JSPS-RFTF (97L00202), and the Fund for Comprehensive Research on Aging and Health from the Ministry of Welfare of Japan (96A1102).


    FOOTNOTES

Address reprint requests to: K. Nakamura

The costs of publication of this article were defrayed in part by the payment of page charges. The article must therefore be hereby marked "advertisement" in accordance with 18 U.S.C. Section 1734 solely to indicate this fact.

Received 12 February 1999; accepted in final form 21 May 1999.


    REFERENCES

0022-3077/99 $5.00 Copyright © 1999 The American Physiological Society