Center for Cognitive Neuroscience and Brain Imaging and Analysis Center, Duke University, Durham, NC 27708, USA
Abstract
Introduction
Behavioral studies have shown that humans are sensitive to temporal cues in facial displays. For example, subjects can temporally order scrambled sequences of videotaped emotional reactions, even when consecutive frames contain subtle transitions in facial expression (Edwards, 1998). In fact, performance improves under time constraints, suggesting that the extraction of dynamic features in facial expression occurs relatively automatically. In other circumstances, dynamic information contributes to face recognition abilities (Christie and Bruce, 1998; Lander et al., 1999) and judgements of facial affect (Bassili, 1978, 1979; Kamachi et al., 2001) and identity (Seamon, 1982; Hill and Johnston, 2001; Thornton and Kourtzi, 2002). As with other aspects of emotional perception, identification of expression changes may exhibit mood-congruent biases. Niedenthal and colleagues (Niedenthal et al., 2000) showed that participants induced into a sad or happy mood take longer than controls to detect a change in morphed expressions that slowly decrease in intensity of displayed sadness or happiness, respectively.
Motion cues may also dissociate perceptual abilities in patients with neurologic and developmental disorders. Humphreys et al. (Humphreys et al., 1993) reported a double dissociation in two patients relative to performance on facial affect and identity tasks. Prosopagnosic patient H.J.A., who sustained ventral occipitotemporal damage, had difficulties with both facial identity and expression judgements using static photographs. However, his performance improved when asked to categorize facial expressions using moving point-light displays. On the other hand, patient G.K., who sustained bilateral parietal lobe damage, had relatively good performance on facial identity tasks but was impaired at facial affect recognition using either static or dynamic cues. Children with psychopathic tendencies also present with selective impairments in identifying emotion from cinematic displays of slowly morphing expressions (Blair et al., 2001). In contrast, autistic children may benefit from slow dynamic information when categorizing emotional expressions (Gepner et al., 2001). This latter finding differs from other autistic deficits on motion-processing tasks that require faster temporal integration, including lip reading (de Gelder et al., 1991; Spencer et al., 2000). Finally, the perception of biological motion in Williams syndrome is spared relative to other aspects of motion perception and visuomotor integration (Jordan et al., 2002).
The neural substrates that mediate dynamic perception of emotional facial expressions are unknown. Previous neuroimaging studies have been limited to posed snapshots that lack temporal cues inherent in everyday socioemotional interactions. These studies have emphasized category-specific representations in the amygdala and associated frontolimbic structures by comparing responses to faces portraying basic emotions [e.g. (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1997, 1998; Sprengelmeyer et al., 1997; Whalen et al., 1998, 2001; Blair et al., 1999; Kesler-West et al., 2001)]. A separate line of research has revealed brain regions responsive to biological motion, including movement of face parts, but the role of emotion has been largely untested. Shifts of eye gaze, mouth movements and ambulation videos using animation or point-light displays elicit activity in the superior temporal sulcus and anatomically related areas [reviewed in Allison et al. (Allison et al., 2000)]. Some of these regions, such as the amygdala (Bonda et al., 1996; Kawashima et al., 1999), also participate in facial affect recognition, suggesting a potential link between dorsal stream processing of biological motion and ventral stream processing of emotional salience.
The present study was designed to integrate these literatures by investigating perception of negative affect using facial stimuli that varied in their dynamic properties. Prototypical expressions of fear and anger were morphed with neutral expressions of the same actors to form the impression that the actors were becoming scared or angry in real-time (Fig. 1). fMRI activation to the emotion morphs was contrasted with the static expressions. In addition, identity morphs were created that blended facial identities across pairs of actors with neutral expressions. This condition was included to evaluate the specificity of the results with respect to changes in facial affect versus identity, and to dissociate the signaling of biologically plausible from implausible motion. Hypotheses were made a priori about four brain regions: the amygdala, the superior temporal sulcus, visual motion area MT+ and the inferotemporal cortex (see Discussion).
Materials and Methods
Twelve healthy adults provided written informed consent to participate in the study. Two of these subjects were dropped due to excessive head movement (center-of-mass motion estimates >3.75 mm in x, y or z planes). The remaining 10 participants (five male, five female; age range = 21–30 years) were included in the statistical analysis. All participants were right-handed and were screened for history of neurologic and psychiatric illness and substance abuse. Procedures for human subjects were approved by the Institutional Review Board at Duke University.
Stimulus Development
Facial affect stimuli that are panculturally representative of basic emotions were taken from the Ekman series (Ekman and Friesen, 1976; Matsumoto and Ekman, 1989). Prototypical expressions of fear and anger were morphed with neutral expressions of the same actor to create the dynamic emotional stimuli. The expression change depicted in the morph always portrayed increasing emotional intensity (i.e. from neutral to 100% fear, or neutral to 100% anger). In addition, pairs of actors with neutral expressions were morphed to create dynamic changes in identity. Identity morphs always combined pairs of actors of the same gender and ethnicity. All actors in the emotion morphs were included in the identity morphs, and all actors in the static images were included in the dynamic stimuli. A subset of actors portrayed both fear and anger.

Emotion morphs were used instead of videotaped expressions to allow experimental control over the rate and duration of the changes, as in previous studies [e.g. (Niedenthal et al., 2000)]. Morphs were created using MorphMan 2000 software (STOIK, Moscow, Russia). All faces were initially cropped with an ovoid mask to exclude extraneous cues (hair, ears, neckline, etc.). The images were then normalized for luminance and contrast and presented against a mid-gray background. Approximately 150 fiducial markers were placed on each digital source image in the morph pair and individually matched by computer mouse to corresponding points on the target image. Areas of the face relevant for perceiving changes in identity and expression, such as the eyes, mouth, and corrugator and orbicularis oculi muscles, were densely sampled (Ekman and Friesen, 1978; Bassili, 1979). All expressions were posed with full frontal orientations (i.e. there were no changes in viewpoint either across or within morphs). Morphs were presented at a rate of 30 frames/s, consistent with previous studies (Thornton and Kourtzi, 2002). Forty-three frames were interpolated between the morph end points to provide smooth transitions across a 1500 ms duration. The final morph frame was presented for 200 ms for a total stimulus duration of 1700 ms. This duration approximates real-time changes of facial affect using videotaped expressions (Gepner et al., 2001). Morphs were saved in .avi format and displayed as movie clips. Static displays of 100% fear, 100% anger, and neutral expressions were taken from the first and last frames of the emotion and identity morph movies and were presented for the same total duration as the morphs. Figure 1 illustrates four frames of a neutral-to-fear morph. Complete examples of neutral-to-anger and identity morph movies can be found at http://www.mind.duke.edu/level2/faculty/labar/face_morphs.htm.
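The frame arithmetic above implies 45 total frames (the two end points plus 43 interpolated) at 30 frames/s, i.e. 1500 ms, followed by a 200 ms hold on the final frame. The Python sketch below illustrates this timing with a plain linear cross-dissolve; it is an illustration only, since MorphMan additionally warps the ~150 matched fiducial points rather than merely blending pixels, and the file paths are hypothetical.

```python
import numpy as np
from PIL import Image

FPS = 30                      # presentation rate (frames/s)
N_INTERP = 43                 # frames interpolated between the two end points
N_FRAMES = N_INTERP + 2       # 45 frames -> 1500 ms at 30 frames/s
HOLD_MS = 200                 # final frame held at full intensity
TOTAL_MS = N_FRAMES / FPS * 1000 + HOLD_MS  # 1500 + 200 = 1700 ms

def cross_dissolve(neutral_path, emotion_path):
    """Linear cross-dissolve from a neutral to a 100% emotional face.

    Simplified stand-in for feature-based morphing: MorphMan also warps
    matched fiducial points, whereas this only blends pixel intensities
    of two pre-aligned, masked images. Paths are hypothetical.
    """
    src = np.asarray(Image.open(neutral_path).convert("L"), dtype=float)
    dst = np.asarray(Image.open(emotion_path).convert("L"), dtype=float)
    frames = []
    for i in range(N_FRAMES):
        alpha = i / (N_FRAMES - 1)         # 0.0 (neutral) -> 1.0 (full emotion)
        blend = (1.0 - alpha) * src + alpha * dst
        frames.append(Image.fromarray(blend.astype(np.uint8)))
    return frames                          # 45 frames; show each for 1/30 s
```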
Experimental Design
Participants viewed 36 unique exemplars of each of four stimulus categories: static neutral, static emotional, dynamic neutral (identity morph) and dynamic emotional (emotion morph). Half of the emotional stimuli represented fear and half represented anger. Each exemplar was presented twice during the course of the experiment (total 72 stimuli of each category). Stimuli were presented in a pseudorandom event-related design, subject to the constraint that no more than two exemplars of each category were presented in a row to avoid mood induction effects. Faces were separated by a central fixation cross. The intertrial interval varied between 12 and 15 s (mean 13.5 s) to allow hemodynamic and psychophysiological responses to return to baseline levels between stimulus presentations (Fig. 1). The testing session was divided into eight runs of 8 min 24 s duration. Run order was counterbalanced across participants, and no stimuli were repeated within each half-session of four runs. Stimulus presentation was controlled by CIGAL software (Voyvodic, 1999) modified in-house to present video animations. Participants performed a three-alternative forced-choice categorical judgement in response to each face. Specifically, they used a three-button response box to indicate whether each face depicted an emotion morph (change in emotional expression), identity morph (change from one person to another), or static picture (no changes). Participants were told to respond whenever they could identify the category; speed of response was not emphasized. One example of each category was shown to the participants prior to entering the magnet to familiarize them with the stimuli.
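As a concrete illustration of the randomization constraint, the following sketch draws a pseudorandom order in which no category appears more than twice in succession and attaches a jittered intertrial interval drawn uniformly from 12–15 s (mean 13.5 s). The sampling scheme itself is an assumption; the study specifies only the constraint, not how the order was generated.

```python
import random

CATEGORIES = ["static_neutral", "static_emotional",
              "identity_morph", "emotion_morph"]

def make_sequence(n_per_category=72, max_run=2, seed=0):
    """Pseudorandom trial order in which no category occurs more than
    max_run times in a row, with a jittered 12-15 s intertrial interval."""
    rng = random.Random(seed)
    while True:                                   # restart on a rare dead end
        remaining = dict.fromkeys(CATEGORIES, n_per_category)
        trials = []
        while any(remaining.values()):
            options = [c for c in CATEGORIES if remaining[c] > 0]
            # Disallow a third consecutive trial of the same category
            if len(trials) >= max_run and len(set(trials[-max_run:])) == 1:
                options = [c for c in options if c != trials[-1]]
            if not options:
                break                             # stuck; reshuffle from scratch
            choice = rng.choice(options)
            trials.append(choice)
            remaining[choice] -= 1
        if len(trials) == n_per_category * len(CATEGORIES):
            return [(t, rng.uniform(12.0, 15.0)) for t in trials]

sequence = make_sequence()   # 288 (category, ITI) pairs; mean ITI ~13.5 s
```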
Imaging Parameters and Data Analysis
MR images were acquired on a 1.5 T General Electric Signa NVi scanner (Milwaukee, WI) equipped with 41 mT/m gradients. The subject's head was immobilized using a vacuum cushion and tape. The anterior (AC) and posterior commissures (PC) were identified in the mid-sagittal slice of a localizer series. Thirty-four contiguous slices were prescribed parallel to the AC–PC plane for high-resolution T1-weighted structural images [repetition time (TR) = 450 ms, echo time (TE) = 20 ms, field-of-view (FOV) = 24 cm, matrix = 256², slice thickness = 3.75 mm]. An additional series of T1-weighted structural images oriented perpendicular to the AC–PC plane was also acquired using the parameters specified above. Gradient echo echoplanar images sensitive to blood-oxygenation-level-dependent (BOLD) contrast were subsequently collected in the same transaxial plane as the initial set of T1-weighted structural images (TR = 3 s, TE = 40 ms, FOV = 24 cm, matrix = 64², flip angle = 90°, slice thickness = 3.75 mm, resulting in 3.75 mm³ isotropic voxels).
The fMRI data analysis utilized a voxel-based approach implemented in SPM99 (Wellcome Department of Cognitive Neurology, London, UK). Functional images were temporally adjusted for interleaved slice acquisition and realigned to the image taken proximate to the anatomic study using affine transformation routines. The realigned scans were coregistered to the anatomic scan obtained within each session and normalized to SPM's template image, which conforms to the Montreal Neurological Institute's standardized brain space and closely approximates Talairach and Tournoux's (Talairach and Tournoux, 1988) stereotaxic atlas. The functional data were high-pass filtered and spatially smoothed with an 8 mm isotropic Gaussian kernel prior to statistical analysis. The regressors for the time-series data were convolved with a canonical hemodynamic response profile and its temporal derivative as implemented in SPM99. Statistical contrasts were set up using a random-effects model to calculate signal differences between the conditions of interest. Statistical parametric maps were derived by applying linear contrasts to the parameter estimates for the events of interest, resulting in a t-statistic for every voxel. Then, group averages were calculated by employing pairwise t-tests on the resulting contrast images. This sequential approach accounts for intersubject variability and permits generalization to the population at large. Interaction terms were analyzed in subsequent pairwise t-tests after the main effect maps were calculated to avoid false positive activations from the baseline period in the control conditions. The resultant statistical parametric maps were thresholded at a voxelwise uncorrected P < 0.001 and a spatial extent of five contiguous voxels.
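Two of these analysis steps lend themselves to a compact illustration: building an event regressor by convolving stimulus onsets with a canonical hemodynamic response (plus its temporal derivative) and applying the voxelwise-plus-extent threshold. The sketch below uses the common SPM-style double-gamma shape and a five-voxel cluster filter; the parameter values and function names are illustrative rather than SPM99's actual implementation.

```python
import numpy as np
from scipy import ndimage
from scipy.stats import gamma

TR = 3.0  # s, matching the acquisition

def double_gamma_hrf(tr=TR, duration=32.0):
    """Canonical double-gamma HRF sampled at the TR: a positive response
    peaking ~5-6 s minus a smaller late undershoot (SPM-style default
    shape; parameter values here are illustrative)."""
    t = np.arange(0.0, duration, tr)
    hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0
    return hrf / hrf.sum()

def event_regressor(onsets_s, n_scans, tr=TR):
    """Delta train at stimulus onsets convolved with the canonical HRF;
    the temporal derivative absorbs small response-latency shifts."""
    stick = np.zeros(n_scans)
    stick[(np.asarray(onsets_s) / tr).astype(int)] = 1.0
    reg = np.convolve(stick, double_gamma_hrf(tr))[:n_scans]
    return reg, np.gradient(reg)

def threshold_map(tmap, t_crit, k=5):
    """Voxelwise cutoff (t corresponding to uncorrected P < 0.001) plus a
    spatial-extent filter keeping clusters of >= k contiguous voxels."""
    mask = tmap > t_crit
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = np.arange(1, n + 1)[sizes >= k]
    return np.where(np.isin(labels, keep), tmap, 0.0)
```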
Results
fMRI Activation to Emotion Morphs
Compared to static emotional expressions, emotion morphs elicited responses along a bilateral frontotemporal circuit, including ventrolateral prefrontal cortex, substantia innominata, amygdala, parahippocampal gyrus, and fusiform gyrus (Fig. 2 and Table 1). The activations in this contrast were predominantly restricted to ventral brain regions, with some additional dorsal activity in the dorsomedial prefrontal cortex, left precentral sulcus, right intraparietal sulcus, and putative visual motion area MT+. This contrast reveals brain regions whose responses were enhanced by dynamic changes in negative facial affect over and above their responses to the same expressions presented statically.
Relative to static neutral expressions, identity morphs elicited responses along both dorsal and ventral processing streams (Fig. 2 and Table 4). The dorsal circuit included the dorsomedial prefrontal cortex, precentral sulcus, intraparietal sulcus, caudate nucleus, thalamus and visual area MT+. The ventral regions included the inferior frontal gyrus, amygdala, and fusiform gyrus. Most of the activations were bilateral.
A formal analysis was conducted to determine which brain regions showed an interaction between the emotion and motion factors in the experimental design (Table 6). A double-subtraction procedure was employed to compare the magnitude of the motion effect (dynamic versus static) across the facial expression categories (emotional versus neutral). For this analysis, fear and anger expressions were combined. Brain regions that were more sensitive to motion within the emotional expressions (compared to the neutral expressions) included the fusiform gyrus, anterior cingulate gyrus/ventromedial prefrontal cortex, the superior temporal gyrus and middle frontal gyrus. Brain regions that were more sensitive to motion within the neutral expressions (compared to the emotional expressions) included the inferior temporal gyrus/posterior fusiform gyrus, intraparietal sulcus, basal ganglia, dorsal anterior cingulate gyrus and lateral inferior frontal gyrus. These results are nearly identical to the activations shown in the simple effects analysis where the emotion morphs and identity morphs were directly contrasted against each other (Fig. 2 and Table 5).
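In contrast-vector terms, this double subtraction is the interaction weight pattern applied to the four condition estimates, as in the minimal sketch below (the column ordering is assumed for illustration):

```python
import numpy as np

# Assumed ordering of the four condition parameter estimates (betas):
# [dynamic_emotional, static_emotional, dynamic_neutral, static_neutral]
interaction = np.array([1.0, -1.0, -1.0, 1.0])
# = (dynamic - static) for emotional minus (dynamic - static) for neutral

def interaction_effect(betas):
    """Per-voxel double-subtraction value; positive where motion
    sensitivity is greater for emotional expressions, negative where it
    is greater for neutral expressions."""
    return betas @ interaction
```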
Brain regions sensitive to facial motion cues across emotional expression and identity changes were identified by a main effects analysis (Table 7). For this analysis, fear and anger expressions were combined. The results showed significant motion-related activity in six brain regions: visual area MT+, amygdala, inferior frontal gyrus, dorsomedial prefrontal cortex, intraparietal sulcus and caudate nucleus. All activations were bilateral except the caudate nucleus, which was left-sided.
A within-subjects ANOVA computed on behavioral accuracy data revealed a significant interaction between factors of emotion (fear, anger, neutral) and motion (dynamic, static), F(2,18) = 14.68, P < 0.001. Main effects of emotion [F(2,18) = 11.65, P < 0.001] and motion [F(2,18) = 48.69, P < 0.0001] were also found. Overall, participants were more accurate in identifying static images than dynamic images. Post hoc t-tests showed that across the static images, accuracy for anger (96 ± 1%) was worse than that for fear (98 ± 1%) or neutral (98 ± 1%) expressions, which did not significantly differ from each other. However, these data are potentially confounded by ceiling effects. Across the dynamic images, accuracy was worse for identity morphs (38 ± 4%) than either anger (64 ± 8%) or fear (63 ± 8%) morphs, which did not significantly differ from each other.
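For reference, the reported design corresponds to a two-way repeated-measures ANOVA on per-subject accuracy. A minimal sketch using statsmodels is shown below; statsmodels is an illustrative choice (the analysis software is not specified), and the accuracy values are random placeholders, since individual-subject data are not tabulated.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Placeholder accuracies for illustration only: 10 subjects x 3 emotions
# (fear, anger, neutral) x 2 motion levels (dynamic, static).
rng = np.random.default_rng(0)
rows = [{"subject": s, "emotion": e, "motion": m,
         "accuracy": rng.uniform(0.3, 1.0)}
        for s in range(1, 11)
        for e in ("fear", "anger", "neutral")
        for m in ("dynamic", "static")]
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="accuracy", subject="subject",
              within=["emotion", "motion"]).fit()
print(res)  # F statistics for emotion, motion and their interaction
```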
Behavioral Accuracy and Brain Activation
Inspection of individual subject data revealed that 6 of the 10 participants were less accurate in their recognition judgements of the identity morphs relative to the emotion morphs. We conducted a post hoc analysis of the identity morph versus emotion morph contrast (Table 5 and Fig. 2) to determine any potential influence of behavioral accuracy on the statistical parametric maps. Participants were subdivided into two groups based on behavioral performance: those for whom accuracy was equated across the morph categories (n = 4) and those for whom accuracy was worse for the identity morphs (n = 6). Because of the small sample sizes, we used both conservative (P < 0.001 uncorrected) and liberal (P < 0.05 uncorrected) threshold criteria for determining statistical significance in the identity morph versus emotion morph contrasts from each subgroup. A t-test was then computed across groups and thresholded at P < 0.001 uncorrected. The only brain region that showed differential activation in the identity morph versus emotion morph contrast as a function of behavioral accuracy was the right inferior frontal gyrus (BA 44). This area was more engaged to the identity morphs by the group with poorer performance, perhaps reflecting cognitive effort. However, this brain region emerged only at the more liberal statistical cutoff.
Discussion
Role of the Amygdala
The amygdala has been a focus of investigation in facial affect perception, yet its exact function remains debated. Several neuroimaging studies have reported amygdala activation to fearful faces (Breiter et al., 1996; Phillips et al., 1997, 1998; Whalen et al., 1998, 2001), but others have failed to replicate these results (Sprengelmeyer et al., 1997; Kesler-West et al., 2001; Pine et al., 2001). The amygdala's response to fearful faces may be enhanced when the expression is caricatured (Morris et al., 1996, 1998), under conditions of full attention [(Pessoa et al., 2002); but see (Vuilleumier et al., 2001)], or when judgements involve the simultaneous presentation of multiple face exemplars (Hariri et al., 2000; Vuilleumier et al., 2001). Whereas some studies show fear specificity (Morris et al., 1996, 1998; Phillips et al., 1997, 1998; Whalen et al., 1998, 2001), others report generalization to other emotion categories, including happiness (Breiter et al., 1996; Kesler-West et al., 2001; Canli et al., 2002; Pessoa et al., 2002). Amygdala activation to faces further varies across subjects according to social (Hart et al., 2000; Phelps et al., 2000), personality (Canli et al., 2002) and genetic (Hariri et al., 2002) factors.
All of these studies presented posed facial displays devoid of temporal cues integral to real-life social exchanges. In the present study, we have shown that the amygdala's activity is enhanced by faces containing dynamic information relative to static snapshots of the same faces. Consistent with our hypotheses, the amygdala responded more to emotion morphs than to static emotional faces, especially for fearful expressions. Direct comparisons showed a specificity for fear over anger in the dynamic morphs but not in the static images. The category specificity in amygdala processing of dynamic facial displays cannot be attributed to other potential confounding features across the faces. Facial identity was the same across the dynamic and static images, and the morphing procedure allowed experimental control over the rate, intensity, duration and spatial orientation of the expression changes. Dynamic stimuli may be more effective in engaging the amygdala during face perception tasks and can potentially clarify the extent to which its responses exhibit category specificity.
Surprisingly, the amygdala also responded to dynamic changes in facial identity that were emotionally neutral. The intensity of amygdala activation to identity morphs was indistinguishable from that to the emotion morphs, even when the analyses were restricted to fear. We speculate that morphed stimuli containing rapid, artificial changes in facial identity elicit amygdala activity due to their threat or novelty value. In evolution, camouflage and other means of disguising identity have been effectively employed as deception devices in predator–prey interactions (Mitchell, 1993). It is possible that rapid changes in identity are interpreted by the amygdala as potentially threatening and consequently engage the body's natural defense reactions. Alternatively, the amygdala may play a broader role in the perception of facial motion beyond that involved in emotional expression. The amygdala is known to participate in eye gaze and body movement perception (Brothers et al., 1990), even when the stimuli have no apparent emotional content (Young et al., 1995; Bonda et al., 1996; Kawashima et al., 1999). This account, though, does not explain why the amygdala activation was stronger for fear and identity morphs relative to anger morphs.
Parametric studies have indicated that the amygdala codes the intensity of fear on an expressor's face [(Morris et al., 1996, 1998); but see (Phillips et al., 1997)]. However, it is not clear whether intense emotional expressions recruit amygdala processing because of perceptual, experiential or cognitive factors. Most imaging studies on facial affect perception have used blocked-design protocols that further complicate interpretation of results in this regard. The application of dynamic stimuli may help distinguish between these potential underlying mechanisms. In the present study, the emotion on the actor's face did not reach full intensity until the last frame of each morph movie, whereas full intensity was continuously expressed in the static images. Thus, the intensity of expressed emotion in the morphs was, on average, only half of that portrayed in the static displays. If the amygdala simply codes the perceived intensity of fear in the expressor's face, one would expect more activation for the static than dynamic stimuli. The statistical parametric maps do not support this interpretation. Alternatively, the amygdala's response may shift in latency to the time point at which full intensity was expressed. The temporal resolution of fMRI may be insufficient to reveal this possibility.
Neural processing in the amygdala may relate to abstract or cognitive representations of fear (Phelps et al., 2001). Previous studies have demonstrated that the amygdala's response to sensory stimuli depends upon contextual cues and evaluative processes. For instance, amygdala activity to facial expressions increases in a mood-congruent fashion (Schneider et al., 1997), and amygdala activation to neutral faces is greater in social phobics, who may interpret these stimuli as threatening (Birbaumer et al., 1998). The amygdala is also engaged by neutral stimuli that signal possible or impending threatening events, as in fear conditioning (Büchel et al., 1998; LaBar et al., 1998) and anticipatory anxiety (Phelps et al., 2001). Rapid changes in fear intensity (and perhaps facial identity) may indicate an imminent source of threat in the environment, which recruits amygdala activity as a trigger for the defense/vigilance system (Whalen, 1998; Fendt and Fanselow, 1999; LaBar and LeDoux, 2001). Further work is needed to elucidate the contributions of the amygdala to perceptual, experiential and cognitive aspects of fear.
The present results may partly explain why patients with amygdala lesions often exhibit deficits in judging the intensity of fear in facial displays. These patients sometimes have sufficient semantic knowledge to label fear, but they typically underestimate the degree of fear expressed in posed displays [(Adolphs et al., 1994; Young et al., 1995; Calder et al., 1996; Broks et al., 1998; Anderson et al., 2000; Anderson and Phelps, 2000); but see (Hamann et al., 1996)]. Fear recognition tasks require perceptual judgements of the extent of physiognomic change in the face relative to less fearful states and/or a canonical template of fear. If the amygdala codes dynamic cues in facial displays, these patients may have difficulty using kinetic information in face snapshots to make intensity judgements without additional contextual cues. Cognitive or experiential aspects of fear may also contribute to performance on recognition tasks. Testing amygdala-lesioned patients with dynamic stimuli may help determine the specific mechanism underlying their deficit and could potentially reduce the variability in performance across individual patients (Adolphs et al., 1999).
Role of the Superior Temporal Sulcus (STS) and Associated Regions
Electrophysiological and brain imaging studies in humans and monkeys have implicated the cortex in and surrounding the banks of the STS in the social perception of biological motion [reviewed in Allison et al. (Allison et al., 2000)]. Biological motion associated with eye gaze direction (Puce et al., 1998; Wicker et al., 1998; Hoffman and Haxby, 2000), mouth movements (Calvert et al., 1997; Puce et al., 1998; Campbell et al., 2001), and hand and body action sequences (Bonda et al., 1996; Howard et al., 1996; Grèzes et al., 1998; Neville et al., 1998; Grossman et al., 2000) engage the STS, particularly in its mid-to-posterior aspect. The present study extends the known role of the STS to the perception of dynamic changes in facial expression. The posterior aspect of the STS region responsive to the emotion morphs overlaps with the areas implicated in these previous studies. Activity of STS neurons in monkeys is evoked by static pictures of grimaces, yawns, threat displays and other expressions relevant for socioemotional interactions with conspecifics (Perrett et al., 1985, 1992; Hasselmo et al., 1989; Brothers and Ring, 1993). These images potentially recruit neural processing in the monkey STS because of implied motion in the expressions (Freyd, 1987; Allison et al., 2000; Kourtzi and Kanwisher, 2000; Senior et al., 2000). This may also explain why the STS was not activated in the emotion morph versus static emotion contrasts.
Interestingly, the STS preferentially signaled dynamic changes in facial expression relative to dynamic changes in facial identity. Given that rapid changes in facial identity do not exist in the physical world, this result supports the hypothesis that the STS distinguishes biologically plausible from biologically implausible or non-biological motion (Allison et al., 2000). In this regard, the STS is dissociated from visual motion area V5/MT+, which is situated just posterior and ventral to the banks of the STS (Zeki et al., 1991; McCarthy et al., 1995; Tootell et al., 1995; Dumoulin et al., 2000; Kourtzi and Kanwisher, 2000; Huk et al., 2002). As predicted, area MT+ was activated by all dynamic stimuli and did not prefer emotion over identity morphs. To our knowledge, this is the first demonstration of responses in area MT+ to dynamic facial expressions in humans. Differential STS processing of emotion morphs may alternatively relate to specific aspects of facial motion present in these stimuli. At a more relaxed statistical criterion, the posterior STS did discriminate fear from anger morphs, which involve distinct facial actions (Ekman and Friesen, 1978). Further research is needed to evaluate whether this STS activity is related to specific facial actions or reflects modulation by the amygdala or other limbic structures.
Multimodal portions of the STS integrate form and motion through anatomic links with both ventral and dorsal visual areas (Rockland and Pandya, 1981; Desimone and Ungerleider, 1986; Oram and Perrett, 1996). In turn, the STS is interconnected with the prefrontal cortex in a gradient from ventral to dorsal regions as one proceeds along its rostrocaudal extent (Petrides and Pandya, 1988). The STS is connected to limbic and paralimbic regions, such as the amygdala and cingulate gyrus, via direct projections (Herzog and Van Hoesen, 1976; Pandya et al., 1981; Amaral et al., 1992) and through dorsal frontoparietal and temporopolar interfaces (Barbas and Mesulam, 1981; Cavada and Goldman-Rakic, 1989; Petrides and Pandya, 1988, 1999; Morecraft et al., 1993). Components of this frontotemporolimbic circuit, including the medial fusiform gyrus, rostral area 8, medial prefrontal cortex/ventral anterior cingulate gyrus and temporopolar cortex/anterior STS, also distinguished emotion morphs from identity morphs in concert with the posterior STS.
Role of the Inferotemporal Cortex
Dissociable regions within the inferior temporal and fusiform gyri signaled dynamic changes in facial identity versus facial expression: the anteromedial fusiform gyrus for expression changes and the posterolateral inferotemporal cortex (inferior temporal gyrus and posterior fusiform gyrus) for identity changes. Anatomically segregated processing was also found across the superior and inferior temporal neocortex for facial affect and identity, respectively. Such regional specificity may account for the variability in performance across these two domains of face recognition in prosopagnosics with varying locations and extents of brain damage (Hasselmo et al., 1989; Humphreys et al., 1993; Haxby et al., 2000). Portions of the fusiform gyrus exhibited category specificity for fear over anger morphs, perhaps due to modulatory feedback from limbic structures such as the amygdala (Amaral et al., 1992). Previous imaging studies have shown enhanced fusiform gyrus activity for fearful expressions (Breiter et al., 1996; Sprengelmeyer et al., 1997; Pessoa et al., 2002). As revealed by connectivity modeling, the amygdala interacts with various sectors along the ventral visual stream during facial affect perception tasks (Morris et al., 1998; Pessoa et al., 2002).
Computational models and single-cell recordings in monkeys support a role for the inferior temporal gyrus in neural coding of facial identity independent of facial affect (Hasselmo et al., 1989; Haxby et al., 2000). Inferotemporal activity in the present study may reflect dual coding of the identities present within the morph, since this area is hypothesized to participate in the structural encoding of faces (Kanwisher et al., 1997; Allison et al., 1999). This possibility could be confirmed in electrophysiological experiments with high temporal resolution. Importantly, the morph stimuli were created with smooth transitions between frames and presented at a rate that avoided strobing effects, which potentially engender recoding of each face in successive frames. Alternatively, face processing along the inferotemporal cortex may be subject to attentional modulation. Campbell et al. (Campbell et al., 2001) found greater activity in inferotemporal cortex during viewing of meaningless facial actions (gurning) relative to meaningful facial actions of speech-reading. These authors also postulated an attentional account for their results.
Limitations and Future Directions
The present study was limited in three primary ways. First, only morphed images of fear and anger were presented. It is unknown whether the results extend to other emotional expression categories. The creation of morphed stimuli is time-consuming and inclusion of all categories is difficult to accommodate within a single event-related fMRI paradigm. Future studies should compare morphed images of fear and anger to other expressions to determine the specificity of the present results. Secondly, only incremental emotional expression changes were presented. Future studies should compare incremental versus decremental changes in fear and anger to determine the sensitivity of the brain regions to directional aspects of morphed expressions. Finally, the individuals in the present study were less accurate in their categorical recognition of dynamic relative to static images. Although the statistical analysis of accuracy revealed only one brain area that may reflect cognitive effort on the task (BA 44), this analysis may have been underpowered due to sample size constraints. Future studies should compare activation to dynamic and static facial expressions under experimental conditions in which task performance is equated and/or unrelated to the primary experimental manipulation (e.g. during gender judgements).
Conclusion
Notes
Address correspondence to Kevin S. LaBar, Center for Cognitive Neuroscience, Box 90999, Duke University, Durham, NC 27708, USA. Email: kevin.labar@duke.edu.
References
Adolphs R, Tranel D, Damasio H, Damasio A (1994) Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature 372:669–672.

Adolphs R, Tranel D, Hamann SB, Young A, Calder A, Anderson AK et al. (1999) Recognition of facial emotion in nine subjects with bilateral amygdala damage. Neuropsychologia 37:1111–1117.

Allison T, Puce A, Spencer DD, McCarthy G (1999) Electrophysiological studies of human face perception. I: Potentials generated in occipitotemporal cortex by face and non-face stimuli. Cereb Cortex 9:415–430.

Allison T, Puce A, McCarthy G (2000) Social perception from visual cues: role of the STS region. Trends Cogn Sci 4:267–278.

Amaral DG, Price JL, Pitkänen A, Carmichael ST (1992) Anatomical organization of the primate amygdaloid complex. In: The amygdala: neurobiological aspects of emotion, memory, and mental dysfunction (Aggleton JP, ed.), pp. 1–66. New York: Wiley-Liss.

Anderson AK, Phelps EA (2000) Expression without recognition: contributions of the human amygdala to emotional communication. Psychol Sci 11:106–111.

Anderson AK, Spencer DD, Fulbright RK, Phelps EA (2000) Contribution of the anteromedial temporal lobes to the evaluation of facial emotion. Neuropsychology 14:526–536.

Barbas H, Mesulam M-M (1981) Organization of afferent input to subdivisions of area 8 in the rhesus monkey. J Comp Neurol 200:407–431.

Bassili JN (1978) Facial motion in the perception of faces and of emotional expression. J Exp Psychol Hum Percept Perform 4:373–379.

Bassili JN (1979) Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face. J Pers Soc Psychol 37:2049–2058.

Birbaumer N, Grodd W, Diedrich O, Klose U, Erb M, Lotze M et al. (1998) fMRI reveals amygdala activation to human faces in social phobics. Neuroreport 9:1223–1226.

Blair RJR, Morris JS, Frith CD, Perrett DI, Dolan RJ (1999) Dissociable neural responses to facial expressions of sadness and anger. Brain 122:883–893.

Blair RJR, Colledge E, Murray L, Mitchell DGV (2001) A selective impairment in the processing of sad and fearful expressions in children with psychopathic tendencies. J Abnorm Child Psychol 29:491–498.

Bonda E, Petrides M, Ostry D, Evans A (1996) Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J Neurosci 16:3737–3744.

Breiter HC, Etcoff NL, Whalen PJ, Kennedy WA, Rauch SL, Buckner RL et al. (1996) Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17:875–887.

Broks P, Young AW, Maratos EJ, Coffey PJ, Calder AJ, Isaac C et al. (1998) Face processing impairments after encephalitis: amygdala damage and recognition of fear. Neuropsychologia 36:59–70.

Brothers L, Ring B (1993) Mesial temporal neurons in the macaque monkey with responses selective for aspects of social stimuli. Behav Brain Res 57:53–61.

Brothers L, Ring B, Kling A (1990) Response of neurons in the macaque amygdala to complex social stimuli. Behav Brain Res 41:199–213.

Büchel C, Morris JS, Dolan RJ, Friston KJ (1998) Brain systems mediating aversive conditioning: an event-related fMRI study. Neuron 20:947–957.

Calder AJ, Young AW, Rowland D, Perrett DI, Hodges JR, Etcoff NL (1996) Facial emotion recognition after bilateral amygdala damage: differentially severe impairment of fear. Cogn Neuropsychol 13:699–745.

Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SCR, McGuire PW et al. (1997) Activation of auditory cortex during silent lipreading. Science 276:593–596.

Campbell R, MacSweeney M, Surguladze S, Calvert G, McGuire P, Suckling J et al. (2001) Cortical substrates for the perception of face actions: an fMRI study of the specificity of activation for seen speech and for meaningless lower-face acts (gurning). Cogn Brain Res 12:233–243.

Canli T, Sivers H, Whitfield SL, Gotlib IH, Gabrieli JDE (2002) Amygdala response to happy faces as a function of extraversion. Science 296:2191.

Cavada C, Goldman-Rakic PS (1989) Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe. J Comp Neurol 287:422–445.

Christie F, Bruce V (1998) The role of dynamic information in the recognition of unfamiliar faces. Mem Cognit 26:780–790.

de Gelder B, Vroomen J, van der Heide L (1991) Face recognition and lip-reading in autism. Eur J Cogn Psychol 3:69–86.

Desimone R, Ungerleider LG (1986) Multiple visual areas in the caudal superior temporal sulcus of the macaque. J Comp Neurol 248:164–189.

Dumoulin SO, Bittar RG, Kabani NJ, Baker CL Jr, LeGoualher G, Pike BG et al. (2000) A new anatomical landmark for reliable identification of human area V5/MT: a quantitative analysis of sulcal patterning. Cereb Cortex 10:454–463.

Edwards K (1998) The face of time: temporal cues in facial expressions of emotion. Psychol Sci 9:270–276.

Ekman P, Friesen WV (1976) Measuring facial movement. Environ Psychol Nonverb Behav 1:56–75.

Ekman P, Friesen WV (1978) The facial action coding system. Palo Alto, CA: Consulting Psychologists Press.

Ekman P, Friesen WV (1982) Felt, false, and miserable smiles. J Nonverb Behav 6:238–252.

Fendt M, Fanselow MS (1999) The neuroanatomical and neurochemical basis of conditioned fear. Neurosci Biobehav Rev 23:743–760.

Freyd JJ (1987) Dynamic mental representations. Psychol Rev 94:427–438.

Gepner B, Deruelle C, Grynfeltt S (2001) Motion and emotion: a novel approach to the study of face processing by young autistic children. J Autism Dev Disord 31:37–45.

Gloor P (1997) The temporal lobe and limbic system. New York: Oxford University Press.

Grèzes J, Costes N, Decety J (1998) Top-down effect of strategy on the perception of human biological motion: a PET investigation. Cogn Neuropsychol 15:553–582.

Grossman E, Donnelly M, Price R, Pickens D, Morgan V, Neighbor G et al. (2000) Brain areas involved in perception of biological motion. J Cogn Neurosci 12:711–720.

Hamann SB, Stefanacci L, Squire LR, Adolphs R, Tranel D, Damasio H, Damasio A (1996) Recognizing facial emotion. Nature 379:497.

Hariri AR, Bookheimer SY, Mazziotta JC (2000) Modulating emotional responses: effects of a neocortical network on the limbic system. Neuroreport 11:43–48.

Hariri AR, Mattay VS, Tessitore A, Kolachana B, Fera F, Goldman D et al. (2002) Serotonin transporter genetic variation and the response of the human amygdala. Science 297:400–403.

Hart AJ, Whalen PJ, Shin LM, McInerney SC, Fischer H, Rauch SL (2000) Differential response in the human amygdala to racial outgroup vs. ingroup face stimuli. Neuroreport 11:2351–2355.

Hasselmo ME, Rolls ET, Baylis GC (1989) The role of expression and identity in the face-selective responses of neurons in the temporal visual cortex of the monkey. Behav Brain Res 32:203–218.

Haxby JV, Hoffman EA, Gobbini MI (2000) The distributed human neural system for face perception. Trends Cogn Sci 4:223–233.

Herzog AW, Van Hoesen GW (1976) Temporal neocortical afferent connections to the amygdala in the rhesus monkey. Brain Res 115:57–69.

Hess U, Kleck RE (1997) Differentiating emotion elicited and deliberate emotional facial expressions. In: What the face reveals: basic and applied studies of spontaneous expression using the facial action coding system (FACS) (Ekman P, Rosenberg EL, eds), pp. 271–286. New York: Oxford University Press.

Hill H, Johnston A (2001) Categorizing sex and identity from the biological motion of faces. Curr Biol 11:880–885.

Hoffman EA, Haxby JV (2000) Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nat Neurosci 3:80–84.

Howard RJ, Brammer M, Wright I, Woodruff PW, Bullmore ET, Zeki S (1996) A direct demonstration of functional specialization within motion-related visual and auditory cortex of the human brain. Curr Biol 6:1015–1019.

Huk AC, Dougherty RF, Heeger DJ (2002) Retinotopy and functional subdivision of human areas MT and MST. J Neurosci 22:7195–7205.

Humphreys GW, Donnelly N, Riddoch MJ (1993) Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia 31:173–181.

Jordan H, Reiss JE, Hoffman JE, Landau B (2002) Intact perception of biological motion in the face of profound spatial deficits: Williams syndrome. Psychol Sci 13:162–167.

Kamachi M, Bruce V, Mukaida S, Gyoba J, Yoshikawa S, Akamatsu S (2001) Dynamic properties influence the perception of facial expressions. Perception 30:875–887.

Kanwisher N, McDermott J, Chun MM (1997) The fusiform face area: a module in human extrastriate cortex specialized for face perception. J Neurosci 17:4302–4311.

Kawashima R, Sugiura M, Kato T, Nakamura A, Hatano K, Ito K et al. (1999) The human amygdala plays an important role in gaze monitoring. Brain 122:779–783.

Kesler-West ML, Andersen AH, Smith CD, Avison MJ, Davis CE, Kryscio RJ et al. (2001) Neural substrates of facial emotion processing using fMRI. Cogn Brain Res 11:213–226.

Kourtzi Z, Kanwisher N (2000) Representation of perceived object shape by the human lateral occipital cortex. Science 283:1506–1509.

LaBar KS, LeDoux JE (2001) Coping with danger: the neural basis of defensive behaviors and fearful feelings. In: Handbook of physiology, section 7: the endocrine system, Vol. IV: coping with the environment: neural and endocrine mechanisms (McEwen BS, ed.), pp. 139–154. New York: Oxford University Press.

LaBar KS, Gatenby JC, Gore JC, LeDoux JE, Phelps EA (1998) Human amygdala activation during conditioned fear acquisition and extinction: a mixed-trial fMRI study. Neuron 20:937–945.

Lander K, Christie F, Bruce V (1999) The role of movement in the recognition of famous faces. Mem Cognit 27:974–985.

Matsumoto D, Ekman P (1989) American–Japanese cultural differences in intensity ratings of facial expressions of emotion. Motiv Emot 13:143–157.

McCarthy G, Spicer M, Adrignolo A, Luby M, Gore J, Allison T (1995) Brain activation associated with visual motion studied by functional magnetic resonance imaging in humans. Hum Brain Mapp 2:234–243.

Mitchell RW (1993) Animals as liars: the human face of nonhuman duplicity. In: Lying and deception in everyday life (Lewis M, Saarni C, eds), pp. 59–89. New York: Guilford Press.

Morecraft RJ, Geula C, Mesulam M-M (1993) Architecture of connectivity within a cingulofrontoparietal neurocognitive network. Arch Neurol 50:279–284.

Morris JS, Frith CD, Perrett DI, Rowland D, Young AW, Calder AJ et al. (1996) A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383:812–815.

Morris JS, Friston KJ, Büchel C, Frith CD, Young AW, Calder AJ et al. (1998) A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121:47–57.

Neville HJ, Bavelier D, Corina D, Rauschecker J, Karni A, Lalwani A et al. (1998) Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proc Natl Acad Sci USA 95:922–929.

Niedenthal PM, Halberstadt JB, Margolin J, Innes-Ker Å-H (2000) Emotional state and the detection of change in facial expression of emotion. Eur J Soc Psychol 30:211–222.

Oram MW, Perrett DI (1996) Integration of form and motion in the anterior superior temporal polysensory area (STPa) of the macaque monkey. J Neurophysiol 76:109–129.

Pandya DN, Van Hoesen GW, Mesulam M-M (1981) Efferent connections of the cingulate gyrus in the rhesus monkey. Exp Brain Res 42:319–330.

Perrett DI, Smith PAJ, Potter DD, Mistlin AJ, Head AS, Milner AD et al. (1985) Visual cells in the temporal cortex sensitive to face view and gaze direction. Proc R Soc Lond B 223:293–317.

Perrett DI, Hietanen JK, Oram MW, Benson PJ (1992) Organization and functions of cells responsive to faces in the temporal cortex. Phil Trans R Soc Lond B 335:23–30.

Pessoa L, McKenna M, Gutierrez E, Ungerleider LG (2002) Neural processing of emotional faces requires attention. Proc Natl Acad Sci USA 99:11458–11463.

Petrides M, Pandya DN (1988) Association fiber pathways to the frontal cortex from the superior temporal region in the rhesus monkey. J Comp Neurol 273:52–66.

Petrides M, Pandya DN (1999) Dorsolateral prefrontal cortex: comparative cytoarchitectonic analysis in the human and the macaque brain and corticocortical connection patterns. Eur J Neurosci 11:1011–1036.

Phelps EA, O'Connor KJ, Cunningham WA, Funayama ES, Gatenby JC, Gore JC et al. (2000) Performance on indirect measures of race evaluation predicts amygdala activation. J Cogn Neurosci 12:729–738.

Phelps EA, O'Connor KJ, Gatenby JC, Gore JC, Grillon C, Davis M (2001) Activation of the left amygdala to a cognitive representation of fear. Nat Neurosci 4:437–441.

Phillips ML, Young AW, Senior C, Brammer M, Andrew C, Calder AJ et al. (1997) A specific neural substrate for perceiving facial expressions of disgust. Nature 389:495–498.

Phillips ML, Young AW, Scott SK, Calder AJ, Andrew C, Giampietro V et al. (1998) Neural responses to facial and vocal expressions of fear and disgust. Proc R Soc Lond B 265:1809–1817.

Pine DS, Szeszko PR, Bilder RM, Ardekani B, Grun J, Zarahn E et al. (2001) Cortical brain regions engaged by masked emotional faces in adolescents and adults: an fMRI study. Emotion 1:137–147.

Puce A, Allison T, Bentin S, Gore JC, McCarthy G (1998) Temporal cortex activation in humans viewing eye and mouth movements. J Neurosci 18:2188–2199.

Rockland KS, Pandya DN (1981) Cortical connections of the occipital lobe in the rhesus monkey: interconnections between areas 17, 18, 19 and the superior temporal sulcus. Brain Res 212:249–270.

Schneider F, Grodd W, Weiss U, Klose U, Mayer KR, Nagele T et al. (1997) Functional MRI reveals left amygdala activation during emotion. Psychiatr Res 76:75–82.

Seamon JG (1982) Dynamic facial recognition: examination of a natural phenomenon. Am J Psychol 85:363–381.

Senior C, Barnes J, Giampietro V, Simmons A, Bullmore ET, Brammer M et al. (2000) The functional neuroanatomy of implicit-motion perception or representational momentum. Curr Biol 10:16–22.

Spencer J, O'Brien J, Riggs K, Braddick O, Atkinson J, Wattam-Bell J (2000) Motion processing in autism: evidence for a dorsal stream deficiency. Neuroreport 11:2765–2767.

Sprengelmeyer R, Rausch M, Eysel UT, Przuntek H (1997) Neural structures associated with recognition of facial expressions of basic emotions. Proc R Soc Lond B 265:1927–1931.

Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain. New York: Thieme.

Thornton IM, Kourtzi Z (2002) A matching advantage for dynamic human faces. Perception 31:113–132.

Tootell RBH, Reppas JB, Kwong KK, Malach R, Born RT, Brady TJ et al. (1995) Functional analysis of human MT and related visual cortical areas using magnetic resonance imaging. J Neurosci 15:3215–3230.

Voyvodic JT (1999) Real-time fMRI paradigm control, physiology, and behavior combined with near real-time statistical analysis. Neuroimage 10:91–106.

Vuilleumier P, Armony JL, Driver J, Dolan RJ (2001) Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron 30:829–841.

Whalen PJ (1998) Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala. Curr Direct Psychol Sci 7:177–188.

Whalen PJ, Rauch SL, Etcoff NL, McInerney SC, Lee MB, Jenike MA (1998) Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J Neurosci 18:411–418.

Whalen PJ, Shin LM, McInerney SC, Fischer H, Wright CI, Rauch SL (2001) A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion 1:70–83.

Wicker B, Michel F, Henaff MA, Decety J (1998) Brain regions involved in the perception of gaze: a PET study. Neuroimage 8:221–227.

Young AW, Aggleton JP, Hellawell DJ, Johnson M, Broks P, Hanley JR (1995) Face processing impairments after amygdalotomy. Brain 118:15–24.

Zeki SM, Watson JD, Lueck CJ, Friston KJ, Kennard C, Frackowiak RS (1991) A direct demonstration of functional specialization in human visual cortex. J Neurosci 11:641–649.