1 Brain Research Unit, Low Temperature Laboratory, Helsinki University of Technology, FIN-02015 HUT, Espoo, Finland and 2 Department of Clinical Neurophysiology, Helsinki University Central Hospital, FIN-00290 Helsinki, Finland
Abstract

Introduction
In their fMRI study, Neville et al. compared cortical activation between native signers, deaf and hearing, who were viewing sign language sentences and hearing non-signers reading English sentences (Neville et al., 1998). Similar language-related areas in the left hemisphere responded to both signed and written language, but only sign language activated the homologous areas in the right hemisphere of deaf and hearing signers. Hickok and co-workers argued that the observed right-hemisphere activity might not be related to linguistic processing of signs but might instead be an artifact of comparing sign language with written language because, for example, prosody, emotional facial expressions and meaningful non-linguistic gestures are present in signed but absent in written languages (Hickok et al., 1998b). Neville and collaborators also found stronger right-hemisphere activation for sign language than for non-sign gestures in native signers, whereas hearing non-signers showed no consistent differences in cortical activation between the two types of stimuli (Neville et al., 1998). Based on these findings, Corina et al. claimed that facial information or non-linguistic gestures common to both types of stimuli were unlikely to explain the right-hemisphere activation by sign language in native signers (Corina et al., 1998). Here it is important to note that in the Neville study (Neville et al., 1998), only the non-signs were linguistically meaningless to native signers, whereas both signs and non-signs were equally meaningless to non-signers. However, emotional facial expressions and prosody in the context of meaningful and meaningless signs are likely to be processed differently and could thus very well also involve different levels of right-hemisphere activity. For example, a recent lesion study (Adolphs et al., 2000) reported impaired recognition, naming and categorizing of facial emotional expressions following lesions of right inferior parietal, right superior temporal and bilateral inferior frontal regions.
In the present study we examined cortical activation patterns in congenitally deaf signers and in hearing non-signers while they were passively viewing sign language. Because the signs were linguistically meaningful to deaf signers but linguistically meaningless to hearing non-signers, and because signs are known to activate similar language-related areas in deaf and hearing native signers (Bavelier et al., 1998; Neville et al., 1998), we assumed the critical difference between the two groups to be in sign language experience and comprehension. Thus, comparison between the two groups can be considered to reflect differences in cortical activation between observation of sign language and observation of motor actions forming non-sign gestures. The activated areas were evaluated without subtractions between different tasks or the two subject groups, which also allowed examination of areas similarly activated by meaningful and meaningless signs.
Previous positron emission tomography (PET) (Grafton et al., 1996; Rizzolatti et al., 1996b) and magnetoencephalographic (MEG) (Hari et al., 1998; Nishitani and Hari, 2000) studies during observation of hand actions have shown significant activation in the human left inferior frontal lobe (IFL, including Broca's region), left superior temporal sulcus (STS, including Wernicke's region) and primary motor cortices. The monkey F5 area, the homologue of the human IFL cortex, contains mirror neurons that discharge both when the monkey grasps or manipulates objects and when it observes another monkey or the experimenter performing similar actions (Gallese et al., 1996; Rizzolatti et al., 1996a). These mirror neurons are assumed to form a system matching execution and observation of motor actions. Rizzolatti and Arbib (Rizzolatti and Arbib, 1998) suggested that the mirror neuron system provides the necessary communication bridge from doing to understanding, also providing the evolutionary gestural basis for human language development.
Based on the previous imaging data on viewing of sign language and hand actions, we assumed that the bilateral superior temporal and inferior frontal regions would be activated by sign observation in both deaf signers and hearing non-signers. However, the strength of activity within these areas could differ between the two groups due to differences in the linguistic meaningfulness of the signs to the observer.
Materials and Methods
We recorded MEG signals from seven congenitally, profoundly deaf subjects (six aged 20–42 years, one aged 74 years; four males, three females) and seven healthy, normally hearing control subjects (ages 23–34 years; all males). A comparison between four male hearing non-signers and four male deaf signers revealed results similar to the main comparisons, although the significance levels were lower due to the smaller number of subjects compared. There were no significant differences in the pattern of activation between the deaf male and female subjects; neither did the activations of the 74-year-old deaf subject appear different from those of the younger deaf signers. All subjects were right-handed. The deaf subjects were fluent users of Finnish Sign Language (FSL), which was their first and most prominent language, whereas the hearing control subjects had no previous experience with FSL. When asked specifically after the measurement, the hearing controls were unable to understand, or even guess, the meaning of the observed signs, all of which were non-iconic gestures without involvement of lip movements of the corresponding Finnish words.
Data Acquisition
During the experiment, the subjects passively viewed videotaped individual signs (2–2.5 s in duration) of FSL, presented once every 7 s. During the pause, the first frame of the following sign was displayed continuously to decrease contamination from V1/V2 visual cortex activation due to stimulus appearance. All subjects were instructed to watch carefully the movements of the person appearing on the video. Although most of the signs were bimanual, they involved the dominant hand of the native right-handed signer performing on the video more strongly than the non-dominant hand. The stimuli were presented in the center of a screen located in front of the subject (visual angle 10°), and the subject was able to view both the hands and the face of the signer simultaneously without considerable eye movements. Magnetic signals were recorded with a helmet-shaped 122-channel Neuromag-122™ magnetometer (Ahonen et al., 1993) while the subject was sitting in a magnetically shielded room with the lights turned off. The location of the head with respect to the sensors was determined by measuring the magnetic fields produced by small currents delivered to three coils attached to the scalp. The locations of the coils with respect to the preauricular points and the nasion were measured with a 3-D digitizer. The recording passband was 0.03–90 Hz and the sampling rate 300 Hz. The vertical electro-oculogram was recorded simultaneously, and epochs contaminated by eye movements or blinks were rejected. The responses were averaged off-line, time-locked to the movement onsets in the stimuli. A minimum of 55 responses was averaged from 1.3 s before to 6.2 s after the movement onsets.
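To make the averaging procedure concrete, the following is a minimal sketch in plain NumPy of epoch extraction with EOG-based artifact rejection and time-locked averaging, as described above. The synthetic data, array shapes and the peak-to-peak rejection threshold are illustrative assumptions, not values from the study.

```python
import numpy as np

FS = 300                      # sampling rate (Hz), as in the recording
PRE, POST = 1.3, 6.2          # epoch window around movement onset (s)
EOG_REJECT_V = 150e-6         # hypothetical EOG peak-to-peak rejection threshold (V)

def average_epochs(meg, eog, onsets_s):
    """Average MEG epochs time-locked to movement onsets, rejecting
    epochs contaminated by eye movements or blinks (large EOG swings)."""
    n_pre, n_post = int(PRE * FS), int(POST * FS)
    kept = []
    for t in onsets_s:
        i = int(t * FS)
        if i - n_pre < 0 or i + n_post > meg.shape[1]:
            continue                              # epoch outside the recording
        if np.ptp(eog[i - n_pre:i + n_post]) > EOG_REJECT_V:
            continue                              # blink / eye-movement artifact
        kept.append(meg[:, i - n_pre:i + n_post])
    return np.mean(kept, axis=0), len(kept)       # evoked response, epoch count

# Tiny synthetic demo: 122 channels, 60 s of noise, one onset every 7 s
rng = np.random.default_rng(0)
meg = rng.normal(0.0, 1e-13, (122, 60 * FS))      # simulated MEG noise
eog = rng.normal(0.0, 10e-6, 60 * FS)             # simulated vertical EOG trace
evoked, n_kept = average_epochs(meg, eog, np.arange(2.0, 58.0, 7.0))
print(evoked.shape, n_kept)                       # e.g. (122, 2250) and 8
```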
Source Modeling
Due to the duration and temporal variability of the videotaped signs, the averaged evoked responses contained mainly low-frequency components. To increase the signal-to-noise ratio, the averaged signals of each subject were digitally low-pass filtered at 10 Hz. Mean amplitude levels from 1 to 0.5 s before and from 5.7 to 6.2 s after the movement onsets of the stimuli were used to remove the constant level and linear trends. Thereafter the data were analyzed with a Minimum Current Estimate (MCE) program based on minimum L1-norm estimates (Matsuura and Okabe, 1995). MCE yields the current distribution in which the total sum of the current is as small as possible while still explaining most of the measured signals. The method is able to resolve several local or distributed MEG current sources without explicit a priori information about the number of active areas (Uutela et al., 1999), and its results are in good agreement with those obtained by multidipole modeling (Nishitani et al., 1999; Uutela et al., 1999). To identify the activated cortical sites, the current distributions were superimposed on individual magnetic resonance images (MRIs).
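The L1-norm principle behind MCE can be illustrated with a toy example: among all current distributions that reproduce the measured field, prefer the one with the smallest summed absolute current, which favors a few focal sources. One standard way to pose this numerically is as a Lasso problem; the sketch below uses scikit-learn on a random fixed-orientation lead field. The lead-field matrix, source grid and regularization weight are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_sensors, n_sources = 122, 500
L = rng.normal(size=(n_sensors, n_sources))    # toy lead-field matrix

# Simulate two focal sources plus sensor noise
j_true = np.zeros(n_sources)
j_true[[40, 310]] = [2.0, -1.5]
b = L @ j_true + rng.normal(0, 0.05, n_sensors)

# Lasso objective: ||b - L j||^2 / (2 * n_sensors) + alpha * ||j||_1,
# i.e. fit the measured field while keeping the total current small
mce = Lasso(alpha=0.05, fit_intercept=False, max_iter=10000).fit(L, b)
j_hat = mce.coef_

print("active sources:", np.flatnonzero(np.abs(j_hat) > 0.1))
```

Because the L1 penalty drives most source amplitudes exactly to zero, the estimate stays sparse and the two simulated sources stand out, mirroring MCE's ability to resolve several sources without specifying their number in advance.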
Statistical Analysis
Activities, studied in regions of interest (ROIs), were calculated as weighted averages; the weighting function was a generalized normal distribution with its peak at the center of the ROI and a full-width at half-maximum of 35 mm. The mean locations of the ROIs for signers and non-signers were within 10 mm of each other, and the ROIs used for group comparisons were centered on these mean locations. The current directions within the ROIs were chosen to explain most of the current directions of both groups. The total activity, averaged across subjects, was compared between signers and non-signers 0–1, 1–2, 2–3 and 3–4 s after the movement onsets in the stimuli using the Mann–Whitney test. No significant differences between the groups in response onset or peak latencies were detected, possibly due to the considerable jitter (~200 ms) in the movement onsets in the signs.
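A minimal sketch of this ROI analysis follows: activity is a weighted average of current amplitudes, with weights falling off as a generalized normal distribution around the ROI center (FWHM 35 mm), and group values are compared with the Mann–Whitney test. The shape parameter beta, the ROI center and all data below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import mannwhitneyu

FWHM_MM, BETA = 35.0, 4.0                         # BETA is an assumed shape parameter
ALPHA = (FWHM_MM / 2) / np.log(2) ** (1 / BETA)   # scale such that w(FWHM/2) = 1/2

def roi_activity(currents, locs_mm, center_mm):
    """Weighted average of current amplitudes around an ROI center,
    using generalized-normal weights w(d) = exp(-(d / ALPHA)**BETA)."""
    d = np.linalg.norm(locs_mm - center_mm, axis=1)
    w = np.exp(-(d / ALPHA) ** BETA)
    return w @ currents / w.sum()

# Toy group comparison in one 1 s window: 7 signers vs 7 non-signers
rng = np.random.default_rng(2)
locs = rng.uniform(-70, 70, (500, 3))             # source locations (mm)
roi_center = np.array([45.0, -30.0, 10.0])        # hypothetical right STS ROI
signers = [roi_activity(np.abs(rng.normal(1.2, 0.3, 500)), locs, roi_center)
           for _ in range(7)]
nonsigners = [roi_activity(np.abs(rng.normal(1.0, 0.3, 500)), locs, roi_center)
              for _ in range(7)]
stat, p = mannwhitneyu(signers, nonsigners, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```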
Results
Figure 2 illustrates examples of bilaterally activated cortical sites in the individual deaf signers and hearing non-signers who showed the clearest activation within these ROIs. Table 1 summarizes, in Talairach coordinates, the mean source locations of the main areas activated in hearing non-signers. MRIs were available from only one deaf signer, but since the mean locations of the ROIs for deaf signers and hearing non-signers were within 10 mm of each other, we assumed them to reflect activity within the same cortical regions in both subject groups. The activation of the IFL agrees with the location of Brodmann's area (BA) 44 (McGuire et al., 1997; Iacoboni et al., 1999), and the posterior STS activation with the location of BA 22 (Grafton et al., 1996; Fiez and Petersen, 1998). The temporo-occipital activation agrees with the location of the visual motion-specific V5 complex (Decety et al., 1994; Rees et al., 1997), and the activation of the superior parietal lobule (SPL) with BA 5 (Iacoboni et al., 1999). Activation close to the primary hand/arm motor area (M1) agrees with the location of BA 4 (Grafton et al., 1993; Kawashima et al., 1994), and the more anterior left dorsal premotor cortex (PMd) activation with BA 6 (Parsons et al., 1995). Activation in the region of the parieto-occipital sulcus (POS) corresponds to the location of the V6 complex (Portin and Hari, 1999), and activation of the mesial cortex of the paracentral lobule (PCL), just anterior to the central sulcus, to mesial BA 4/6.
Figure 3 shows differences in the mean activation patterns between the two groups during sign observation. The left PMd, activated in all deaf signers and in five of the seven hearing non-signers, and the right posterior STS region were activated significantly more strongly in deaf signers than in hearing non-signers (P < 0.05; time intervals 0–1 s for PMd and 1–2 s for STS). In contrast, activation of the right SPL region was stronger (P < 0.05, 0–1 s) in hearing non-signers than in deaf signers. Hearing non-signers also showed stronger activation of the POS region, which was activated in two of the seven deaf signers and in all hearing non-signers (P < 0.01, 0–1 s). The PCL was activated in one of the seven deaf signers and in six of the seven hearing non-signers (P < 0.005, 2–3 s).
Discussion
Automatic activation of the classical left-hemisphere language areas of the IFL and STS regions by the observed signs is in line with previous findings showing a contribution of these areas both to sign language processing (Corina et al., 1992; Neville et al., 1992; Hickok et al., 1996; McGuire et al., 1997; Neville et al., 1998) and to action observation in humans (Grafton et al., 1996; Rizzolatti et al., 1996b; Nishitani and Hari, 2000). It thus seems, as suggested earlier by Rizzolatti and Arbib (Rizzolatti and Arbib, 1998), that language processing and action recognition are indeed closely related. It is of course possible that our hearing non-signers were trying to guess the meaning of the observed signs and thus covertly activated their language system, even though they were not told that the movements consisted of real, meaningful signs. However, activation of the left-hemisphere IFL and STS regions has been shown to occur during observation of simple, non-linguistic motor actions as well (Grafton et al., 1996; Rizzolatti et al., 1996b; Nishitani and Hari, 2000). On the other hand, meaningful gestures activate the left IFL and STS regions more than meaningless gestures in both native signers (Neville et al., 1998) and hearing non-signers (Decety et al., 1997), suggesting that linguistic meaningfulness of the observed motor actions is one of the key factors for this left-hemisphere activation. Surprisingly, there were no significant differences in the left-hemisphere IFL and STS activations between our deaf signers and hearing non-signers, although the linguistic meaningfulness of the signs certainly differed for the two groups. Unfortunately, we are not able to differentiate whether the activations in our hearing non-signers were associated with possible guessing at the meaning of the motor actions or with pure action viewing.
Corresponding right-hemisphere IFL and STS activations have previously been reported when native deaf and hearing signers process sign language (Bavelier et al., 1998; Neville et al., 1998). Our results, showing clear activation of these right-hemisphere areas also in hearing non-signers, suggest a contribution of these areas to action observation as well. On the other hand, the activity within the right posterior STS region was significantly stronger in our deaf signers than in hearing non-signers, in accordance with previous fMRI findings (Neville et al., 1998) showing greater recruitment of this region during comprehension of meaningful signs than of nonsense gestures in both hearing and deaf signers. These results suggest that the recruitment of the posterior STS region in deaf and hearing signers during sign observation goes beyond the processing demands of action observation. This conclusion is in line with a previous PET study on non-signing subjects who were observing hand actions (Decety et al., 1997): meaningful actions, consisting of pantomimes, strongly engaged the left IFL and left temporal regions, while meaningless actions, signs derived from American Sign Language, activated predominantly the right occipitoparietal pathway. Thus at least partly distinct systems in the right hemisphere are related to the linguistic processing of signs and to the visuospatial processing of biologically relevant motion. However, the exact nature of the right-hemisphere STS activation in deaf and hearing signers is still unclear. For example, it is possible that emotional facial expressions and prosody were processed differently by our two groups, since the context was meaningful to the deaf signers and meaningless to the hearing non-signers; such differences might result in different degrees of right-hemisphere STS activation. This suggestion is in line with a recent lesion study (Adolphs et al., 2000) showing impaired naming of facial emotional expressions following right-hemisphere superior temporal lesions.
Unlike the present results, other studies on action viewing have demonstrated a left-hemisphere dominance in the activation of the IFL and STS regions in humans (Grafton et al., 1996; Rizzolatti et al., 1996b; Nishitani and Hari, 2000). The signs used in the present study were mostly bimanual and more complex than reaching for, grasping and manipulating an object with one hand, the movements typically used in action viewing tasks (Grafton et al., 1996; Rizzolatti et al., 1996b; Nishitani and Hari, 2000). It should also be noted that no objects were used in the present study, and the observed signs could not be recognized or named by the hearing non-signers. These aspects could at least in part explain the symmetry in the activation of the IFL–STS network in our hearing non-signers.
Bilateral activation of the primary hand/arm motor regions during sign observation in both groups of subjects is in line with earlier results on action observation (Hari et al., 1998; Nishitani and Hari, 2000). However, activation within a more anterior PMd region in the left hemisphere was predominant in our deaf signers, possibly reflecting the influence of sign language experience and comprehension. The superior area 6 in the monkey brain, a homologue of the human PMd cortex, is involved in motor preparation, by retrieving from memory the response appropriate to the context, and in execution of the selected movement (Matelli and Luppino, 1997). According to Fadiga and co-workers, visually triggered discharges in the monkey area F4 (or area 6) reflect potential actions directed to particular spatial locations; visual stimuli may automatically evoke one of the potential actions stored in F4 as a sort of motor vocabulary (Fadiga et al., 2000). In humans, lesions in the PMd cortex impair arm movements that require temporal coordination of proximal muscles and the association of hand movements with particular sensory cues (Matelli and Luppino, 1997). Activation of the left-hemisphere PMd area in our deaf signers could reflect automatic retrieval of learned motor sequences from memory. This suggestion is in line with previous PET results suggesting a positive correlation between PMd activation and the degree of improvement in skill during complex motor task training (Kawashima et al., 1998).
Automatic activation of the dorsal visual pathway in both groups of subjects was an expected finding. The V5 complex has been shown to be activated during observation of right-hand reaching movements (Nishitani and Hari, 2000), as well as by movements of a virtual right hand perceived as if it were the subject's own hand (Decety et al., 1994).
Although the anterior SPL region was activated bilaterally in both groups, the right-hemisphere activity was stronger in hearing non-signers; this difference may, however, in part reflect the stronger overall activity in hearing non-signers than in deaf signers. The monkey SPL has been suggested to participate in the visuospatial encoding and visuomotor transformation of hand/arm reaching movements, and it has been shown to have direct neural connections to frontal motor and premotor regions (Kalaska et al., 1990; Caminiti et al., 1996; Johnson et al., 1996). In humans, the right anterior SPL region was recently suggested to encode kinesthetic aspects of observed finger movements and to be activated during imitation of the same movements (Iacoboni et al., 1999).
The medial surface of the POS, the area of the V6 complex, is also involved in the visual guidance of reaching movements (Galletti et al., 1997). The V6 complex receives direct projections from a number of visual areas, including the V5 cortex, and displays a relative emphasis on the visual periphery (Colby et al., 1988; Portin and Hari, 1999). Activation of the mesial paracentral lobule has been observed during attention to somatosensory stimulation (Forss et al., 1996) and during unilateral complex finger movements (Roland et al., 1982), but the functional significance of this activation remains unsettled at present. The involvement of the dorsal visual pathway was in general weaker in our deaf signers, possibly because they immediately understood the observed signs without needing to pay sustained attention to them in the way the hearing non-signers had to. The deaf signers probably processed the signs as symbols, not simply as movements.
Our results demonstrate that the cortical representations of sign language and action observation are largely overlapping, in agreement with the hypothesis that language may have developed from oro-facial and brachio-manual gestures (Rizzolatti and Arbib, 1998). However, our findings also indicate that the neural networks recruited by passive observation of sign language depend in part on the familiarity and linguistic meaningfulness of the actions perceived.
Notes
Address correspondence to Sari Levänen, Harvard Medical School, MGH-NMR Center, Bldg 149, 13th Street, Charlestown, MA 02129-2060, USA. Email: sari@nmr.mgh.harvard.edu.
References
Ahonen A, Hämäläinen M, Kajola M, Knuutila J, Laine P, Lounasmaa OV, Parkkonen LP, Tesche C (1993) 122-channel SQUID instrument for investigating the magnetic signals from the human brain. Physica Scripta T49:198–205.
Bavelier D, Corina D, Jezzard P, Clark V, Karni A, Lalwani A, Rauschecker J, Braun A, Turner R, Neville HJ (1998) Hemispheric specialization for English and ASL: left invariance–right variability. NeuroReport 9:1537–1542.
Caminiti R, Ferraina S, Johnson PB (1996) The sources of visual information to the primate frontal lobe: a novel role for the superior parietal lobule. Cereb Cortex 6:319–328.
Colby CL, Gattass R, Olson CR, Gross CG (1988) Topographical organization of cortical afferents to extrastriate visual area PO in the macaque: a dual tracer study. J Comp Neurol 269:329–413.
Corina DP, Vaid J, Bellugi U (1992) The linguistic basis of left hemisphere specialization. Science 255:1258–1260.
Corina DP, Neville HJ, Bavelier D (1998) Response from Corina, Neville and Bavelier. Trends Cogn Sci 2:468–470.
Damasio A, Bellugi U, Damasio H, Poizner H, Van Gilder J (1986) Sign language aphasia during left-hemisphere Amytal injection. Nature 322:363–365.
Decety J, Perani D, Jeannerod M, Bettinardi V, Tadary B, Woods R, Mazziotta JC, Fazio F (1994) Mapping motor representations with positron emission tomography. Nature 371:600–602.
Decety J, Grézes J, Costes N, Perani D, Jeannerod M, Procyk E, Grassi F, Fazio F (1997) Brain activity during observation of actions. Influence of action content and subject's strategy. Brain 120:1763–1777.
Fadiga L, Fogassi L, Gallese V, Rizzolatti G (2000) Visuomotor neurons: ambiguity of the discharge or motor perception? Int J Psychophysiol 35:165–177.
Fiez JA, Petersen SE (1998) Neuroimaging studies of word reading. Proc Natl Acad Sci USA 95:914–921.
Forss N, Merlet I, Vanni S, Hämäläinen M, Mauguière F, Hari R (1996) Activation of human mesial cortex during somatosensory target detection task. Brain Res 734:229–235.
Gallese V, Fadiga L, Fogassi L, Rizzolatti G (1996) Action recognition in the premotor cortex. Brain 119:593–609.
Galletti C, Fattori P, Kutz DF, Battaglini PP (1997) Arm movement-related neurons in the visual area V6A of the macaque superior parietal lobule. Eur J Neurosci 9:410–413.
Grafton ST, Woods RP, Mazziotta JC (1993) Within-arm somatotopy in human motor areas determined by positron emission tomography imaging of cerebral blood flow. Exp Brain Res 95:172–176.
Grafton ST, Arbib MA, Fadiga L, Rizzolatti G (1996) Localization of grasp representations in humans by positron emission tomography. 2. Observation compared with imagination. Exp Brain Res 112:103–111.
Hari R, Forss N, Avikainen S, Kirveskari E, Salenius S, Rizzolatti G (1998) Activation of human primary motor cortex during action observation: a neuromagnetic study. Proc Natl Acad Sci USA 95:15061–15065.
Hickok G, Bellugi U, Klima ES (1996) The neurobiology of sign language and its implications for the neural basis of language. Nature 381:699–702.
Hickok G, Bellugi U, Klima ES (1998a) The neural organization of language: evidence from sign language aphasia. Trends Cogn Sci 2:129–136.
Hickok G, Bellugi U, Klima ES (1998b) What's right about the neural organization of sign language? A perspective on recent neuroimaging results. Trends Cogn Sci 2:465–468.
Iacoboni M, Woods RP, Brass M, Bekkering H, Mazziotta JC, Rizzolatti G (1999) Cortical mechanisms of human imitation. Science 286:2526–2528.
Johnson PB, Ferraina S, Bianchi L, Caminiti R (1996) Cortical networks for visual reaching: physiological and anatomical organization of frontal and parietal lobe arm regions. Cereb Cortex 6:102–119.
Kalaska JF, Cohen DAD, Prud'homme M, Hyde ML (1990) Parietal area 5 neuronal activity encodes movement kinematics, not movement dynamics. Exp Brain Res 80:351–364.
Kawashima R, Roland PE, O'Sullivan BT (1994) Activity in the human primary motor cortex related to ipsilateral hand movements. Brain Res 663:251–256.
Kawashima R, Matsumura M, Sadato N, Naito E, Waki A, Nakamura S, Matsunami K, Fukuda H, Yonekura Y (1998) Regional cerebral blood flow changes in human brain related to ipsilateral and contralateral complex hand movements: a PET study. Eur J Neurosci 10:2254–2260.
Matelli M, Luppino G (1997) Functional anatomy of human motor cortical areas. In: Handbook of neuropsychology, vol. 11 (Boller F, Grafman J, eds), pp. 9–26. Amsterdam: Elsevier Science.
Matsuura K, Okabe U (1995) Selective minimum-norm solution of the biomagnetic inverse problem. IEEE Trans Biomed Eng 42:608–615.
McGuire PK, Robertson D, Thacker A, David AS, Kitson N, Frackowiak RSJ, Frith CD (1997) Neural correlates of thinking in sign language. NeuroReport 8:695–698.
Neville HJ, Mills DL, Lawson DS (1992) Fractionating language: different neural subsystems with different sensitive periods. Cereb Cortex 2:244–258.
Neville HJ, Bavelier D, Corina D, Rauschecker J, Karni A, Lalwani A, Braun A, Clark V, Jezzard P, Turner R (1998) Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proc Natl Acad Sci USA 95:922–929.
Nishitani N, Uutela K, Shibasaki H, Hari R (1999) Cortical visuomotor integration during eye pursuit and eye-finger pursuit. J Neurosci 19:2647–2657.
Nishitani N, Hari R (2000) Temporal dynamics of cortical representation for action. Proc Natl Acad Sci USA 97:913–918.
Parsons LM, Fox PT, Downs JH, Glass T, Hirsch TB, Martin CC, Jerabek PA, Lancaster JL (1995) Use of implicit motor imagery for visual shape discrimination as revealed by PET. Nature 375:54–58.
Paulesu E, Mehler J (1998a) Right on in sign language. Nature 392:233–234.
Paulesu E, Mehler J (1998b) Response from Paulesu and Mehler. Trends Cogn Sci 2:471.
Portin K, Hari R (1999) Human parieto-occipital visual cortex: lack of retinotopy and foveal magnification. Proc R Soc Lond 266:981–985.
Rees G, Frith CD, Lavie N (1997) Modulating irrelevant motion perception by varying attentional load in an unrelated task. Science 278:1616–1619.
Rizzolatti G, Arbib MA (1998) Language within our grasp. Trends Neurosci 21:188–194.
Rizzolatti G, Fadiga L, Gallese V, Fogassi L (1996a) Premotor cortex and the recognition of motor actions. Cogn Brain Res 3:131–141.
Rizzolatti G, Fadiga L, Matelli M, Bettinardi V, Paulesu E, Perani D, Fazio F (1996b) Localization of grasp representations in humans by PET. 1. Observation versus execution. Exp Brain Res 111:246–252.
Roland P, Meyer E, Shibasaki T, Yamamoto YL, Thompson CJ (1982) Regional cerebral blood flow changes in cortex and basal ganglia during voluntary movements in normal human volunteers. J Neurophysiol 48:467–480.
Uutela KH, Hämäläinen MS, Somersalo E (1999) Visualization of magnetoencephalographic data using minimum current estimates. NeuroImage 10:173–180.