A Comparison of Visual and Auditory Motion Processing in Human Cerebral Cortex

James W. Lewis, Michael S. Beauchamp1 and Edgar A. DeYoe

Department of Cell Biology, Neurobiology, and Anatomy, Medical College of Wisconsin, Milwaukee, WI 53226, USA and 1National Institutes of Health, Bethesda, MD, USA


    Abstract
 
Visual and auditory motion information can be used together to provide complementary information about the movement of objects. To investigate the neural substrates of such cross-modal integration, functional magnetic resonance imaging was used to assess brain activation while subjects performed separate visual and auditory motion discrimination tasks. Areas of unimodal activation included the primary and/or early sensory cortex for each modality plus additional sites extending toward parietal cortex. Areas conjointly activated by both tasks included lateral parietal cortex, lateral frontal cortex, anterior midline and anterior insular cortex. The parietal site encompassed distinct, but partially overlapping, zones of activation in or near the intraparietal sulcus (IPS). A subsequent task requiring an explicit cross-modal speed comparison revealed several foci of enhanced activity relative to the unimodal tasks. These included the IPS, anterior midline, and anterior insula but not frontal cortex. During the unimodal auditory motion task, portions of the dorsal visual motion system showed signals depressed below resting baseline. Thus, interactions between the two systems involved either enhancement or suppression depending on the stimuli present and the nature of the perceptual task. Together, these results identify human cortical regions involved in polysensory integration and the attentional selection of cross-modal motion information.


    Introduction
 
A common characteristic of both visual and auditory perception is the ability to determine the speed and direction of a moving object, such as an automobile passing on the street. The visual and auditory sensory information associated with the automobile presumably merges or becomes coordinated, thereby producing a unified percept of the movement of the object within the environment. Additionally, both systems may interact to coordinate and direct attention to one modality or the other, and to control subsequent action. However, it remains unclear how similar the auditory and visual motion systems might be, and more specifically how and where the two systems interact.

The cortical mechanisms responsible for visual motion perception have received much study in animals and, more recently, in humans. In monkeys, the cortical processing of visual motion is thought to involve a number of anatomically interconnected visual areas and their subdivisions, referred to here as the dorsal motion pathway. These include lamina 4B in V1, the thick cytochrome oxidase stripes in V2, areas V3, MT, MST and possibly the lateral and ventral intraparietal areas, LIP and VIP (Orban et al., 1986; DeYoe and Van Essen, 1988; Desimone and Ungerleider, 1989; Boussaoud et al., 1990). Information from the dorsal motion pathway is then thought to influence distinct portions of prefrontal cortex (Wilson et al., 1993; Rao et al., 1997), presumably for use in directing behavioral responses or contributing to other cognitive activity.

A similar picture is emerging from neuroimaging and lesion studies in humans. Areas V1 and V2 in humans are responsive to visual motion, but more selective responses can be obtained from extrastriate visual areas located laterally and dorsally in the occipital and parietal lobes. For instance, hMT, the likely homolog of the simian middle temporal visual area, MT, is strongly activated by visual motion stimuli and by tasks involving a visual motion discrimination (Corbetta et al., 1991; Zeki et al., 1991; Dupont et al., 1994; Orban et al., 1995; Tootell et al., 1995a,b; Beauchamp et al., 1997b). Additionally, the same stimuli and tasks concurrently activate areas in dorsal occipital cortex and in posterior parietal cortex. Bilateral lesions of lateral occipital cortex (including hMT) and/or posterior parietal cortex can selectively compromise visual motion perception, while leaving auditory and somatosensory motion perception intact (Zihl et al., 1983, 1991; Rizzo et al., 1995). Together, these areas may constitute a dorsal motion processing system that is analogous, if not homologous, to the comparable simian system (Felleman and Van Essen, 1991).

Compared to our detailed understanding of visual motion pathways, we know relatively little about pathways for auditory motion processing. Anatomical studies in monkeys suggest that there are two auditory streams (as in vision), one of which includes a system for auditory space analysis that originates in the caudal belt and parabelt region surrounding primary auditory cortex and projects to periarcuate cortex (Azuma and Suzuki, 1984; Romanski et al., 1999). Animal studies of static sound source localization (Knudsen and Konishi, 1978; Brugge and Reale, 1985; Phillips and Brugge, 1985; Suga, 1994) have shown that the location of a sound source can be signaled by interaural time and/or intensity differences (ITD and IID, respectively). Presumably, some cells can selectively respond to changes in IID and ITD over time, thereby representing sound source movement. Indeed, electrophysiological studies in cats and monkeys have shown that cells selective for auditory motion exist in primary auditory cortex as well as in some subcortical structures (Sovijärvi and Hyvärinen, 1974; Reale and Brugge, 1990; Ahissar et al., 1992; Stumpf et al., 1992; Takahashi and Keller, 1992; Toronchuk et al., 1992; Spitzer and Semple, 1993). However, a specific system of interconnected cortical areas for processing auditory motion per se has yet to be identified in primates.

Lesion studies have shown that the perception of apparent sound-source movement in humans can be selectively disrupted when right parietal and right insular cortex is compromised (Griffiths et al., 1997). Evidence from human neuroimaging and magnetoencephalography has shown activation of several cortical regions by the apparent movement of synthesized sounds, including the right superior temporal sulcus (STS), primary auditory and surrounding cortex (PAC+), right insula, right parietal cortex and right cingulate cortex (Griffiths et al., 1994, 1998; Mäkelä and McEvoy, 1996; Murray et al., 1998; Baumgart et al., 1999). Despite some inconsistencies across studies, a picture is emerging of several cortical regions that are activated during auditory motion processing and may function as a system for auditory motion analysis.

Where and how the visual and auditory motion systems interact is not well understood. Such interactions must occur if a task requires explicit comparison of information from both modalities. In such instances, information about the direction and speed of moving objects seems to be derived separately within each modality, and then compared after conversion to a common supramodal representation (Stein et al., 1993; Ward, 1994; Stein and Wallace, 1996; Driver and Spence, 1998; Snyder et al., 1998). Presumably, attention is allocated between and within modalities during such tasks to ensure that the appropriate task-relevant information is passed on to decision-making and behavioral-control systems. Where these various cross-modal interactions occur in humans is not known.

In monkeys, several cortical areas have been shown to contain cells that respond to both visual and auditory stimuli, including temporal cortex (Benevento et al., 1977; Desimone and Gross, 1979; Leinonen et al., 1980; Bruce et al., 1981; Hikosaka et al., 1988; Watanabe and Iwai, 1991), prefrontal and periarcuate cortex (Azuma and Suzuki, 1984; Tanila et al., 1992), orbitofrontal cortex (Benevento et al., 1977) and parietal cortex, including the lateral intraparietal area, LIP (Mazzoni, 1994; Linden et al., 1996; Andersen, 1997). Anatomical data also indicate that the ventral intraparietal area (VIP) receives direct input from both visual- and auditory-related cortex (Lewis and Van Essen, 2000). However, it is uncertain which of these simian areas have human homologs and which areas can specifically contribute to the cross-modal integration of motion information.

Recently, two human imaging studies reported cortical sites involved with audiovisual integration. Calvert et al. (Calvert et al., 1999a,b) identified a region in the right superior temporal sulcus that was more active during integration of aurally and visually presented language stimuli. Bushara et al. (Bushara et al., 1999) identified brain areas important for integrating spatial information across domains in the inferior parietal lobule, medial frontal cortex and the right inferior temporal cortex. Suppressive interactions between the auditory and visual systems have also been noted, though it is unclear whether such effects are task specific (Haxby et al., 1994; Shulman et al., 1997) or whether they reflect uncontrolled cognitive or attentional factors during the control periods (Shulman et al., 1997; Binder et al., 1999). The systems responsible for these and other cross-modal interactions have yet to be fully explored.

In the present study, we used functional magnetic resonance imaging (fMRI) to examine brain areas subserving visual and auditory motion processing. Brain activity was examined as subjects performed separate visual and auditory motion discrimination tasks. We also examined the pattern of activation when subjects attended to auditory motion, visual motion or combined audiovisual motion. Because the same subjects performed both unimodal and cross-modal tasks, we could distinguish truly convergent cross-modal domains from closely apposed, but unimodal, domains. The results indicate that visual and auditory motion processing tasks engage a number of common cortical regions and pathways that can interact in different ways depending on the stimuli presented and the nature of the auditory or visual task. Preliminary reports of these results have appeared previously (Lewis and DeYoe, 1998a,b).


    Materials and Methods
 
Subjects

Eleven healthy subjects (three female, eight male; ages 22–48 years) participated. Subjects had normal or corrected-to-normal visual acuity and reported having a normal range of hearing. Ten subjects were strongly right-handed and one was left-handed. Informed consent was obtained following guidelines approved by the MCW Human Research Review Committee.

Isolated Auditory Motion Paradigm

Subjects (n = 10) were presented with computer-generated auditory stimuli (SoundBlaster AWE 64 Gold, Creative Technology Ltd; and Cool Edit Pro, Syntrillium Software Co.) via electrostatic headphones (Koss Inc., Milwaukee, WI) that elicited the perception of a moving sound. Each stimulus consisted of a 300 Hz square wave of duration 500 ms with a 20 ms onset and offset ramp. Interaural intensity differences (IID) elicited the perception of sound moving through or behind the head from left to right, with the apparent velocity proportional to the rate of IID change. Both leftward and rightward motion were randomly presented at one of three apparent speeds: ~50°/s, ~35°/s or ~20°/s. The volume of the sound stimulus was adjusted for each individual (typically 75–80 dB SPL L-weighted), so that it could be heard over ambient scanner noise and through earplugs. The scanner beeps (primarily 4000 Hz at 120 dB and 2400 Hz at 110 dB) were perceived to be static and roughly positioned on the midline, so they did not interfere with the apparent left-to-right motion of the auditory stimulus. The beeps were continuously present throughout the scan and, consequently, did not generate any detectable cyclic fMRI activation.
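To make the stimulus construction concrete, the sketch below (Python with NumPy) shows one way such a burst could be synthesized; the 44.1 kHz sample rate and the linear panning law are our own assumptions and are not specified in the text.

```python
import numpy as np

FS = 44100  # sample rate in Hz; an assumption, not stated in the text

def iid_motion_burst(freq=300.0, dur=0.5, ramp=0.02, sweep=0.8):
    """Stereo 300 Hz square-wave burst (500 ms, 20 ms on/off ramps) whose
    interaural intensity difference sweeps linearly across the burst,
    giving apparent left-to-right motion; a deeper sweep over the fixed
    duration stands in for a faster apparent speed."""
    t = np.arange(int(dur * FS)) / FS
    carrier = np.sign(np.sin(2 * np.pi * freq * t))      # square wave
    env = np.ones_like(t)                                # onset/offset ramps
    n_ramp = int(ramp * FS)
    env[:n_ramp] = np.linspace(0.0, 1.0, n_ramp)
    env[-n_ramp:] = np.linspace(1.0, 0.0, n_ramp)
    pan = np.linspace(0.0, 1.0, t.size)                  # 0 = start, 1 = end of burst
    left = carrier * env * (1.0 - sweep * pan)           # left ear fades out...
    right = carrier * env * (1.0 - sweep * (1.0 - pan))  # ...while right ear grows
    return np.stack([left, right], axis=1)               # shape (n_samples, 2)

burst = iid_motion_burst()  # one 500 ms stereo burst
```

The actual stimuli were generated with the commercial tools listed above; this sketch is only meant to illustrate how an IID sweep maps onto apparent speed.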

As illustrated in Figure 1A, each 224 s fMRI scan consisted of an equilibration period (4 s), a baseline period (20 s), and five complete cycles of speed discrimination trials (200 s total). Each cycle consisted of a block of 13 motion stimuli alternating with a control block of only ambient scanner noise. Three to six repetitions of the experimental sequence described above were averaged to increase signal-to-noise. The initial 20 s baseline period of MR signal was recorded while subjects visually fixated, providing a reference for distinguishing relative increases in the BOLD signal (‘activation’) versus decreases (‘suppression’). Throughout the auditory motion task, the visual display consisted of a stationary white cross, centered on a gray background. Subjects maintained fixation on the center of the cross. Since the cross was stationary and continuously present at a fixed location, it did not generate any detectable cyclic activation.



Figure 1. Schematic illustration of the auditory and visual motion paradigms. (A) Left depicts the time line of the auditory motion paradigm (224 s total), with a 20 s pre-task baseline period, and 20 s ON (task) and OFF (control) periods. Middle depicts the sound intensity delivered to each ear to produce the sensation of sound motion based on interaural intensity differences. Steeper slopes correspond to faster perceived motion. Right inset shows the visual fixation target viewed throughout the entire scan. (B) Left shows the timeline for the isolated visual motion paradigm. Right illustrates a snapshot of the visual display. Dotted lines indicate the bipartite annulus of coherent motion. Refer to Materials and Methods for details.

 
During discrimination trials, subjects performed a 1-back speed-comparison task in which each successive stimulus was judged as faster or slower than the preceding stimulus. Subjects made a two-alternative forced choice and pressed one of two buttons to indicate their decision. During control trials, subjects were instructed to make button presses randomly at approximately the same rate as during the experimental trials. To minimize possible effects of learning during the scan (Petersen et al., 1998), subjects received at least one training session on or before the day of fMRI scanning (attaining >75% accuracy).

Isolated Visual Motion Paradigm

To activate visual motion processing areas, we used a dynamic random dot stimulus that had been used successfully in the past to study human motion processing and visual attention (Beauchamp et al., 1997a). Subjects (n = 9) fixated a central square while viewing a bipartite annulus (10–20° eccentricity) defined by coherent motion embedded in a background of randomly moving dots, as illustrated in Figure 1B. The subject's task was to indicate by button press which half of the annulus contained faster moving points. During each 204 s fMRI scan, experimental discrimination trials were presented every 2 s in blocks of 10, alternating with blocks of 10 control trials for five complete cycles. During the control trials (‘OFF’ periods), only randomly moving background points were presented and subjects responded randomly at roughly the same rate as during experimental periods. This visual motion paradigm was run in isolation with only the ambient scanner noise present.

The isolated auditory and visual motion paradigms were typically presented during the same experimental session in order to match test conditions and subject alertness across tasks, thereby minimizing inter-session variability and image registration inaccuracies.

Eye Movement Tracking

For three subjects, the auditory motion task was performed outside the scanner while their eye movements were recorded using an infrared eye tracking system (ISCAN Inc., Cambridge, MA). Subjects viewed an identical stimulus display presented on a video screen positioned so that the stimulus covered the same portion of the visual field as in the scanner. Head position was secured with a bite bar.

Imaging Methods

Imaging and data analysis methods have been described in detail previously (DeYoe et al., 1994). Briefly, fMRI was used to record changes in blood flow and oxygenation evoked by brain activity while subjects engaged in the experimental tasks described above. A General Electric (Milwaukee, WI) Signa 1.5 T MRI scanner equipped with a commercial head coil (Medical Advances Instruments) was used to acquire 102 or 112 axial, gradient-recalled (TE = 40 ms, TR = 2 s) echo-planar images of the brain with 3.75 mm × 3.75 mm in-plane resolution and 12 axial slices of 8 mm thickness. T1-weighted anatomical MR images were also collected during each scan session, using a spoiled GRASS pulse sequence (1.0–1.1 mm slices, with 0.9375 mm × 0.9375 mm in-plane resolution).

Data Analysis

Data were viewed and analyzed using the AFNI software package (Cox, 1996) (see also http://www.biophysics.mcw.edu) and custom software. The first image in the fMRI series provided a low-resolution anatomical picture that was used for image registration. Repetitions of the experimental scans were averaged, yielding an averaged time series. Voxels exhibiting a statistically significant cyclic response that was time locked to the stimulus presentation were identified by cross-correlation of each voxel's MR time series with a reference sinusoid approximating the neuronal-hemodynamic response to the stimulus (Bandettini et al., 1993). The sine reference had a 40 s period, corresponding to the timing of the experimental/control cycle. The phase of the reference waveform was allowed to vary to obtain the maximum correlation for each voxel. Correlation values exceeding a statistical significance of P < 1 × 10^–6 indicated valid responses (unless specified otherwise in the text), yielding an overall Bonferroni corrected significance of P < 0.001 for the entire volume.
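For illustration, the following sketch (Python with NumPy) shows the kind of phase-varying correlation fit described above for a single voxel. The phase grid resolution is our own choice, and the sketch assumes an already averaged, detrended time series; it does not reproduce the AFNI implementation.

```python
import numpy as np

TR = 2.0       # s per image (from the text)
PERIOD = 40.0  # s per task/control cycle (from the text)

def best_phase_fit(ts, n_phases=40):
    """Correlate one voxel's (averaged) time series with a 40 s reference
    sinusoid, letting the reference phase vary to maximize the correlation.
    Returns (max correlation, best phase in s, best-fit amplitude); the
    amplitude corresponds to the response magnitude described below."""
    ts = np.asarray(ts, dtype=float)
    ts = ts - ts.mean()
    t = np.arange(ts.size) * TR
    best = (-1.0, 0.0, 0.0)
    for phase in np.linspace(0.0, PERIOD, n_phases, endpoint=False):
        ref = np.sin(2 * np.pi * (t - phase) / PERIOD)
        r = float(np.corrcoef(ts, ref)[0, 1])
        if r > best[0]:
            amp = float(np.dot(ts, ref) / np.dot(ref, ref))  # least-squares scale factor
            best = (r, phase, amp)
    return best
```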

Response magnitude was calculated as the amplitude of the best-fit reference waveform. Activation maps showing the response amplitude for significantly responding voxels were resampled and interpolated to 1 mm^3 resolution and overlaid on the high-resolution anatomical MR images. Trials with artifacts caused by subject motion were discarded.

Averaged functional brain maps were created to identify areas of common activation. Each subject's anatomical brain map, together with their functional maps, was transformed into Talairach space (Talairach and Tournoux, 1988) using the AFNI software package. Merged data sets were then created by combining amplitude and correlation values for each interpolated voxel across all subjects. Individual functional data (correlation and intensity) were low-pass filtered before averaging, using a box filter with a width of 4 mm, to reduce the effects of local anatomical variability across subjects. The average amplitude value for each active voxel was then computed as the arithmetic mean amplitude across subjects. An average statistical measure was calculated by using the Fisher variance-normalizing transform to convert each cross-correlation coefficient to an approximately normal distribution, averaging across subjects, and then applying the inverse transformation. To identify statistically significant activation in the merged data, voxels that exceeded correlation thresholds in each run of each individual were analyzed as a binomial distribution, from which a significance (P-value) was derived. Locations of significant blood flow increases (activation) and decreases (anti-correlation or suppression) were identified anatomically with reference to each subject's sulcal pattern (Ono et al., 1990).
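As a minimal sketch of the Fisher-transform averaging step, assuming one correlation coefficient per subject for a given voxel (the clipping bound is our own numerical safeguard, not mentioned in the text):

```python
import numpy as np

def group_average_r(r_by_subject):
    """Average per-subject correlation coefficients for one voxel using the
    Fisher variance-normalizing (r-to-z) transform, then back-transform,
    as described above for the merged group maps."""
    r = np.clip(np.asarray(r_by_subject, dtype=float), -0.999999, 0.999999)
    z = np.arctanh(r)              # Fisher r-to-z
    return float(np.tanh(z.mean()))  # mean z converted back to an r

# e.g. group_average_r([0.62, 0.55, 0.71]) returns roughly 0.63
```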

Computerized Talairach Atlas Reconstructions and Data Projection

A three-dimensional model of the cortical surface of the Talairach brain (Fig. 3) was produced by tracing the gray/white matter boundary on coronal sections from the atlas (Talairach and Tournoux, 1988). Intermediate sections were interpolated using custom software, and neighboring sections were aligned, converted to a three-dimensional mesh using the Nuages software package (Geiger, 1993), and then smoothed using the software package CARET (Drury et al., 1996) (see also http://v1.wustl.edu). The three-dimensional gray/white boundary surface mesh was then converted into a topographically correct and minimally distorted flat map using the software package FLATTEN (Drury et al., 1996; Van Essen et al., 1998). The mean areal distortion of the flat map was 8.6% and the absolute areal distortion was 37%. Because cortical flat maps necessarily contain some distortions, linear distances on the map are denoted as ‘map-cm’. They correspond to actual three-dimensional distances on the cortical surface only where there is no distortion on the map. The mean curvature of the surface was calculated and used to mark sulcal boundaries. A gray-level representation of curvature was generated by interpolating between adjacent nodes (points that define the contour outlines). Three-dimensional models were created by expanding the gray/white boundary surface model by half the distance of the average gray matter thickness in order to approximate contours representing layer 4.



Figure 3. Three-dimensional models and flat map representation of the Talairach brain (refer to Materials and Methods) for the right hemisphere, showing the group-averaged fMRI activity data from Figure 2. (A,B) Pattern of activation (red) and suppression (dark green) resulting from the visual motion paradigm. (C,D) Pattern of activation (yellow) and suppression (blue) resulting from the auditory motion paradigm. (E) Flat map representation in which regions of fMRI activity overlap are indicated by intermediate colors (see color inset). Visual-related suppression (dark green) was omitted for clarity. The left hemisphere activity pattern was similar, except for a wider separation of visual and auditory activation foci in the STS, and the presence of visual and auditory suppression near the central sulcus. cc, corpus callosum; CaS, calcarine sulcus; CgS, cingulate sulcus; CoS, collateral sulcus; ILS, inferior limiting sulcus of the LaS; LaS, lateral sulcus; pITS, inferior temporal sulcus (posterior); POS, parietal occipital sulcus; SLS, superior limiting sulcus of the LaS. Other label conventions as in Figure 2.

 
The group-averaged fMRI activation patterns were mapped to the Talairach brain model on a voxel-by-voxel basis using a nearest-neighbor algorithm. For each significantly active voxel, the intensity of MR response was mapped to the nearest node on the model surface and all immediately neighboring nodes to approximate a region of activation the same size as the original voxels. When more than one voxel mapped to a given node, the resulting intensity was calculated to be the mean of the values from the different voxels. For display purposes, nodes were colored as described in the text.
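A toy version of this nearest-neighbor projection is sketched below (Python with NumPy and SciPy). It assumes that voxel centers and surface nodes are already expressed in the same Talairach coordinates, and it omits the spreading of values to immediately neighboring nodes described above, which would require the surface mesh connectivity.

```python
import numpy as np
from scipy.spatial import cKDTree

def map_voxels_to_nodes(voxel_xyz, voxel_intensity, node_xyz):
    """Nearest-neighbor projection of significant voxels onto surface nodes,
    averaging intensities when several voxels map to the same node."""
    tree = cKDTree(node_xyz)                    # node coordinates, shape (n_nodes, 3)
    _, nearest = tree.query(voxel_xyz)          # index of the nearest node per voxel
    sums = np.zeros(len(node_xyz))
    counts = np.zeros(len(node_xyz))
    np.add.at(sums, nearest, voxel_intensity)   # accumulate intensities per node
    np.add.at(counts, nearest, 1)
    values = np.full(len(node_xyz), np.nan)     # NaN marks nodes receiving no data
    hit = counts > 0
    values[hit] = sums[hit] / counts[hit]       # mean when several voxels share a node
    return values
```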


    Results
 
Isolated Auditory Motion Task

Figure 2A illustrates the group-averaged pattern of activation and suppression obtained with the isolated auditory motion task. Subjects responded to differences in the speed of target sounds during experimental periods, while during control periods they fixated and made sham responses (to control for response production). Consequently, the activation map reflects all factors involved in the discrimination task, including motion analysis, attention and response selection. Table 1 identifies the center-of-mass locations and relative cluster sizes for several significant sites of cortical activation.



Figure 2. FMRI responses from (A) the isolated auditory motion paradigm and (B) the isolated visual motion paradigm averaged across seven subjects. Yellow to red hues code intensity of response activation, and blue hues indicate decreases in response (Bonferroni corrected P < 0.001 for both maps). Anatomical underlay from one subject. Axial sections (panels) are labeled by their Z-coordinate (mm) in Talairach coordinate space. CeS, central sulcus; PAC+, primary auditory cortex plus immediately surrounding auditory regions; hMT+, human MT complex; IPL, inferior parietal lobule; IPS, intraparietal sulcus; LOS, lateral occipital sulcus; Post-CeS, postcentral sulcus; Pre-CeS, precentral sulcus; SFGm, superior frontal gyrus (medial); STS, superior temporal sulcus.

 

Table 1. Center-of-mass coordinates of several regions of cortical activation from the isolated auditory- and visual-motion discrimination tasks, reported in stereotaxic space (Talairach and Tournoux, 1988).
 
As expected, two of the strongest clusters of activation in Figure 2A (orange and yellow hues) were found along the lateral sulci (Z = 10–18), overlapping primary auditory cortex and immediately surrounding cortex (PAC+), with the right hemisphere focus located ~5 mm anterior to the left focus (Penhune et al., 1996). Furthermore, in the majority of subjects the most significant activity elicited by our 300 Hz stimuli was found within lateral and anterior portions of Heschl's gyrus (evident at higher threshold settings), consistent with the tonotopy of PAC (Talavage et al., 1997; Wessinger et al., 1997). The PAC+ focus extended into the superior temporal sulcus (STS) of the right hemisphere (Z = 10), revealing a cluster that appeared as a fused pair of foci at higher threshold settings. These foci were moderate to light in overall activation and were located along the middle portion of the STS (centers-of-mass: 62, –26, 5 and 54, –37, 13), consistent with the location of the putative sound motion processing area proposed by Murray et al. (Murray et al., 1998). Strong clusters of activation were evident bilaterally in an elongated swath of lateral frontal cortex, which included cortex along and anterior to the precentral sulcus (Z = 18–58). Strong activation was also present along the anterior midline, encompassing medial portions of the superior frontal gyrus (SFGm) and portions of the anterior cingulate gyrus (Z = 34–58). Strong to moderate activation was present in anterolateral portions of parietal cortex of both hemispheres (Z = 34–50), typically located within and around the intraparietal sulcus (IPS) of each individual subject. Moderate activation was also present in the anterior insulae (Z = 2–10).

In contrast to the areas of activation, some portions of the cortex showed decreased blood flow or ‘suppression’ relative to resting baseline activity (blue hues). One region of particular interest was a bilateral swath of multiple foci extending from lateral occipital to posterior parietal cortex (Z = 18–42). This swath overlapped portions of the dorsal motion system activated during the unimodal visual motion task (described below). Another notable region of suppression included motor cortex along the left central sulcus (Z = 42–58), which presumably reflected differences in response production (button presses) between the control and experimental periods.

Isolated Visual Motion Task

The same subjects also performed a visual discrimination task designed to generate robust activation of the visual motion pathways, using the paradigm of Beauchamp et al. (Beauchamp et al., 1997a). This task was performed in the absence of any auditory stimuli other than the ongoing scanner noise. The resulting average pattern of activation is summarized in Figure 2B and Table 1, and was similar to that described by Beauchamp et al. (Beauchamp et al., 1997a). Activation was primarily restricted to posterior cortex, including a prominent bilateral swath of activation extending from the lateral occipital sulcus (LOS) to posterior parietal cortex, and into posterior portions of the postcentral sulcus (Z = 2–58). This included the human middle temporal area (hMT/V5) and immediately surrounding cortex (together designated as hMT+; Z = 2–10). The focus labeled hMT+ most likely includes several extrastriate visual areas whose identities have not yet been established conclusively (e.g. human homologs of MST and FST). This swath of activation extended dorsally, including the human motion-related area hV3A (Tootell et al., 1997; Sunaert et al., 1999). Moderate to light activation also extended from hMT+ ventrally into the fusiform and lingual gyri (see Fig. 3B). In anterior portions of cortex, strong activation was present along lateral frontal cortex, including the precentral sulcus plus cortex extending further anterior (Z = 26–58), and light to moderate activation was also present along the anterior midline (SFGm; Z = 42–50).

As with the auditory motion paradigm, the visual motion paradigm revealed regions with decreased fMRI signal during the task, including the left central sulcus (Z = 50–58) and regions along the midline (Z = 34–42). Suppression around primary auditory cortex was present only at a very low significance threshold (P < 0.1).

Convergence/Interaction of Visual and Auditory Motion Pathways

Cortical areas that were activated by both the isolated visual and isolated auditory motion tasks represent sites of potential cross-modal interaction or shared polymodal functionality. Additionally, areas that were activated by one modality but were suppressed by the other could also represent sites of cross-modal interaction. To identify such sites, the data from both isolated motion tasks in Figure 2 were displayed together on an unfolded representation (‘flat map’) of the cortex, as shown for the right hemisphere in Figure 3.

Co-activation

In Figure 3, areas active during the isolated visual motion task were uniformly colored red, while areas active in the auditory task were uniformly colored yellow. Regions of co-activation were identified by the intermediate color orange, and included lateral parietal cortex (up and left of center on the flat map), lateral frontal cortex (right and center), the anterior midline cortex (CgS and medial SFG; upper right), and a small portion of the anterior insula (significant visual-related activation in 4/8 subjects).

Relative to the activation resulting from the visual motion paradigm (Fig. 3E, red), activation from the auditory motion task (yellow) was stronger and more extensive along the anterior midline and anterior insula in both hemispheres. The anterior midline activation included portions of anterior cingulate cortex as well as the superior frontal gyrus along the medial wall. This site was located rostral to the anterior commissure, thereby suggesting that it did not include supplementary motor cortex (Fink et al., 1997; Hazeltine et al., 1997). In lateral frontal cortex the activation from both tasks was largely coextensive, but responsiveness in ventral regions tended to be stronger for the auditory task (also see Fig. 2A,B: Z = 18–58). In lateral parietal cortex, co-activation was characterized by partial overlap, with visual-related activation extending further medial and posterior and auditory-related activation extending further lateral. The group-averaged maps revealed a significant zone of activation overlap in parietal cortex centered at Talairach coordinates 35, –46, 47 in the right hemisphere, and –41, –40, 47 in the left hemisphere.

Within individuals, the regions of parietal overlap typically fell within the IPS. As illustrated in Figure 4A, in individual subjects clusters of auditory-related activation (yellow) partially overlapped visual-related activation (red), as indicated by the orange voxels. Typically, the auditory-related activation was located lateral and anterior to the visual-task activation, though this was not always the case (e.g. middle panel, right hemisphere). The visual-task activation within parietal cortex tended to be more extensive and diffuse than the auditory-task activation.



Figure 4. Individual cases illustrating overlap of fMRI changes during auditory and visual motion paradigms. (A) Parietal cortex showing partial overlap (orange) of auditory (yellow) and visual (red) activation. (B) Dorsal occipital cortex showing suppression during the auditory motion task (blue), activation during the visual motion task (red) and regions of overlap (magenta). Red (visual-related) and blue (auditory-related) averaged waveforms were derived from the magenta voxels only (18–20 voxels in a three-dimensional ROI approximated by white circles). Baseline for each time series (approximated by green lines) was determined from signal during the pre-stimulus period for the auditory paradigm. The 20 s ‘ON-periods’ are indicated by thicker green line segments. Images were transformed into Talairach space. Acceptance thresholds were matched for both tasks in each subject (at least P < 0.01).

 
We were concerned that the degree of overlap in parietal cortex might reflect differences in the spatial extent of the visual targets (central 20°) versus the auditory targets (nearly 180°). However, tests in which the auditory stimuli moved within 20° of the midline did not produce a significantly different activation pattern.

Suppression versus Activation

Also illustrated in Figure 3 is an extensive zone that was activated during the visual-motion paradigm (red in Fig. 3A,B,E) but was suppressed during the auditory motion paradigm (blue in Fig. 3C–E). This overlap was indicated by the intermediate color magenta in Figure 3E, and included irregular patches throughout the inferior parietal lobule (IPL), dorsal occipital cortex, and cortex overlapping hMT+ (cf. Figs 3E and 2A,B: Z = 18–42). Figure 4B illustrates examples from two individuals of signals from such overlapping patches (magenta voxels) near the lateral occipital sulcus. During ‘ON’ periods, the decrease in MR signal below baseline produced by the auditory motion task (blue waveform) had an amplitude similar to the increase in MR signal produced by the visual task (red waveform).

Cross-modal Speed Discrimination Task

To further characterize the regions of interaction identified by the isolated visual and auditory motion tasks, four subjects performed another experiment involving an explicit cross-modal speed comparison. The auditory stimuli were very similar to those used in the isolated auditory task. To facilitate a cross-modal speed comparison, the visual stimulus was modified to consist of a 3° square patch of sinusoidal grating (95% contrast) that moved, as a whole, randomly left- or rightward at one of three possible speeds, approximating those of the auditory stimulus (~50°/s, ~35°/s or ~20°/s). During experimental trials, both visual and auditory stimuli were simultaneously presented (every 2 s) while subjects performed one of three tasks. In the first task, subjects performed the 1-back speed comparison of auditory targets while ignoring the visual targets. In the second task they performed a 1-back speed comparison of the visual targets while ignoring the auditory targets. In the third task, subjects explicitly compared the speeds of the simultaneously presented visual and auditory targets, judging if the visual target was moving faster or slower than the auditory target. (A cross-modal 1-back speed comparison was found to be too difficult.)

For all three task conditions, the pattern of activation and suppression was roughly similar to that shown in Figure 2, but with reduced activation in medial occipital visual cortex due to the smaller, more restricted visual target. Additionally, unlike the attend-vision condition, the attend-auditory condition produced activation in the midthalamus (not shown), consistent with an earlier study of attention to auditory versus visual stimuli (Frith and Friston, 1996). Moreover, the cross-modal speed comparison produced enhanced activation within restricted portions of cortex that had been co-activated during the isolated motion tasks, most notably including lateral parietal cortex.

Figure 5 illustrates the enhancement effect for two regions of interest (ROIs) within the IPS from two subjects. As indicated by the averaged MR waveforms (orange lines), the response was greater during the cross-modal speed comparison than for either unimodal task. The effect is illustrated quantitatively by the underlying bar graphs. Enhancement was observed in the left IPS for all four subjects, and in the left anterior insula and anterior midline in three subjects. In two individuals, moderate enhancement was present in the STS and cerebellum. Little or no enhancement was observed within the large region of co-activation along lateral frontal cortex.



Figure 5. Enhancement of response in parietal cortex during cross-modal versus unimodal speed comparisons (two subjects, A,B). Top row: pattern of activation near the IPS for the attend-auditory condition (Auditory), cross-modal comparison (X-modal), and attend-visual comparison (Visual). The Talairach coordinates of the focus of enhancement were 41, –56, 54 for case A and –27, –60, 41 for case B. Note that the MR intensity color scale (red to yellow) is different from those in Figures 3 and 4. Middle row: 200 s time series (orange waveforms), averaged across an 18 voxel three-dimensional ROI (approximated by white circles). Green line shows ON/OFF cycles. Bottom row: signal amplitude within the ROI, expressed as normalized change (×100%).

 
Additional Tests

In the course of analyzing the results of the previous tests, two important issues arose that motivated additional tests. One concerned the functional specificity of the pattern of activation and suppression observed during the auditory motion paradigm. The other concerned the effects of eye movements and visual inputs on the pattern of suppression during auditory motion processing.

Auditory Motion Discrimination versus Pitch Discrimination

The activation resulting from the isolated auditory motion paradigm reflected both motion-specific and non-specific factors that differed between experimental and control conditions. To identify cortical areas that might be uniquely involved in auditory motion analysis, we compared responses to the auditory motion task with a comparably designed pitch discrimination task.

Subjects (three male, one female) made a 1-back pitch comparison for tones presented on successive discrimination trials. They indicated by button press whether the current tone was higher or lower in pitch than the previous one. Tone frequencies were randomly selected from one of five possibilities: 288, 294, 300, 306 or 312 Hz, and each tone was presented for 500 ms with the amplitude ramped over 20 ms at onset and offset to minimize transients. Sixteen tone beeps were presented per experimental block, thereby making the task roughly as difficult as the auditory motion task. Control trials consisted of ambient scanner noise with random button presses at roughly the same rate as during experimental trials.

Figure 6 shows the resulting pattern of activation. Overall, the group-averaged pattern was similar to that produced by the auditory motion task (cf. Figs 2A and 6), including regions identified as co-activated during the auditory and visual motion paradigms (precentral, anterior midline, parietal and anterior insula). The pitch task also produced suppression in portions of the dorsal visual motion system. However, there were fewer responsive voxels for the pitch task and there were subtle differences in the exact location and extent of some activation foci. For example, the pitch discrimination task resulted in a more medial focus of activity in parietal cortex near the post-CeS (Z = 50). Additionally, activation in the right STS (Z = 10) appeared more prominent during the auditory motion discrimination task than during the pitch discrimination task, consistent with its proposed involvement in auditory motion processing (Murray et al., 1998; Baumgart et al., 1999).



Figure 6. Activation (white) and suppression (black) produced by the pitch discrimination task, averaged across four subjects who also performed the motion tasks. Threshold was P < 0.01. Other conventions as in Figure 2.

 
Origin of Suppression in Lateral Occipital Cortex

One possible explanation for the suppression of visual motion pathways during the auditory discrimination tasks might be unintentional visual stimulation due to residual eye movements and the accompanying displacements of the retinal image. To test this, we monitored eye movements in three subjects while they performed the auditory motion task outside the scanner and found no evidence for any systematic change in horizontal or vertical eye movements between experimental and control blocks (minimum detectable movement ~0.5°). However, this would not rule out effects due to microsaccades and drifts during the actual scan session.

To identify fMRI activation directly related to eye movements, four subjects performed the isolated auditory motion task under three different conditions. In the first condition, the eyes were open and fixated, as in the original auditory motion task. In the second condition, the subjects' eyes were closed and held as motionless as possible. In the third condition, subjects explicitly pursued the apparent auditory motion with their eyes open while viewing the stationary, visual fixation target.

The results are illustrated in Figure 7 for an ROI including hMT+ and dorsal occipital cortex (see blue regions in posterior cortex in Fig. 2A; Z = 18 and Z = 34). The degree of suppression in this ROI was least in the eyes-fixated condition, slightly greater in the eyes-closed condition and greatest in the eyes-moving condition. The presence of suppression in dorsal-occipital cortex during both the eyes-open (and fixated) and eyes-closed conditions shows that it was not related to retinal stimulation associated with incidental eye movements. However, when subjects intentionally produced large eye movements, increased suppression was observed, consistent with an earlier study involving large saccades (Paus et al., 1995). Unlike the eyes-open and fixated condition, the eyes-moving condition was accompanied by strong concurrent activation of primary visual cortex, probably due to the visual image sliding across the retina. Together, these results suggest that the suppression may be related to the production of eye movements, but not to the specific presence or absence of incidental retinal image motion.



Figure 7. Suppression in dorso-lateral occipital cortex (Z = 10–18 and Z = 34) during the isolated auditory motion task with eyes (A) open and fixated, (B) closed and stationary, or (C) open and tracking the moving auditory targets. Data averaged from four subjects (subset from Fig. 2).

 

    Discussion
 
Figure 8 shows a schematic diagram of the cortical systems activated by the visual and auditory motion tasks used in this study. Prominent activation foci from each task are represented by ovals (light gray for visual, white for auditory). Areas that were co-activated by both tasks are indicated by overlapping symbols (dark gray). Regions that exhibited response enhancement during the explicit cross-modal speed comparison are represented by small black ovals. Visual areas that showed suppression during the auditory-motion task are drawn with partially dashed outlines to indicate that only portions of these regions were involved. The organizational scheme of interconnecting lines reflects inferences made from the monkey, and to a lesser extent human, anatomical literature (see legend).



Figure 8. Schematic diagram summarizing cortical areas engaged by the visual, auditory and cross-modal motion tasks used in this study. Areas activated by the auditory-only task are shown in white. Light gray indicates areas activated by the visual-only task. Co-activated systems are shown as overlapping ellipses (dark gray), where the relative sizes of the ellipses indicate either comparable or unequal volumes of activation. Dashed outlines indicate areas that were suppressed during the auditory-only task. Black ovals show sites where enhancement was observed during the cross-modal speed comparison. Thin connecting lines reflect known anatomical connections for simian cortex. A brief list of references includes: (1–3) Van Buren and Borke, 1972; Morel et al., 1993; Pandya, 1995. (4,11) Romanski et al., 1999. (5) Lewis, 1997. (6) Kennedy and Bullier, 1985; Yeterian and Pandya, 1989. (7) Ungerleider and Desimone, 1986. (8) Ungerleider and Desimone, 1986; Boussaoud et al., 1990. (9) Maunsell and Van Essen, 1983. (10,12,17) Seltzer and Pandya, 1991. (13,14) Barbas, 1988. (15,16) Cavada and Goldman-Rakic, 1989; Stanton et al., 1977. (18) Luppino et al., 1993.

 
This study yielded several important findings: (i) Each unimodal motion task resulted in the unique activation of cortical areas extending from the respective primary sensory area (V1, PAC) to parietal cortex. (ii) Co-activation by both visual- and auditory-motion tasks was observed in portions of lateral parietal cortex, lateral frontal cortex (including the precentral sulcus), the anterior midline (SFGm and anterior cingulate) and anterior insula. (iii) When a motion stimulus was present in only one modality, there was specific suppression within the non-attended modality, though the effect was strongest for suppression of visual cortex during the auditory-motion task. The simultaneous presence of a salient (but unattended) visual-motion stimulus during the auditory task was sufficient to counteract the suppression. (iv) During explicit cross-modal speed comparisons, enhancement above the combined fMRI signal levels of the unimodal tasks was observed predominantly in the IPS (left > right). Less consistent enhancement was found in the anterior midline and anterior insula.

Modality-specific Motion Systems

Auditory Motion

The overall pattern of activation we observed with the isolated auditory-motion discrimination task (Fig. 2A) was similar to patterns obtained previously with other attentionally demanding auditory tasks (Pugh et al., 1996; Binder et al., 1997; O'Leary et al., 1997; Tzourio et al., 1997). These results are also consistent with studies identifying the STS, insula and parietal cortex in the right hemisphere as sites specifically involved in the analysis of auditory motion (Griffiths et al., 1994, 1998; Murray et al., 1998). However, unlike these previous auditory motion studies, we found bilateral activation rather than strong lateralization.

The network activated by the auditory motion task was similar to the network activated by our pitch discrimination task (cf. Fig. 6). Some areas, especially the STS and portions of antero-lateral parietal cortex, were more active in the motion task than in the pitch-discrimination task. This indicates that most of the cortical network activated by the two tasks is not exclusively concerned with the processing of motion information, perhaps consistent with the animal literature (Ahissar et al., 1992). However, we cannot rule out the possibility that within this system there might be functionally specialized subdivisions beyond the resolution of our imaging technique.

Visual Motion (and Suppression Effects)

Results obtained with the isolated visual motion task are concordant with a variety of previous studies that have identified a system of cortical areas responsive to visual motion stimuli and motion-related tasks (Corbetta et al., 1991; Zeki et al., 1991, 1993; McCarthy et al., 1995; Tootell et al., 1995b, 1997; Beauchamp et al., 1997a; Culham et al., 1998; Sunaert et al., 1999). The evidence suggests the existence of a dorsal visual motion processing system that is responsible for the perception of visual movement (Zihl et al., 1983; Desimone and Ungerleider, 1986, 1989; Newsome and Pare, 1988). This system includes a swath of cortex extending from the hMT complex through hV3A and further dorsally into posterior parietal cortex. This expanse of cortex undoubtedly includes a number of different areas, some of which are likely to be homologous with macaque visual areas, such as MST and FST (Desimone and Ungerleider, 1986), parietal areas LIP and VIP (Colby et al., 1993, 1996) and possibly polysensory areas such as STPp (Bruce et al., 1981).

Although the dorsal areas activated during the visual motion task (hMT+, V3A, posterior parietal) appeared to be primarily unimodal, we observed fMRI suppression below the resting baseline within these regions during the isolated auditory motion task. One possible explanation of this effect is that it reflects suppression of the task-irrelevant modality during the auditory discrimination (Haxby et al., 1994; Kawashima et al., 1995; Shulman et al., 1997). However, this suppression must be subtle since it was overridden by the simultaneous presence of a salient visual stimulus during the cross-modal task. An alternative explanation was suggested by our additional tests of the suppression under different viewing conditions (eyes closed, fixated or tracking). These tests suggest that the suppression was not a simple artifact due to poor visual fixation or image slip, but, rather, may have been related to aspects of eye fixation control and/or suppression of visual tracking of the auditory stimulus.

Other sites of suppression (e.g. the posterior midline, evident during both tasks) may be of a different origin, reflecting inadvertent attentional effects during the ‘rest’ or ‘OFF’ periods, when subjects may engage in uncontrolled cognitive activity (monitoring for novel inputs, day-dreaming, etc.) that is disengaged during the more attention-demanding sensory tasks (Shulman et al., 1997; Binder et al., 1999). Although plausible, the specific functional role(s) of suppression in these midline sites remains uncertain.

Comparison of the Two Motion Systems

It would be theoretically satisfying if both the visual and auditory motion systems contained comparable dorsally directed pathways extending from primary sensory areas into parietal cortex, as suggested by Figure 8. Although the auditory motion task did produce activation extending from Heschl's gyrus (primary auditory cortex) toward parietal cortex, it is not clear if this constitutes a distinct interconnected pathway comparable to the dorsal visual motion system in monkeys. Since there is little information concerning the anatomical connectivity of these regions in humans, we must rely on animal data to provide an organizational schema for the connectivity. Consequently, the interconnecting lines shown in Figure 8 have been added to reflect inferences based mostly on the animal literature. In particular, the primate data suggest that there are direct connections from modality-specific systems, such as PAC+ and hMT+, to the co-activated systems such as parietal and lateral frontal cortex (see legend).

Connectivity aside, the degree to which the human auditory and visual motion systems are functionally equivalent remains unsettled. Certainly, it is difficult to establish a functional equivalency on an area-by-area basis. For instance, it is not yet clear whether the auditory system has a functional equivalent of area MT (Griffiths et al., 1994), which plays a key role in the processing of visual motion. In particular, the functional characteristics that uniquely identify area MT, such as large receptive field size and responsiveness to complex second- and third-order motion (Adelson and Bergen, 1985; Chubb and Sperling, 1988; Chubb et al., 1994), may not have a functional equivalent in the auditory modality. Consequently, there is currently insufficient information to either establish or refute the functional equivalency of specific stages within each modality.

Although establishing an area-by-area comparison between the two modalities is premature at this juncture, some common organizational principles are beginning to emerge. One important parallel seems to be the multifunctionality of the systems activated by visual and auditory motion tasks. For instance, the pathways activated by the auditory motion task closely resembled those activated by a pitch-discrimination task. Similarly, visual pathways that are specialized for the processing of motion information do not respond exclusively to visual movement. In fact, many cells in macaque visual area MT respond well to stationary stimuli as long as they are temporally dynamic (Mikami et al., 1986; Newsome et al., 1986). This may also be true for the auditory ‘motion’ system and could explain the activation observed in our pitch discrimination task using short, temporally dynamic ‘beeps’. Griffiths et al. (Griffiths et al., 1998) explored the relative role of temporal dynamics versus motion by using fMRI to compare responses to static versus moving sounds that were equated for dynamic modulation. As in the present study, they obtained activation throughout a wide network but found that responses to moving stimuli were strongest in parietal cortex and the insula (as well as prefrontal cortex and cerebellum). Together, these data suggest that, in both modalities, motion processing may involve a subset of regions within a more generalized system.

Another important similarity between the two modalities is that motion processing is not limited exclusively to a single ‘dorsal’ pathway. In addition to the traditional dorsal visual motion system, ventral visual areas such as V4 can respond strongly to visual motion stimuli (Ferrera et al., 1994). In humans, a region in lateral occipital cortex (region KO) that is distinct from and posterior to hMT has been shown to be sensitive to boundaries defined by kinetic motion rather than luminance (Van Oostende et al., 1997). These results suggest that visual motion information also supports other functions such as figure–ground segmentation and the perception of motion-defined figures (Allman et al., 1985; DeYoe and Van Essen, 1988; Dupont et al., 1994; Van Oostende et al., 1997). Similarly, some of the auditory-related sites that we observed may represent an alternate route for using motion information to segment and identify particular auditory sources (objects) from the ongoing flow of auditory input (Gaschler-Markefski et al., 1998).

Polymodal Systems

By examining both visual and auditory motion systems within the same individuals, we were able to conclusively identify cortical areas that were co-activated during the two motion-discrimination tasks (see Fig. 8). Additionally, we looked for sites that were activated only during the explicit cross-modal speed comparison. However, there were no uniquely ‘polymodal’ sites responding exclusively during the cross-modal task. This is consistent with earlier cross-modal studies (Ettlinger and Wilson, 1990; Hadjikhani and Roland, 1998). However, we did find enhanced activation within three of the four major co-activation regions identified in the isolated motion tasks, including the IPS, anterior midline and anterior insula but not lateral frontal cortex.

The polymodal effects that we observed could have reflected both specific and non-specific task factors. Specific aspects of the task included attentional tracking of the target as well as selection and computation of the relevant motion parameter (target speed), comparison of the speeds between targets, and selection of the appropriate response. Non-specific functions common to the different tasks included suppression of unwanted eye movements plus storage and retrieval of information from working memory.

Task-specific Polymodal Integration

To perform the cross-modal speed discrimination, the relevant motion information had to be extracted and stored for each target. Then, each of the unimodal target speeds had to be compared to determine which one moved faster. Lateral parietal cortex appears to be a likely site for such computations since it is a probable locus of anatomical convergence for the modality-specific motion pathways. Careful comparison of the patterns of activation in individual subjects showed that the unimodal activation occupied partially overlapping yet distinct portions of posterior and lateral parietal cortex in and around the IPS. Within this zone of overlap, responses were enhanced during the cross-modal speed comparison, thereby suggesting an important role in modality integration. In monkeys, individual neurons in the lateral intraparietal area, LIP, have been shown to respond selectively to the locations of both visual and auditory targets, suggesting that they might support a supramodal representation of space (Mazzoni, 1994; Linden et al., 1996; Stricanne et al., 1996). Additionally, the simian ventral intraparietal area, VIP, is known to receive direct projections from motion-related visual areas MT, MST and surrounding polymodal cortex, as well as from auditory-related cortex (Maunsell and Van Essen, 1983; Lewis and Van Essen, 2000). Recent electrophysiological experiments have further shown that neurons in VIP and LIP can represent visuospatial information in a frame of reference that is non-retinotopic (e.g. head- or world-centered) (Duhamel et al., 1998; Snyder et al., 1998). If such systems are equally capable of representing motion information, then the parietal site identified by our cross-modal task may be the first cortical locus at which cross-modal speed comparisons can occur. However, in monkeys, parietal cortex projects heavily to periarcuate cortex of the frontal lobes (Pandya and Kuypers, 1969; Godschalk et al., 1984; Cavada and Goldman-Rakic, 1989; Lewis and Van Essen, 2000). Like parietal cortex, this region utilizes different types of sensory information (including vision and audition) for the localization of objects or events in external space (Vaadia et al., 1986; Graziano et al., 1999). Parietal and frontal cortex in humans is also heavily interconnected, at least functionally (see below); thus, it is possible that either parietal or frontal cortex, or perhaps both, may mediate the cross-modal comparison of motion information.

In our tasks, once the target speeds were compared, a response button had to be selected based on previously stored instructions and then pressed. Earlier studies have implicated fronto-parietal systems in such sensory-to-motor mappings of visual and auditory information (Andersen, 1995; Kalaska and Crammond, 1995; Wise et al., 1996, 1997; Deiber et al., 1997; Iacoboni et al., 1998). Thus, at least some of the activation observed in the present study within the fronto-parietal network may be associated with the response-selection aspect of our tasks.

A Supramodal Attention Network

The regions of co-activation that we observed in the precentral sulcus and parietal cortex (and possibly anterior midline cortex) also appear to be part of a network that is important for the control of attention (Driver and Spence, 1998; Mesulam, 1998). Indeed, numerous studies have reported activity in all or portions of this network, to varying degrees and extents, when attention is directed to vision (Posner et al., 1987; Posner and Petersen, 1990; Corbetta et al., 1993; Haxby et al., 1994; Deiber et al., 1997; Corbetta, 1998; Culham et al., 1998; LaBar et al., 1999), to audition (Pugh et al., 1996; Binder et al., 1997), to cross-modal stimuli (O'Leary et al., 1997; Bushara et al., 1999), or to the expected location of a sensory stimulus (Kastner et al., 1999). Thus, in the present experiments, this network may have acted to direct attention to targets within the same or different sensory modalities, as required by each type of motion task. Moreover, recent theories of visual motion processing have implicated attention directly in the tracking and velocity estimation of moving targets (Blaser et al., 1999). This suggests that the control of attention and the motion computations themselves may be intimately intertwined and mediated by common, or partially overlapping, mechanisms within the fronto-parietal system.

Co-activation, or even suppression, involving anterior midline (anterior cingulate and/or pre-SMA) and retrosplenial structures may also reflect the involvement of a supramodal attentional network (Mesulam, 1998). Midline structures are reported to be involved in the high-level processing of complex stimuli (Posner et al., 1988; Pardo et al., 1990, 1991) and in the motivational/affective aspects of difficult tasks (Barch et al., 1997). The enhanced activity observed in anterior midline cortex during our cross-modal audio-visual comparison may therefore reflect particularly demanding aspects of the task, such as cross-modal attentional allocation or error detection and compensation (Corbetta et al., 1993; Barch et al., 1997).

Foveal Fixation System

In our tasks, subjects were required to maintain visual fixation throughout the fMRI scans. Although they could readily comply with this requirement, the motion-discrimination tasks placed additional demands on the systems responsible for suppressing both saccades and overt visual tracking of the moving targets. Earlier studies suggested that specific oculomotor systems, which overlap portions of the co-activated cortex in our experiments, may mediate the inhibition of reflexive eye movements (Sheliga et al., 1995; Law et al., 1997; Petit et al., 1999). In fact, the activity that we observed in the dorsal precentral sulcus overlapped the ‘frontal eye fields’ (FEF) as defined in fMRI studies of saccadic eye movements (Luna et al., 1998; Petit et al., 1999). However, cortex mediating overt saccadic eye movements may also mediate covert shifts of visual attention (Corbetta et al., 1998). This raises the possibility that auditory spatial attention could be closely associated with oculomotor control systems traditionally thought to be under visual control.

Working Memory

An additional consideration regarding lateral frontal and anterior cingulate activation in this study was the involvement of working memory. In the tone discrimination task and the unimodal, 1-back speed comparison, subjects were required to use working memory to recall the speed of the immediately preceding target and then respond. Concordant with this notion, we observed activation in the lateral precentral and superior frontal sulci overlapping cortex reported to be involved in spatial working memory (Jonides et al., 1993; McCarthy et al., 1994; Courtney et al., 1998; LaBar et al., 1999). Similarly, activity along the medial wall (pre-SMA and anterior cingulate cortex) overlapped cortex reported to be active during working memory delays, especially with regard to maintaining a state of preparedness for selecting a motor response (Petit et al., 1998). Thus, aspects of working memory in our tasks may account for a portion of the activation observed in the anterior midline as well as lateral frontal cortex.

Other Polymodal Systems

Based on previous reports, we had expected to observe polymodal co-activation in the anterior insula and in, or near, the STS. In an earlier study, Griffiths et al. (Griffiths et al., 1994) implicated the right anterior insula in auditory motion processing. We too observed activation of this region (bilaterally) during our auditory motion task, but also during our pitch and visual motion tasks, suggesting a non-specific functional role for this area. Similarly, we had expected to find polymodal activation in the STS since, in monkeys, this region is known to contain cells responsive to multiple modalities (Bruce et al., 1981; Hikosaka et al., 1988). However, the STS responses we observed were typically weak and scattered, and did not approach the robustness of responses observed at other sites. Calvert et al. (Calvert et al., 1999b) observed speech-related audiovisual co-activation in the STS. Thus, polymodal activation of the STS may depend on stimulus or task factors not present in our paradigms. (Our uncertainty concerning the possible polysensory role of the STS is indicated in Fig. 8 by the ‘?’ between the ovals representing auditory and visual activation.)

Conclusions

Overall, the results of this study indicate that the integration and comparison of motion information between the visual and auditory modalities involve a specific network of both unimodal and polymodal cortical areas. Parietal cortex, and perhaps lateral frontal cortex, appears to be optimally situated to mediate the integration and attentional selection of motion information across modalities. However, interactions between the two modalities can involve both enhancing and suppressive effects, depending on the nature of the stimuli and the task being performed by the subject.


    Notes
 
We thank Koss Inc. (Milwaukee, WI) for the production of custom MRI compatible electrostatic headphones used in this study; Jon Wieser and Kelly Williams for assistance with data processing and construction of the three-dimensional and flattened Talairach maps; and David Van Essen and Heather Drury for use of the cortical flattening algorithm. This work was supported by grants EY0676702 to J.W.L. and EY10244 and MH15358 to E.A.D.

Address correspondence to James Lewis, Ph.D., Department of Cell Biology, Neurobiology, and Anatomy, Medical College of Wisconsin, 8701 Watertown Plank Road, Milwaukee, WI 53226, USA. Email: james@mcw.edu


    References
 
Adelson EH, Bergen JR (1985) Spatiotemporal energy models for the perception of motion. J Opt Soc Am 2:284–299.[ISI][Medline]

Ahissar M, Ahissar E, Bergman H, Vaadia E (1992) Encoding of sound-source location and movement: activity of single neurons and interactions between adjacent neurons in the monkey auditory cortex. J Neurophysiol 67:203–215.[Abstract/Free Full Text]

Allman J, Miezin F, McGuiness E (1985) Stimulus specific responses from beyond the classical receptive field: neurophysiological mechanisms for local-global comparisons in visual neurons. Annu Rev Neurosci 8:407–430.[ISI][Medline]

Andersen RA (1995) Encoding of intention and spatial location in the posterior parietal cortex. Cereb Cortex 5:456–469.

Andersen RA (1997) Multimodal representation of space in the posterior parietal cortex and its use in planning movements. Annu Rev Neurosci 20:303–330.[ISI][Medline]

Azuma M, Suzuki H (1984) Properties and distribution of auditory neurons in the dorsolateral prefrontal cortex of the alert monkey. Brain Res 298:343–346.[ISI][Medline]

Bandettini PA, Jesmanowicz A, Wong EC, Hyde JS (1993) Processing strategies for functional MRI of the human brain. Magn Reson Med 30:161–173.[ISI][Medline]

Barch DM, Braver TS, Nystrom LE, Forman SD, Noll DC, Cohen JD (1997) Dissociating working memory from task difficulty in human prefrontal cortex. Neuropsychologia 35:1373–1380.[ISI][Medline]

Baumgart F, Gaschler-Markefski B, Woldorff MG, Heinze H-J, Scheich H (1999) A movement-sensitive area in auditory cortex. Nature 400:724–725.[ISI][Medline]

Beauchamp MS, Cox RW, DeYoe EA (1997a) Graded effects of spatial and featural attention on human area MT and associated motion processing areas. J Neurophysiol 78:516–520.[Abstract/Free Full Text]

Beauchamp MS, Cox RW, DeYoe EA (1997b) Gradients of attention in the human visual motion processing system. Soc Neurosci Abstr 23:457.

Benevento LA, Fallon J, Davis BJ, Rezak M (1977) Auditory–visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey. Exp Neurol 57:849–872.[ISI][Medline]

Binder JR, Frost JA, Hammeke TA, Bellgowan PSF, Rao SM, Cox RW (1999) Conceptual processing during the conscious resting state: a functional MRI study. J Cogn Neurosci 11:80–95.[Abstract/Free Full Text]

Binder JR, Frost JA, Hammeke TA, Cox RW, Rao SM, Prieto T (1997) Human brain language areas identified by functional magnetic resonance imaging. J Neurosci 17:353–362.[Abstract/Free Full Text]

Blaser E, Sperling G, Lu ZL (1999) Measuring the amplification of attention. Proc Natl Acad Sci USA 96:11681–11686.[Abstract/Free Full Text]

Boussaoud D, Ungerleider LG, Desimone R (1990) Pathways for motion analysis: cortical connections of the medial superior temporal and fundus of the superior temporal visual areas in the macaque. J Comp Neurol 296:462–495.[ISI][Medline]

Bruce C, Desimone R, Gross CG (1981) Visual properties of neurons in a polysensory area in superior temporal sulcus of the macaque. J Neurophysiol 46:369–384.[Free Full Text]

Brugge JF, Reale RA (1985) Auditory cortex. In: Cerebral cortex (Peters A, Jones EG, eds), pp. 229–271. New York: Plenum.

Bushara KO, Weeks RA, Ishii K, Catalan M-J, Tian B, Rauschecker JP, Hallett M (1999) Modality-specific frontal and parietal areas for auditory and visual spatial localization in humans. Nature Neurosci 2:759–766.[ISI][Medline]

Calvert GA, Brammer MJ (1999) FMRI evidence of a multimodal response in human superior temporal sulcus. NeuroImage 9:S1038.

Calvert G, Bullmore ET, Brammer MJ, Campbell R, Williams SCR, McGuire PK, Woodruff PWR, Eversen SD, Anthony S (1999) Activation of auditory cortex during silent lip reading. Science 276:593–596.[Abstract/Free Full Text]

Cavada C, Goldman-Rakic PS (1989) Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe. J Comp Neurol 287:422–445.[ISI][Medline]

Chubb C, Sperling G (1988) Drift-balanced random stimuli: a general basis for studying non-Fourier motion perception. J Opt Soc Am A Opt Image Sci 5:1986–2007.[ISI]

Chubb C, McGowan J, Sperling G, Werkoven P (1994) Non-Fourier motion analysis. In: Higher order processing in the visual system (Ciba Foundation Symposium), pp. 193–271. New York: Wiley.

Colby CL, Duhamel J, Goldberg ME (1993) Ventral intraparietal area of the macaque: anatomical location and visual response properties. J Neurophysiol 69:902–914.

Colby CL, Duhamel JR, Goldberg ME (1996) Visual, presaccadic, and cognitive activation of single neurons in monkey lateral intraparietal area. J Neurophysiol 76:2841–2852.[Abstract/Free Full Text]

Corbetta M (1998) Frontoparietal cortical networks for directing attention and the eye to visual locations: identical, independent, or overlapping neural systems? Proc Natl Acad Sci USA 95:831–838.[Abstract/Free Full Text]

Corbetta M, Miezin FM, Dobmeyer S, Shulman GL, Petersen SE (1991) Selective and divided attention during visual discrimination of shape, color, and speed: functional anatomy by positron emission tomography. J Neurosci 11:2383–2402.[Abstract]

Corbetta M, Miezin FM, Shulman GL, Petersen SE (1993) A PET study of visuospatial attention. J Neurosci 13:1202–1226.[Abstract]

Corbetta M, Akbudak E, Conturo TE, Snyder AZ, Ollinger JM, Drury HA, Linenweber MR, Petersen SE, Raichle ME, Van Essen DC, Shulman GL (1998) A common network of functional areas for attention and eye movements. Neuron 21:761–773.[ISI][Medline]

Courtney SM, Petit L, Maisog JM, Ungerleider LG, Haxby JV (1998) An area specialized for spatial working memory in human frontal cortex. Science 279:1347–1351.[Abstract/Free Full Text]

Cox RW (1996) AFNI: Software for analysis and visualization of functional magnetic resonance neuroimages. Comput Biomed Res 29:162–173.[ISI][Medline]

Culham JC, Brandt SA, Cavanagh P, Kanwisher NG, Dale AM, Tootell RBH (1998) Cortical fMRI activation produced by attentive tracking of moving targets. J Neurophysiol 80:2657–2670.[Abstract/Free Full Text]

Deiber M-P, Wise SP, Honda M, Catalan MJ, Grafman J, Hallett M (1997) Frontal and parietal networks for conditional motor learning: a positron emission tomography study. J Neurophysiol 78:977–991.[Abstract/Free Full Text]

Desimone R, Gross CG (1979) Visual areas in the temporal cortex of the macaque. Brain Res 178:363–380.[ISI][Medline]

Desimone R, Ungerleider LG (1986) Multiple visual areas in the caudal superior temporal sulcus of the macaque. J Comp Neurol 248:164–189.[ISI][Medline]

Desimone R, Ungerleider LG (1989) Neural mechanisms of visual processing in monkeys. In: Handbook of neuropsychology (Boller F, Grafman J, eds), pp. 267–299. Amsterdam: Elsevier.

DeYoe EA, Van Essen DC (1988) Concurrent processing streams in monkey visual cortex. Trends Neurosci 11:219–226.[ISI][Medline]

DeYoe EA, Bandettini P, Neitz J, Miller D, Winans P (1994) Functional magnetic resonance imaging (FMRI) of the human brain. J Neurosci Methods 54:171–187.[ISI][Medline]

Driver J, Spence C (1998) Attention and the crossmodal construction of space. Trends Cogn Sci 2:254–262.[ISI]

Drury HA, Van Essen DC, Anderson CH (1996) Computerized mappings of the cerebral cortex: a multiresolution flattening method and a surface-based coordinate system. J Cogn Neurosci 8:1–28.[ISI][Medline]

Duhamel J-R, Colby CL, Goldberg ME (1998) Ventral intraparietal area of the macaque: congruent visual and somatic response properties. J Neurophysiol 79:126–136.

Dupont P, Orban GA, De Bruyn B, Verbruggen A, Mortelmans L (1994) Many areas in the human brain respond to visual motion. J Neurophysiol 72:1420–1424.[Abstract/Free Full Text]

Ettlinger G, Wilson WA (1990) Cross-modal performance: behavioural processes, phylogenetic considerations and neural mechanisms. Behav Brain Res 40:169–192.[ISI][Medline]

Felleman DJ, Van Essen DC (1991) Distributed hierarchical processing in the primate cerebral cortex. Cereb Cortex 1:1–47.[Abstract]

Ferrera VP, Rudolph KK, Maunsell JHR (1994) Responses of neurons in the parietal and temporal visual pathways during a motion task. J Neurosci 14:6171–6186.[Abstract]

Fink GR, Frackowiak RSJ, Pietrzyk U, Passingham RE (1997) Multiple non-primary motor areas in the human cortex. J Neurophysiol 77:2164–2174.

Frith CD, Friston KJ (1996) The role of the thalamus in top down modulation of attention to sound. NeuroImage 4:210–215.[ISI][Medline]

Gaschler-Markefski B, Baumgart F, Tempelmann C, Woldorff MG, Scheich H (1998) Activation of human auditory cortex in retrieval experiments: an fMRI study. Neural Plasticity 6:69–75.[ISI][Medline]

Geiger B (1993) Three-dimensional modeling of human organs and its application to diagnosis and surgical planning. Technical report 2105. Institut National de Recherche en Informatique et Automatique.

Godschalk M, Lemon RN, Kuypers HGJM, Ronday HK (1984) Cortical afferents and efferents of monkey postarcuate area: an anatomical and electrophysiological study. Exp Brain Res 56:410–424.[ISI][Medline]

Graziano MSA, Reiss LAJ, Gross CG (1999) A neural representation of the location of nearby sounds. Nature 397:428–430.[ISI][Medline]

Griffiths TD, Bench CJ, Frackowiak RSJ (1994) Human cortical areas selectively activated by apparent sound movement. Curr Biol 4:892–895.[ISI][Medline]

Griffiths TD, Rees A, Witton C, Cross PM, Shakir RA, Green GGR (1997) Spatial and temporal auditory processing deficits following right hemisphere infarction. Brain 120:785–794.[Abstract]

Griffiths TD, Rees G, Rees A, Green GGR, Witton C, Rowe D, Buchel C, Turner R, Frackowiak RSJ (1998) Right parietal cortex is involved in the perception of sound movement in humans. Nature Neurosci 1:74–79.[ISI][Medline]

Hadjikhani N, Roland PE (1998) Cross-modal transfer of information between the tactile and the visual representations in the human brain: a positron emission tomographic study. J Neurosci 18:1072–1084.[Abstract/Free Full Text]

Haxby JV, Horwitz B, Ungerleider LG, Maisog JM, Pietrini P, Grady CL (1994) The functional organization of human extrastriate cortex: a PET–rCBF study of selective attention to faces and locations. J Neurosci 14:6336–6353.[Abstract]

Hazeltine E, Grafton ST, Ivry R (1997) Attention and stimulus characteristics determine the locus of motor-sequence encoding. A PET study. Brain 120:123–140.[Abstract]

Hikosaka K, Iwai E, Saito H, Tanaka K (1988) Polysensory properties of neurons in the anterior bank of the caudal superior temporal sulcus of the macaque monkey. J Neurophysiol 60:1615–1637.[Abstract/Free Full Text]

Iacoboni M, Woods RP, Mazziotta JC (1998) Bimodal (auditory and visual) left frontoparietal circuitry for sensorimotor integration and sensorimotor learning. Brain 121:2135–2143.[Abstract]

Jonides J, Smith EE, Koeppe RA, Awh E, Minoshima S, Mintun MA (1993) Spatial working memory in humans as revealed by PET. Nature 363:623–625.[ISI][Medline]

Kalaska JF, Crammond DJ (1995) Deciding not to GO: neuronal correlates of response selection in a GO/NOGO task in primate premotor and parietal cortex. Cereb Cortex 5:410–428.[Abstract]

Kastner S, Pinsk MA, De Weerd P, Desimone R, Ungerleider LG (1999) Increased activity in human visual cortex during directed attention in the absence of visual stimulation. Neuron 22:751–761.[ISI][Medline]

Kawashima R, O'Sullivan BT, Roland PE (1995) Positron-emission tomography studies of cross-modality inhibition in selective attentional tasks: closing the ‘mind's eye’. Proc Natl Acad Sci USA 92:5969–5972.[Abstract/Free Full Text]

Kennedy H, Bullier J (1985) A double-labeling investigation of the afferent connectivity to cortical areas V1 and V2 of the macaque monkey. J Neurosci 5:2815–2830.[Abstract]

Knudsen EI, Konishi M (1978) A neural map of auditory space in the owl. Science 200:795–797.[ISI][Medline]

LaBar KS, Gitelman DR, Parrish TB, Mesulam M-M (1999) Neuroanatomic overlap of working memory and spatial attention networks: a functional MRI comparison within subjects. NeuroImage 10:695–704.[ISI][Medline]

Law I, Svarer C, Hom S, Paulson OB (1997) The activation pattern in normal humans during suppression, imagination and performance of saccadic eye movements. Acta Physiol Scand 161:419–434.[ISI][Medline]

Leinonen L, Hyvärinen J, Sovijärvi ARA (1980) Functional properties of neurons in the temporo-parietal association cortex of awake monkey. Exp Brain Res 39:203–215.[ISI][Medline]

Lewis JW (1997) The intraparietal sulcus of the macaque and connected cortical regions: anatomical parcellation and connections throughout the hemisphere. Doctoral dissertation, California Institute of Technology.

Lewis JW, DeYoe EA (1998a) Cortical activation and suppression in response to sound and visual motion processing. NeuroImage 7:S378.

Lewis JW, DeYoe EA (1998b) Cortical interaction of visual and auditory motion processing. Soc Neurosci Abstr 24:529.

Lewis JW, Van Essen DC (2000) Cortico-cortical connections of visual, sensorimotor, and multimodal processing areas in the parietal lobe of the macaque monkey. J Comp Neurol (in press).

Linden JF, Grunewald A, Andersen RA (1996) Auditory sensory responses in area LIP? Soc Neurosci Abstr 22:398.

Luna B, Thulborn KR, Strojwas MH, McCurtain BJ, Berman RA, Genovese CR, Sweeney JA (1998) Dorsal cortical regions subserving visually guided saccades in humans: an fMRI study. Cereb Cortex 8:40–47.[Abstract]

Mäkelä JP, McEvoy L (1996) Auditory evoked fields to illusory sound source movements. Exp Brain Res 110:446–453.[ISI][Medline]

Maunsell JHR, Van Essen DC (1983) The connections of the middle temporal visual area (MT) and their relationship to a cortical hierarchy in the macaque monkey. J Neurosci 3:2563–2586.[Abstract]

Mazzoni P (1994) Spatial perception and movement planning in the posterior parietal cortex. Doctoral dissertation, University of California, San Diego.

McCarthy G, Blamire AM, Puce A, Nobre AC, Bloch G, Hyder F, Goldman-Rakic P, Shulman RG (1994) Functional magnetic resonance imaging of human pre-frontal cortex activation during a spatial working memory task. Proc Natl Acad Sci USA 91:8690–8694.[Abstract]

McCarthy G, Spicer M, Adrignolo A, Luby M, Gore J, Allison T (1995) Brain activation associated with visual motion studied by functional magnetic resonance imaging in humans. Hum Brain Map 2:234–243.

Mesulam M-M (1998) From sensation to cognition. Brain 121:1013–1052.[Abstract]

Mikami A, Newsome WT, Wurtz RH (1986) Motion selectivity in macaque visual cortex. I. Mechanisms of direction and speed selectivity in extrastriate area MT. J Neurophysiol 55:1308–1327.[Abstract/Free Full Text]

Murray SO, Newman AJ, Roder B, Mitchell TV, Takahashi T, Neville HJ (1998) Functional organization of auditory motion processing in humans using fMRI. Soc Neurosci Abstr 24:1401.

Newsome WT, Pare EB (1988) A selective impairment of motion perception following lesions of the middle temporal visual area (MT). J Neurosci 8:2201–2211.[Abstract]

Newsome WT, Mikami A, Wurtz RH (1986) Motion selectivity in macaque visual cortex. III. Psychophysics and physiology of apparent motion. J Neurophysiol 55:1340–1351.[Abstract/Free Full Text]

O'Leary DS, Andreasen NC, Hurtig RR, Torres IJ, Flashman LA, Kesler ML, Arndt SV, Cizadlo TJ, Ponto LLB, Watkins GL, Hichwa RD (1997) Auditory and visual attention assessed with PET. Hum Brain Map 5:422–436.[ISI]

Ono M, Kubik S, Abernathey CD (1990) Atlas of the cerebral sulci. New York: Thieme.

Orban GA, Dupont P, De Bruyn B, Vogels R, Vandenberghe R, Mortelmans L (1995) A motion area in human visual cortex. Proc Natl Acad Sci USA 92:993–997.[Abstract]

Orban GA, Kennedy H, Bullier J (1986) Velocity sensitivity and direction selectivity of neurons in areas V1 and V2 of the monkey: influence of eccentricity. J Neurophysiol 56:462–480.[Abstract/Free Full Text]

Pandya DN (1995) Anatomy of the auditory cortex. Rev Neurol 151:486–494.[Medline]

Pandya DN, Kuypers HGJM (1969) Cortico-cortical connections in the rhesus monkey. Brain Res 13:13–36.[ISI][Medline]

Pardo JV, Fox PT, Raichle ME (1991) Localization of a human system for sustained attention by positron emission tomography. Nature 349:61–64.[ISI][Medline]

Pardo JV, Pardo PJ, Janer KW, Raichle ME (1990) The anterior cingulate cortex mediates processing selection in the Stroop attentional conflict paradigm. Proc Natl Acad Sci USA 87:256–259.[Abstract]

Paus T, Marrett S, Worsley KJ, Evans AC (1995) Extraretinal modulation of cerebral blood flow in the human visual cortex: implications for saccadic suppression. J Neurophysiol 74:2179–2183.[Abstract/Free Full Text]

Penhune VB, Zatorre RJ, MacDonald JD, Evans AC (1996) Interhemispheric anatomical differences in human primary auditory cortex: probabilistic mapping and volume measurement from magnetic resonance scans. Cereb Cortex 6:661–672.[Abstract]

Petersen SE, Van Mier H, Fiez JA, Raichle ME (1998) The effects of practice on the functional anatomy of task performance. Proc Natl Acad Sci USA 95:853–860.[Abstract/Free Full Text]

Petit L, Courtney SM, Ungerleider LG, Haxby JV (1998) Sustained activity in the medial wall during working memory delays. J Neurosci 18:9429–9437.[Abstract/Free Full Text]

Petit L, Dubois S, Tzourio N, Dejardin S, Crivello F, Michel C, Etard O, Denise P, Roucoux A, Mazoyer B (1999) PET study of the human foveal fixation system. Hum Brain Map 8:28–43.[ISI][Medline]

Phillips DP, Brugge JF (1985) Progress in neurophysiology of sound localization. Annu Rev Psychol 36:245–274.[ISI][Medline]

Posner MI, Petersen SE (1990) The attention system of the human brain. Annu Rev Neurosci 13:25–42.[ISI][Medline]

Posner MI, Walker JA, Friedrich FA, Rafal RD (1987) How do the parietal lobes direct covert attention? Neuropsychologia 25:135–145.[ISI][Medline]

Posner MI, Petersen SE, Fox PT, Raichle ME (1988) Localization of cognitive functions in the human brain. Science 240:1627–1631.[ISI][Medline]

Pugh KR, Shaywitz BA, Shaywitz SE, Fulbright RK, Byrd D, Skudlarski P, Shankweiler DP, Katz L, Constable RT, Fletcher J, Lacadie C, Narchione K, Gore JC (1996) Auditory selective attention: an fMRI investigation. NeuroImage 4:159–173.[ISI][Medline]

Rao SC, Rainer G, Miller EK (1997) Integration of what and where in the primate prefrontal cortex. Science 276:821–824.[Abstract/Free Full Text]

Reale RA, Brugge JF (1990) Auditory cortical neurons are sensitive to static and continuously changing interaural phase cues. J Neurophysiol 64:1247–1260.[Abstract/Free Full Text]

Rizzo M, Nawrot M, Zihl J (1995) Motion and shape perception in cerebral akinetopsia. Brain 118:1105–1127.[Abstract]

Romanski LM, Bates JF, Goldman-Rakic PS (1999) Auditory belt and parabelt projections to the prefrontal cortex in the rhesus monkey. J Comp Neurol 403:141–157.[ISI][Medline]

Sheliga BM, Riggio L, Rizzolatti G (1995) Spatial attention and eye movements. Exp Brain Res 105:261–275.[ISI][Medline]

Shulman GL, Corbetta M, Buckner RL, Raichle ME, Fiez JA, Miezin FM, Petersen SE (1997) Top-down modulation of early sensory cortex. Cereb Cortex 7:193–206.[Abstract]

Snyder LH, Grieve KL, Brotchie P, Andersen RA (1998) Separate body- and world-referenced representations of visual space in parietal cortex. Nature 394:887–891.[ISI][Medline]

Sovijärvi AR, Hyvärinen J (1974) Auditory cortical neurons in the cat sensitive to the direction of sound source movement. Brain Res 73:455–471.[ISI][Medline]

Spitzer MW, Semple MN (1993) Responses of inferior colliculus neurons to time-varying interaural phase disparity: effects of shifting the locus of virtual motion. J Neurophysiol 69:1245–1263.[Abstract/Free Full Text]

Stein BE, Meredith MA, Wallace MT (1993) The visually responsive neuron and beyond: Multisensory integration in cat and monkey. Progr Brain Res 95:79–90.[ISI][Medline]

Stein BE, Wallace MT (1996) Comparisons of cross-modality integration in midbrain and cortex. Progr Brain Res 112:289–299.[ISI][Medline]

Stricanne B, Andersen RA, Mazzoni P (1996) Eye-centered, head-centered, and intermediate coding of remembered sound locations in area LIP. J Neurophysiol 76:2071–2076.[Abstract/Free Full Text]

Stumpf E, Toronchuk JM, Cynader MS (1992) Neurons in cat primary auditory cortex sensitive to correlates of auditory motion in three-dimensional space. Exp Brain Res 88:158–168.[ISI][Medline]

Suga N (1994) Multi-function theory for cortical processing of auditory information: implications of single-unit and lesion data for future research. J Comp Physiol 175(2):135–144.

Sunaert S, Van Hecke P, Marchal G, Orban GA (1999) Motion-responsive regions of the human brain. Exp Brain Res 127:355–370.[ISI][Medline]

Takahashi TT, Keller CH (1992) Simulated motion enhances neuronal selectivity for a sound localization cue in background noise. J Neurosci 12:4381–4390.[Abstract]

Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain. New York: Thieme.

Talavage TM, Ledden PJ, Sereno MI, Rosen BR, Dale AM (1997) Multiple phase-encoded tonotopic maps in human auditory cortex. NeuroImage 5:S8.

Tanila H, Carlson S, Linnankoski I, Lindroos F, Kahila H (1992) Functional properties of dorsolateral prefrontal cortical neurons in awake monkey. Exp Brain Res 47:169–180.

Tootell RBH, Mendola JD, Hadjikhani NK, Ledden PJ, Liu AK, Reppas JB, Sereno MI, Dale AM (1997) Functional analysis of V3A and related areas in human visual cortex. J Neurosci 17:7060–7078.

Tootell RBH, Reppas JB, Dale AM, Look RB, Sereno MI, Malach R, Brady TJ, Rosen BR (1995a) Visual motion aftereffect in human cortical area MT revealed by functional magnetic resonance imaging. Nature 375:139–141.[ISI][Medline]

Tootell RBH, Reppas JB, Kwong KK, Malach R, Born RT, Brady TJ, Rosen BR, Belliveau JW (1995b) Functional analysis of human MT and related visual cortical areas using magnetic resonance imaging. J Neurosci 15:3215–3230.[Abstract]

Toronchuk JM, Stumpf E, Cynader MS (1992) Auditory cortex neurons sensitive to correlates of auditory motion: underlying mechanisms. Exp Brain Res 88:169–180.[ISI][Medline]

Tzourio N, El Massioui F, Crivello F, Joliot M, Renault B, Mazoyer B (1997) Functional anatomy of human auditory attention studied with PET. NeuroImage 5:63–77.[ISI][Medline]

Ungerleider LG, Desimone R (1986) Projections to the superior temporal sulcus from the central and peripheral field representations of V1 and V2. J Comp Neurol 248:147–163.[ISI][Medline]

Vaadia E, Benson DA, Hienz RD, Goldstein MHJ (1986) Unit study of monkey frontal cortex: active localization of auditory and of visual stimuli. J Neurophysiol 56:934–952.[Abstract/Free Full Text]

Van Buren JM, Borke RC (1972) Variations and connections of the human thalamus. New York: Springer-Verlag.

Van Essen DC, Drury HA, Joshi S, Miller MI (1998) Functional and structural mapping of human cerebral cortex: solutions are in the surfaces. Proc Natl Acad Sci USA 95:788–795.[Abstract/Free Full Text]

Van Oostende S, Sunaert S, Van Hecke P, Marchal G, Orban GA (1997) The kinetic occipital (KO) region in man: an fMRI study. Cereb Cortex 7:690–701.[Abstract]

Ward LM (1994) Supramodal and modality-specific mechanisms for stimulus-driven shifts of auditory and visual attention. Can J Exp Psychol 48:242–259.[ISI][Medline]

Watanabe J, Iwai E (1991) Neuronal activity in visual, auditory and polysensory areas in the monkey temporal cortex during visual fixation task. Brain Res Bull 26:583–592.[Medline]

Wessinger CM, Buonocore MH, Kussmaul CL, Mangun GR (1997) Tonotopy in human auditory cortex examined with functional magnetic resonance imaging. Hum Brain Map 5:18–25.[ISI]

Wilson FAW, Scalaidhe SPO, Goldman-Rakic PS (1993) Dissociation of object and spatial processing domains in primate prefrontal cortex. Science 260:1955–1958.[ISI][Medline]

Wise SP, Boussaoud D, Johnson PB, Caminiti R (1997) Premotor and parietal cortex: Corticocortical connectivity and combinatorial computations. Annu Rev Neurosci 20:25–42.[ISI][Medline]

Wise SP, di Pellegrino G, Boussaoud D (1996) The premotor cortex and nonstandard sensorimotor mapping. Can J Physiol Pharmacol 74:469–482.[ISI][Medline]

Yeterian EH, Pandya DN (1989) Thalamic connections of the cortex of the superior temporal sulcus in the rhesus monkey. J Comp Neurol 282:80–97.[ISI][Medline]

Zeki S, Watson JDG, Lueck CJ, Friston KJ, Kennard C, Frackowiak RSJ (1991) A direct demonstration of functional specialization in human visual cortex. J Neurosci 11:641–649.[Abstract]

Zeki S, Watson DG, Frackowiak RSJ (1993) Going beyond the information given: the relation of illusory visual motion to brain activity. Proc R Soc Lond B 252:215–222.[ISI][Medline]

Zihl J, Von Cramon D, Mai N (1983) Selective disturbance of movement vision after bilateral brain damage. Brain 106:313–340.[Abstract]

Zihl J, Von Cramon D, Mai N, Schmid CH (1991) Disturbance of movement vision after bilateral posterior brain damage. Brain 114:2235–2252.[Abstract]