Although we describe the neural processes underlying reach and grasp separately, the two actions are usually coordinated. Coordination is achieved through reciprocal axonal connections between reach- and grasp-related populations, both within the same cortical areas and between different areas, and through populations of neurons that discharge in connection with components of both reach and grasp.
Space Is Represented in Several Cortical Areas with Different Sensory and Motor Properties
The planning of a reaching movement is usually defined as the neural process by which the location of an object in space is translated into an arm movement that brings the hand into contact with the object. Our intuitive conception of space as a single continuous expanse—one that extends in all directions and within which objects have locations relative to one another and to ourselves—has long influenced neuroscience.
According to classical neurology the neural counterpart of the space that we experience is a single map in the parietal lobe constructed by inputs from different sensory modalities. This unified, multimodal neural replica of the world is assumed to provide all the information necessary for acting on an object and is shared by the different motor circuits that control the eyes, arm, hand, and other effectors.
An alternative view is that there are many maps each related to a different motor effector and adapted to its specific needs. These spatial representations are created when the individual interacts with its environment, defining a series of motor relations determined by the properties of a particular effector. For example, a rodent has a locomotion map in the hippocampus and adjacent entorhinal cortex representing the animal's current location and direction of motion. This alternative hypothesis suggests that our intuitive sense of space arises at least in part from our motor interactions with the world.
Evidence collected in recent years clearly does not support the notion of a single topographically organized representation of space in the parietal cortex. First, the parietal cortex is organized as a series of areas working in parallel. Second, near space or peripersonal space, the space within our reach, is encoded in areas different from those that represent far space, the space beyond our reach. Third, the functional properties of the neurons in parietal and frontal areas of cortex involved in spatial coding vary depending on the body part controlled, such as the eyes versus the arm.
These findings support the idea that there are many spatial maps, some located in the parietal cortex and others in the frontal cortex, whose properties are tuned to the motor requirements of different effectors. Moreover, the spatial maps in each cortical area are not maps in the usual sense of a faithful point-to-point representation of surrounding space, but rather dynamic maps that may expand or shrink according to the motor requirements necessary to interact with a given stationary or moving object.
The Inferior Parietal and Ventral Premotor Cortex Contain Representations of Peripersonal Space
In monkeys several areas in the inferior parietal cortex and interconnected parts of the premotor cortex contain representations of peripersonal space. One such area, the ventral intraparietal area, is located in the fundus of the intraparietal sulcus (Figure 38–4A). It receives visual projections from components of the dorsal visual stream, including areas MST (medial superior temporal cortex) and MT (middle temporal cortex), that are involved in the analysis of optic flow and visual motion.
Separate parietofrontal pathways are involved in the visuomotor transformations for reaching and grasping.
A. The visuomotor transformation necessary for reaching is mediated by the parietofrontal network shown here. The areas located within the intraparietal sulcus are shown in an unfolded view of the sulcus. Two parallel pathways are involved in the organization of reaching movements. The ventral stream has its principal nodes in the ventral intraparietal area (VIP) and area F4 of the ventral premotor cortex, whereas the dorsal stream has synaptic relays in the superior parietal lobe (MIP, V6A) and the dorsal premotor cortex (PMd), which includes area F2. (Parietal areas include AIP, anterior intraparietal area; LIP, lateral intraparietal area; and V6A, the parietal portion of the parieto-occipital area.) PEc and PEip are parietal areas according to the nomenclature of von Economo. Somatosensory areas 1, 2, and 3 and area PE, which provide somatosensory input to M1 (F1), are not shown in the figure. Precentral areas include F5, a subdivision of PMv, the ventral premotor cortex, and the primary motor cortex (M1, F1).
B. The visuomotor transformation necessary for grasping is mediated by the parietofrontal network shown here. The AIP and PFG areas are concerned mostly with hand movements, whereas area PF is concerned with mouth movements. PF and PFG are parietal areas according to the nomenclature of von Economo. Area F5 in PMv is concerned with both hand and mouth motor acts. Some grasping neurons have been found in F2, the ventral part of PMd. Area M1 (or F1) contains a large sector that controls the fingers, hand, and wrist (see Figure 37–2A). Other abbreviations are explained in part A.
Some ventral intraparietal neurons respond only to visual stimuli, preferring either expanding (looming) or contracting (receding) stimuli or stimuli moving in the horizontal or vertical plane. Others have polymodal receptive fields within which inputs from different sensory modalities lie in spatial register (Figure 38–5A). These neurons respond to tactile stimuli, most often near the mouth or on the face but also on the arm or trunk, as well as to visual stimuli located immediately adjacent to the tactile receptive field. Some even respond to auditory stimuli in the same spatial location. Certain polymodal neurons respond to both visual and tactile stimuli moving in the same direction, whereas others are strongly activated by visual stimuli that move toward the tactile receptive field, but only if the path of motion will eventually intersect it.
Some neurons in the parietal and premotor cortex respond to both tactile and visual stimuli within receptive fields that are spatially in register.
A. Some neurons in the ventral intraparietal cortex have tactile and visual receptive fields that are aligned in a congruent manner. Orange areas on the monkey represent tactile receptive fields; purple areas on the screen in front of the monkey's face and centered on its nose represent visual receptive fields. Many of the neurons also share directional preferences for movement of tactile and visual stimuli (arrows). (Reproduced, with permission, from Duhamel, Colby, and Goldberg 1998.)
B. Neurons in ventral premotor cortex area F4 respond to either tactile or visual stimulation. Orange areas are tactile receptive fields; purple lines indicate the three-dimensional receptive fields within which visual stimuli activate the neuron. (Reproduced, with permission, from Fogassi et al. 1996.)
Ventral intraparietal neurons appear to represent an early stage in the construction of a peripersonal spatial map that is more fully expressed in a caudal part of the ventral premotor cortex, area F4, with which it is strongly interconnected. Virtually all neurons in area F4 respond to somatosensory inputs, especially tactile stimuli. The tactile receptive fields are located primarily on the face, neck, arms, and hands. Half of the neurons also respond to visual stimuli and a few to auditory stimuli.
As with ventral intraparietal neurons, the modality-specific receptive fields in area F4 lie in register (Figure 38–5B). This suggests that the visual receptive fields are not defined by the location of the visual stimulus on the retina, as in most neurons in the visual cortex, but are anchored to specific parts of the individual's body. One striking feature of such a polymodal neuron, especially in the ventral premotor cortex, is that its visual receptive field remains aligned with the tactile receptive field when the monkey looks in different directions, but moves with the tactile receptive field to a different part of peripersonal space when the monkey moves the corresponding part of its body.
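The distinction between retinocentric and body-anchored receptive fields can be expressed schematically. In the hypothetical sketch below, the coordinate frames, body parts, and numbers are illustrative assumptions, not data from the recordings: a body-anchored field keeps its position when gaze shifts but moves when the body part moves.

```python
# Hypothetical sketch: coordinate frames and values are illustrative
# assumptions, not data from the recordings described in the text.

def retinocentric_rf(gaze_xy, offset_xy):
    """Receptive-field center defined relative to the fixation point:
    it moves whenever gaze moves."""
    return (gaze_xy[0] + offset_xy[0], gaze_xy[1] + offset_xy[1])

def body_anchored_rf(part_xy, offset_xy):
    """Receptive-field center defined relative to a body part (say,
    the forearm): it moves with the body part, not with gaze."""
    return (part_xy[0] + offset_xy[0], part_xy[1] + offset_xy[1])

arm = (0.0, -1.0)
offset = (0.5, 0.0)

rf_before = body_anchored_rf(arm, offset)
# Shifting gaze leaves the body-anchored field in place...
rf_after_gaze_shift = body_anchored_rf(arm, offset)
# ...but moving the arm carries the field with it.
rf_after_arm_move = body_anchored_rf((2.0, -1.0), offset)

print(rf_before, rf_after_arm_move)  # (0.5, -1.0) (2.5, -1.0)
```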
Nevertheless, area F4 is a motor area and its neurons also discharge in association with movements, most often of the arm, wrist, neck, and face. The neurons in this area control movements of the head and arm toward different parts of the body, or toward objects close to the body, to permit the animal to grasp them with its mouth or hand. Some neurons discharge during the entire action of bringing the hand to the mouth and opening the mouth to ingest food, as well as during arm reaching and associated neck- and trunk-orienting movements. Activity in other neurons is correlated not only with reaching but also with other behaviors such as the avoidance of threatening stimuli. The sensory representation of peripersonal space in area F4 contributes to the planning and execution of those behaviors.
The Superior Parietal Cortex Uses Sensory Information to Guide Arm Movements Toward Objects in Peripersonal Space
A key requirement for efficient reaching is knowledge of where the arm is before and during the action. Lesion studies suggest that this information is represented in Brodmann's area 2 of the primary somatosensory cortex (S-I) and in the superior parietal lobule. Patients with lesions of these regions are unable to reach toward objects efficiently, even though they do not have the deficits of spatial perception, such as spatial neglect, that are typical of lesions in the inferior parietal lobe (see Chapter 19).
Although single-neuron studies confirm the role of these areas in providing information about arm location, there are clear functional differences between the two areas. Neurons in area 2 usually respond to tactile input from a limited part of the body or to movements of a single joint or a few adjacent joints in specific directions, most commonly on the contralateral side of the body. In contrast, many neurons in the superior parietal lobule discharge during combined movements of multiple joints, the assumption of specific postures, or movements of the limbs and the body. Some cells also respond during combined movements of the arms and hind limbs or bilateral movements of both arms.
These findings indicate that, unlike neurons in area 2 that encode the positions and movements of specific parts of the body, neurons in the superior parietal lobe integrate information on the positions of individual joints as well as the positions of limb segments with respect to the body. This integration creates a body schema that provides information on where the arm is located with respect to the body and how the different arm segments are positioned with respect to one another. This schema provides fundamental information for the proprioceptive guidance of arm movements.
More posterior and medial sectors of the superior parietal cortex also receive input from areas V2 and V3 of the extrastriate visual cortex. Important nodes in this network include areas V6A and PEc and a reach-related area of parietal cortex described by Richard Andersen and colleagues, which most likely corresponds to the medial intraparietal area (MIP) and nearby parts of the superior and inferior parietal cortex (see Figure 38–4A). In these areas the spatial representation for reaching is not based on body-centered coordinates. For example, neurons in V6A and PEc often signal the retinal location of possible targets for reaching, but their activity is also strongly modulated by complex combinations of inputs related to the direction of gaze and the current arm posture and hand position.
Andersen and his associates propose that the reach-related region of parietal cortex is particularly important for specifying the goal or target of reaching but not how the action should be performed. The activity of many neurons in this area varies with the location of the target relative to the hand. Remarkably, however, this motor error signal is not centered on the current location of the hand or target but rather on the current direction of gaze. Each time the monkey looks in a different direction the reach-related activity in the neurons changes (Figure 38–6). In contrast, the reach-related activity of many neurons in area PEip is less gaze-centered and more related to the current hand position and arm posture.
Neurons in the parietal reach area encode target location in eye-centered coordinates.
An upright board contains an array of pushbuttons. The four panels show the possible behavioral conditions at the beginning of a trial. The initial hand position and point of visual fixation are indicated by the green and orange buttons, respectively. Histograms of activity in a single neuron are arranged to correspond to the locations of the buttons on the board that serve as the target of a reaching movement from the start position in different trials. The firing pattern of this neuron does not vary with changes in initial limb position (A, B), but shifts with a change in the initial direction of gaze (C, D). The neuron thus signals the target location relative to the current direction of gaze, independent of the direction of arm movement required to reach the target. (Modified, with permission, from Andersen and Buneo 2002.)
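The eye-centered coding illustrated in the figure can be captured in a toy computation. In the sketch below, the coordinates and values are illustrative assumptions, not experimental data: the gaze-centered reach signal is unchanged by the initial hand position but shifts with the direction of gaze.

```python
import numpy as np

# Hypothetical toy computation (coordinates and numbers are
# illustrative, not from the experiment): a gaze-centered reach
# signal encodes the target relative to the fixation point.

def eye_centered_error(target_xy, gaze_xy):
    """Target location in eye-centered coordinates."""
    return np.subtract(target_xy, gaze_xy)

def hand_centered_error(target_xy, hand_xy):
    """Vector the hand must travel; depends on hand position, not gaze."""
    return np.subtract(target_xy, hand_xy)

target = np.array([3.0, 1.0])

# Conditions A and B of the figure: gaze fixed, initial hand position varies.
same_gaze = [eye_centered_error(target, gaze_xy=np.array([0.0, 0.0]))
             for hand in (np.array([-2.0, 0.0]), np.array([2.0, 0.0]))]
# The eye-centered signal is identical despite the different hand positions.
assert np.array_equal(same_gaze[0], same_gaze[1])

# Conditions C and D: hand fixed, gaze shifts; the signal changes.
shifted = eye_centered_error(target, gaze_xy=np.array([2.0, 0.0]))
print(same_gaze[0], shifted)  # [3. 1.] [1. 1.]
```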
Another important property of neurons in the parietal reach region is that they respond not only to passive sensory inputs but also before the onset of movements and during the planning period of delayed-reaching tasks. This behavior indicates that these neurons receive centrally generated signals about motor intentions prior to movement onset, likely through their reciprocal connections with precentral motor areas. Recent theoretical and experimental findings suggest that this combination of peripheral sensory and central motor inputs permits the parietal reach region to integrate sensory input with efference copies of outgoing motor commands to compute a continuously updated estimate of the current arm state and a prediction about how the arm will respond to the motor command. This forward internal model of the arm could be used to make rapid corrections for errors in ongoing arm movements and to acquire motor skills.
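The predictor-corrector logic of such a forward internal model can be sketched in a few lines. The gains, dynamics, and noise-free feedback below are illustrative assumptions rather than a model proposed in the text: the state estimate is advanced by an efference copy of the motor command and then corrected by sensory feedback.

```python
# Minimal, hypothetical sketch of a forward internal model; the gain,
# time step, and scalar dynamics are assumptions for illustration.

def forward_model_step(x_est, u, y_obs, k=0.3, dt=0.1):
    """One update of a scalar estimate of arm position.
    x_est : current estimate of arm position
    u     : motor command, treated as commanded velocity (efference copy)
    y_obs : sensory measurement of arm position
    k     : correction gain (0 = pure prediction, 1 = trust sensors fully)
    """
    x_pred = x_est + u * dt                # predict from efference copy
    return x_pred + k * (y_obs - x_pred)   # correct with the sensory error

# Track a commanded movement (noise-free here for simplicity).
x_true, x_est = 0.0, 0.0
for _ in range(10):
    u = 1.0                 # constant commanded velocity
    x_true += u * 0.1       # the actual arm
    x_est = forward_model_step(x_est, u, y_obs=x_true)

print(round(x_est, 3))  # 1.0 -- the estimate tracks the true position
```

Because the estimate is available before delayed sensory feedback arrives, such a predictor could support the rapid on-line corrections described above.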
The functional properties of areas in the superior parietal cortex concerned with reaching suggest an intriguing explanation of the clinical phenomenon of optic ataxia. Patients with a lesion of the superior parietal cortex have difficulty with visually guided arm movements toward an object. The arm makes errors in the frontal or sagittal plane, groping for the target until it encounters the object almost by chance. The deficit is severe when the target is in the peripheral part of the visual field, less severe when the target lies in the parafoveal region, and negligible when the patient fixates the target. The symptoms of optic ataxia may result from failure of the neural circuits that convert sensory information about targets and the arm into motor plans or from failure of the circuits that contribute to a predictive forward model of the arm's current state.
Premotor and Primary Motor Cortex Formulate More Specific Motor Plans About Intended Reaching Movements
The reach-related areas of the parietal cortex are reciprocally connected to several precentral motor areas, including the primary motor cortex, dorsal and ventral premotor cortex, and supplementary motor area. Neurons in all of these areas contribute to sensorimotor transformations that provide increasingly detailed information about the desired spatial kinematics and causal mechanical details of the movements.
For example, the reach-related neurons in the dorsal premotor cortex are much less strongly influenced by the direction of gaze than are neurons in the parietal reach area. Instead they are driven by the direction of the intended reaching movements during the planning period of delayed-reaching tasks and during the reaching movement itself. Furthermore, during the planning period many dorsal premotor neurons signal the direction of movement to the target whether the left or right arm is used to reach for the target (Figure 38–7). This finding suggests that the premotor neurons represent the appropriate extrinsic spatial kinematics of the reaching movement independent of the arm that will perform it. In contrast, the activity of most reach-related neurons in the primary motor cortex is related to movement of the contralateral arm.
Reaching movement is represented differently in the premotor and primary motor cortex during planning and execution of the movement.
(Modified, with permission, from Cisek, Crammond, and Kalaska 2003.)
A. Activity of a dorsal premotor cortex neuron in a monkey during an instructed-delay reaching task. The animal is trained to reach for targets in eight directions from a central starting position using either arm. During testing one arm is contralateral and one arm is ipsilateral to the recording site. During the planning period—the time between the presentation of the target cue and the delayed onset of movement—the neuron is directionally tuned with a preference for rightward movements. The directional tuning is identical whether the left or right arm is used. The neuron is relatively inactive during movement execution. In each raster plot the left vertical line indicates presentation of the target cue, and the right vertical line indicates the onset of arm movement. The thick ticks to the left and right of the movement-onset line in each trial indicate, respectively, presentation of the go cue and the end of movement.
B. Activity of a primary motor cortex neuron during the same task as in part A. The neuron is strongly active and directionally tuned toward the lower targets when the contralateral arm is used but only during the execution phase. It is essentially inactive when the ipsilateral arm is used.
In other studies a monkey was trained to make arm movements to move a cursor on a computer monitor. In some trials the motions of the arm and cursor were collinear. In other trials they were decoupled in one of three ways: by rotating the cursor motion 90 degrees relative to the arm movement, by applying a mirror-image transformation, or by requiring the monkey to make elliptical motions of its arm to draw a circle with the cursor. Some neurons, especially in the primary motor cortex, signaled the motions of the arm in both the collinear and decoupled conditions. Other neurons, concentrated in the dorsal and ventral premotor cortex, signaled the desired motions of the cursor under the different visuomotor conditions.
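The three decouplings can be written as simple two-dimensional mappings from hand displacement to cursor displacement. The axis conventions and scale factors in this sketch are illustrative assumptions, not the parameters used in the experiments.

```python
import numpy as np

# Hypothetical sketch of the three arm-to-cursor decouplings as 2-D
# mappings; axis conventions and gains are illustrative assumptions.

def rotated(v, deg=90.0):
    """Cursor motion rotated relative to arm motion."""
    t = np.deg2rad(deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return rot @ np.asarray(v, dtype=float)

def mirrored(v):
    """Mirror-image transformation about the vertical axis."""
    return np.array([-v[0], v[1]], dtype=float)

def unequal_gain(v, sx=0.5, sy=1.0):
    """Unequal axis gains: with these gains, an elliptical arm path
    (2:1 aspect ratio) maps onto a circular cursor path."""
    return np.array([sx * v[0], sy * v[1]])

rightward = np.array([1.0, 0.0])
print(rotated(rightward).round(3))  # arm right -> cursor up
print(mirrored(rightward))          # arm right -> cursor left
```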
These findings indicate that premotor cortex neurons can generate an abstract representation of the goal of the motor output, in this case the motion of the cursor that the monkey was moving, independent of the arm movements that control the cursor's motion. Other neurons in the premotor and primary motor cortex translate that abstract representation into signals about what the arm must do to produce the desired cursor movements.
Although these results suggest that the motor system initially plans reaching movements in extrinsic spatial coordinates, we move by contracting muscles. Many neurophysiological studies have therefore sought the neural correlates of the transformation of a desired spatiotemporal movement pattern into its causal forces and muscle activity. The consensus is that the primary motor cortex plays an important role in that transformation (see Chapter 37). However, the final motor command for the muscle-activity patterns required to execute the desired reaching movement is probably generated by spinal motor circuits.
In summary, neurophysiological studies have provided support for the general hypothesis that reaching movements involve neuronal processes that implement a sequence of transformations between sensory input and motor output. These processes occur in a dynamic, distributed network of cortical areas rather than in a strictly serial pathway. There are no abrupt transitions of cellular properties between cortical areas; instead there is a progression. Neural correlates of each putative transformation can be seen in both parietal and precentral areas, whose true nature and functions are still not fully known (Box 38–1).
Box 38–1 The Cortical Motor System Does Not Solve Newtonian Equations
Understanding the cortical mechanisms underlying the planning and execution of reaching movements requires insight into how single neurons and neuronal populations encode different properties of intended movements and how they transform that information into motor commands.
For many years the study of the cerebral cortical mechanisms of motor control has been guided by terminology and concepts borrowed from physics, engineering, and control theory. Many studies have therefore sought and found statistical correlations between the activity of neurons in movement-related cortical areas and such movement-related parameters as target location, the velocities of hand displacement and reach trajectory, motor output force, and joint torque.
It is unlikely, however, that the motor system controls movements by encoding them in the familiar but arbitrary terms of Newtonian mechanics or by solving equations derived from the Newtonian laws of motion. Even though neural responses are consistent with a sequence of sensorimotor transformations, it is improbable that neural circuits explicitly solve the trigonometric and algebraic equations that define those transformations.
The cortical mechanisms for the planning and control of reaching movements are not based on the formalisms and first principles of physics, mechanics, and mathematics. They are determined by the stream of signals provided by peripheral sensors, by the force-generating properties of muscles, by the emergent dynamic mechanical properties of the arm, and by the properties of the spinal motor circuitry that converts the descending motor commands into muscle activity and movements.