Anatomy and Function of the Prefrontal Cortex
The prefrontal cortex is the region that subserves executive function, the ability to exert cognitive control over decision making and behavior. It is also important for anticipating reward and punishment and for empathy and complex emotions. This brain area has grown disproportionately large in our near primate relatives, and especially in humans, compared with other mammals (Figure 14–1). Different subregions (Figure 14–2) participate in a large number of distinct circuits that permit the integration and processing of diverse types of information. The prefrontal cortex receives not only inputs from other cortical regions, including association cortex, but also, via the thalamus, inputs from subcortical structures subserving emotion and motivation, such as the amygdala (Chapter 15) and ventral striatum (or nucleus accumbens; Chapter 16). The prefrontal cortex is also innervated by widely projecting neurotransmitter systems, including norepinephrine, dopamine, serotonin, acetylcholine, and orexin/hypocretin. These diverse inputs and back projections to both cortical and subcortical structures put the prefrontal cortex in a position to exert what is often called “top-down” control or cognitive control of behavior. Behavioral responses in humans are not rigidly dictated by sensory inputs and drives; instead, they can be guided in accordance with short- or long-term goals, prior experience, and the environmental context. The response to a delicious-looking dessert differs depending on whether a person is alone staring into his or her refrigerator, is at a formal dinner party attended by his or her punctilious boss, or has just formulated the goal of losing 10 lb. The response to a rattlesnake will differ depending on whether a person is a novice hiker or a herpetologist looking for specimens.
Adaptive responses depend on the ability to inhibit automatic or prepotent responses (eg, to ravenously eat the dessert or run from the snake) given certain social or environmental contexts or chosen goals and, in those circumstances, to select more appropriate responses. In conditions in which prepotent responses tend to dominate behavior, such as in drug addiction, where drug cues can elicit drug seeking (Chapter 16), or in attention deficit hyperactivity disorder (ADHD; described below), significant negative consequences can result.
Figure 14–1. Phylogenetic comparison of the proportion of the brain taken up by prefrontal cortex in five different mammalian species. (Adapted with permission from Kandel ER, Schwartz JH, Jessel TM. Principles of Neural Science, 3rd ed. New York: Elsevier; 1991:837.)
Figure 14–2. Regions of the prefrontal cortex involved in complex cognitive function. Left panel, lateral view showing the dorsolateral prefrontal cortex (DLPFC). Right panel, sagittal section showing the medial prefrontal cortex (MPFC), orbital frontal cortex (OFC), and anterior cingulate cortex (ACC).
Because the prefrontal cortex does not subserve primary sensory or motor functions, language production, or basic intelligence (eg, arithmetic calculation), it was once thought to be silent or nearly so. As late as the 1950s, prefrontal lobotomy was used to treat schizophrenia and other severe mental disorders and was rationalized, in part, because basic intelligence survived the surgery. However, damage to the prefrontal cortex has a significant deleterious effect on social behavior, decision making, and adaptive responding to the changing circumstances of life. An important role for regions of the prefrontal cortex was demonstrated as early as the 19th century by the case of Phineas Gage (Box 14–1).
Box 14–1 Phineas Gage
Phineas Gage was a 25-year-old construction foreman whose team was laying new track for the Rutland and Burlington Railroad when, on a September day in 1848, he became the victim of a dramatic accident. While using a tamping iron to pack blasting powder into a hole drilled in rock, he accidentally set off an explosion that sent the fine-pointed iron through his face, skull, and brain (see figure). The force of the explosion was such that the iron landed yards away. Remarkably, Gage rapidly regained consciousness and was able to talk and, with the help of his men, to walk. Just as remarkably, in that preantibiotic era, he returned to physical health and showed no signs of paralysis, impaired speech, or loss of memory or general intelligence.
Despite his recovery, Gage was a transformed man, and not for the better. Prior to the accident, he had been polite, responsible, capable, and well socialized; indeed, he had earned the role of foreman at an early age. After the accident he became unreliable and could not be trusted to keep his commitments and thus lost his job. He no longer observed social convention; it is documented, for example, that his language became quite profane. He wandered over the next several years and died in the custody of his family, never having held a responsible position again.
Gage’s physician, John Harlow, who had related Gage’s altered behavior to his brain injury, learned of Gage’s death from his family approximately 5 years after its occurrence and convinced them to exhume the body. As a result, Gage’s skull and the tamping iron, with which he had been buried, are available for study, thus permitting the reconstruction shown in the figure: left, reconstruction of path of tamping bar through the skull; right, midlevel transverse section of the brain showing damaged area of medial prefrontal cortex and preserved Broca area (yellow), motor cortex (red), Wernicke area (blue), and sensory cortex (green).
Gage’s lesion involved portions of the left orbital prefrontal cortex and portions of both left and right anterior medial prefrontal cortices. Based on Harlow’s reports of Gage’s behavior, and on what we now know of the functioning of these brain regions (see text), Gage’s lesions explain the degradation of his social behavior and his inability to guide his behavior in accordance with long-term goals. It appears that Gage’s dorsolateral prefrontal cortex was spared, thus preserving other domains of cognitive control.
Since Gage’s time, there have been many studies of humans with brain damage affecting the prefrontal cortex and, more recently, many functional imaging studies. The case of Phineas Gage illustrates the central importance of the prefrontal cortex in integrating emotion and cognition in the service of executive function.
(From Damasio H, Grabowski T, Frank R, Galaburda AM, Damasio AR. The return of Phineas Gage: clues about the brain from the skull of a famous patient. Science. 1994;264:1102–1105.)
Several subregions of the prefrontal cortex (Figure 14–2) have been implicated in partly distinct aspects of cognitive control, although these distinctions remain somewhat vaguely defined. The anterior cingulate cortex is involved in processes that require correct decision making, as seen in conflict resolution (eg, the Stroop test) or cortical inhibition (eg, stopping one task and switching to another). The medial prefrontal cortex is involved in supervisory attentional functions (eg, action–outcome rules) and behavioral flexibility (the ability to switch strategies). The dorsolateral prefrontal cortex, the last brain area to undergo myelination during development in late adolescence, is implicated in matching sensory inputs with planned motor responses. The ventromedial prefrontal cortex seems to regulate social cognition, including empathy. The orbitofrontal cortex is involved in social decision making and in representing the valuations assigned to different experiences. It is also implicated in impulsive and compulsive behaviors.
Executive function depends on working memory, a short-term, capacity-limited cognitive buffer that maintains a representation of sensory information. Working memory permits the integration and manipulation of this information to guide thought, emotion, and behavior. Working memory can be demonstrated in a range of mammals and has been studied extensively in nonhuman primates as well as humans. Findings drawn from primate research have been used to examine the mechanisms of working memory and its role in executive function. A classic experiment involved the placement of bilateral lesions in the prefrontal cortex of a chimpanzee and subsequent testing of the animal for delayed spatial responses. The chimpanzee watched as a piece of food was placed under one of several opaque containers. After a brief delay, the animal was allowed to choose a container. Unlesioned control animals uniformly chose the container with food, whereas lesioned animals made random selections. After several additional experiments were performed to determine the types of cognitive deficits involved, it became apparent that basic sensory and cognitive functions were intact in the lesioned chimps. What the chimpanzees lacked was the ability to maintain an internal representation of the food and its significance. These chimps needed ongoing sensory stimulation to track the food.
Our understanding of working memory has expanded considerably since these initial experiments. Subsequent studies, for example, have examined the electrical activity of particular neurons in prefrontal cortex in experimental paradigms that require working memory. In one such study, a monkey was conditioned to fix its eyes on a central point on a video screen (Figure 14–3). Subsequently, a box was displayed briefly in one of eight areas on the screen. After a 3- to 6-second delay, the central fixation point was removed from the screen, and the monkey was trained to shift its gaze to the area where the box previously had been displayed. This study enabled the identification of neurons in prefrontal cortex that are specific for the region of the screen where the box was displayed. Such neurons become more active during the delay phase of the task and return to baseline levels of activity when the gaze returns to the area in which the box appeared. The increased activity of neurons in the prefrontal cortex during the delay phase of a task is one signature of working memory. This activity appears to provide an internal representation of the box even when it is not visible.
Figure 14–3. Repeated recordings from a single neuron during many trials over which a rhesus monkey performed an oculomotor delayed-response working memory task. During a test session, the monkey’s ability to make correct memory-guided responses is evaluated 10 to 12 times per target location. The neuron’s response is collated over all the trials for a given target location (eg, 135°, 45°) as a histogram of the average response per unit time for that location. The activity is also shown in relation to task events (C, cue; D, delay; R, response) on a trial-by-trial basis for each target location. The particular neuron being recorded fires maximally during the delay period when the target at the 135° location disappears and the monkey is maintaining fixation. This neural activity is maintained throughout the delay period until a response is made. Activity is also seen for the 90° and 180° targets during the delay period, but less than that observed for this neuron’s best direction. Different neurons code different spatial locations, providing a spatial map in working memory. (Reproduced with permission from Funahashi S, Bruce CJ, Goldman-Rakic PS. Mnemonic coding of visual space in the monkey’s dorsolateral prefrontal cortex. J Neurophysiol. 1989;61(2):331–349.)
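The delay-period tuning described above can be illustrated with a toy model. The Gaussian tuning curve, the specific rates, and the tuning width below are illustrative assumptions, not parameters from the Funahashi study; the sketch merely shows how a neuron with a "best direction" of 135° would fire more during the delay for remembered targets near 135° and stay near baseline otherwise.

```python
import math

# Toy model (our illustration, not from the source) of a prefrontal neuron
# with a mnemonic "memory field": firing is elevated during the delay period
# of the oculomotor task and tuned to the remembered target direction.

BASELINE_HZ = 5.0      # assumed spontaneous firing rate
PEAK_DELAY_HZ = 40.0   # assumed peak delay-period rate at the best direction
TUNING_WIDTH = 45.0    # assumed tuning width, in degrees

def angular_distance(a, b):
    """Smallest angular difference between two directions, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def firing_rate(target_deg, epoch, best_deg=135.0):
    """Mean firing rate (Hz) for a remembered target and task epoch.

    epoch: 'cue', 'delay', or 'response'. Only the delay period shows
    direction-tuned elevation; after the response, activity returns to baseline.
    """
    if epoch != "delay":
        return BASELINE_HZ
    d = angular_distance(target_deg, best_deg)
    tuning = math.exp(-0.5 * (d / TUNING_WIDTH) ** 2)  # Gaussian tuning curve
    return BASELINE_HZ + (PEAK_DELAY_HZ - BASELINE_HZ) * tuning

for target in (135, 90, 180, 45):
    print(f"{target:3d}°  delay rate = {firing_rate(target, 'delay'):5.1f} Hz")
```

As in the recordings, the model neuron responds during the delay for the 90° and 180° targets, but less than for its best direction at 135°.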
The pharmacologic manipulation of working memory has been the focus of considerable investigation, in part because of the working memory deficits that characterize schizophrenia (Chapter 17) and in part because working memory may be impaired in ADHD and in several other conditions, including severe stress.
Mild dopaminergic stimulation of the prefrontal cortex enhances working memory; in contrast, higher levels of stimulation profoundly disrupt this function. Because stress is known to increase dopaminergic transmission from the ventral tegmental area to the prefrontal cortex, the actions of dopamine in this brain region may explain why low levels of stress can enhance performance in working memory tasks, whereas higher levels of stress can disrupt performance. These findings are consistent with evidence that working memory depends on an optimal level of stimulation of D1 dopamine receptors (see also Chapter 17). D1 agonists have, therefore, been studied as enhancers of working memory, although it has not yet been possible to generate such agonists devoid of troubling side effects such as nausea and vomiting.
Manipulation of the norepinephrine system also affects working memory. For example, α2-adrenergic agonists such as clonidine and guanfacine (Figure 14–4; see Chapter 6) enhance working memory, a finding that may explain the utility of these agents in the treatment of ADHD. The selective norepinephrine reuptake inhibitor (NRI) atomoxetine (Figure 14–4) is approved for the treatment of ADHD, although its effects on working memory per se have not been established. Atomoxetine does not appear to be distinct in its therapeutic properties from older tricyclic NRIs (eg, desipramine) that are approved for the treatment of depression (Chapter 15), although it does have milder side effects.
Figure 14–4. Medications used in the treatment of ADHD.
Therapeutic (relatively low) doses of psychostimulants, such as methylphenidate and amphetamine (Figure 14–4), improve performance on working memory tasks in individuals with ADHD and, at higher doses, in normal subjects. Positron emission tomography (PET) demonstrates that methylphenidate decreases regional cerebral blood flow in the dorsolateral prefrontal cortex and posterior parietal cortex while improving performance of a spatial working memory task. This suggests that cortical networks that normally process spatial working memory become more efficient in response to the drug. Both methylphenidate and amphetamines act by increasing synaptic levels of dopamine, norepinephrine, and serotonin, actions mediated via the plasma membrane transporters of these neurotransmitters and via the shared vesicular monoamine transporter (Chapter 16). Based on animal studies with micro-iontophoretic application of selective D1 dopamine receptor agonists (such as the partial agonist SKF38393 or the full agonist SKF81297) and antagonists (such as SCH23390), and on clinical evidence in humans with ADHD, it is now believed that dopamine and norepinephrine, but not serotonin, produce the beneficial effects of stimulants on working memory. At abused (relatively high) doses, stimulants can interfere with working memory and cognitive control, as will be discussed below. It is important to recognize, however, that stimulants act not only on working memory function but also on general levels of arousal and, within the nucleus accumbens, enhance the salience of tasks. Thus, stimulants improve performance on effortful but tedious tasks, probably acting at different sites in the brain through indirect stimulation of dopamine and norepinephrine receptors.
Although the animal studies described above involve very simple mental representations, working memory is at the heart of many complex mental tasks in higher mammals. When viewed as a rudimentary form of abstraction, working memory can be seen as crossing from the realm of the brain into that of the mind, where internal representations of the external world are consistent and reliable. The ability to think in abstract terms presumably allows humans to create a sense of identity, to establish goals, and to plan for the future.
An animal is bombarded at all times by a mass of sensory stimulation; it must therefore have mechanisms to select the information that is most relevant to its particular situation and ultimately to its survival. Selection of relevant information is the role of attention. Once attended to, information gains access to working memory and can thereby be used to plan appropriate responses. In humans, responses may be simple, but they may also involve complex trains of cognition, the activation of emotional circuits, and the production of elaborate behaviors. It must be emphasized that much important sensory information is processed by the brain unconsciously and need not be made conscious to elicit significant responses. For example, subliminal processing of fearful faces can activate physiologic aspects of the fear response in humans without ever reaching consciousness (Chapter 15). However, information that is processed consciously is, at some point, attended to and entered into working memory.
Attention is not a simple unitary function. It may be commanded by “bottom-up” sensory information such as a stabbing pain or the sudden appearance of a loud noise or bright flashing light. The concept of attention also includes effortful “top-down” processing involving the prefrontal cortical circuits that connect with other specialized brain regions. For instance, we can purposefully allocate attention (eg, to a particular spatial location, which involves parietal cortex), we can pay selective attention to specific features of the world (eg, the color of an object, which involves the inferior temporal cortex), and we can divide our attention, suppress distractions, and concentrate (ie, sustain attention over time).
In one model, four basic cognitive processes are considered to be the building blocks of attention (Figure 14–5): (1) Working memory, as mentioned earlier, is required to exert top-down control over those sensory representations that will be attended to and for effortful direction of “the searchlight” of attention. Working memory acts by guiding orientation, such as the direction of the body, the head, or the eyes, and also produces signals that influence the sensitivity of neural circuits that represent information. (2) The allocation of attention, for example, to a point in space, can be shown by functional neuroimaging in humans and by physiologic recordings in monkeys to alter activity in relevant brain regions or relevant neurons. Thus, working memory can exert sensitivity control over neural representations and thereby influence the selection of representations that are attended to. Patients with ADHD have difficulty sustaining attention or ignoring distractions, suggesting problems with sensitivity control. (3) Diverse sensory information is subjected to salience filters so that irrelevant information does not gain access to working memory. Patients with psychotic disorders, including schizophrenia, attend to irrelevant and to hallucinatory stimuli, suggesting a failure of filtering processes. Salience is determined, for example, by innate or learned predictors of threat or reward or by the appearance of rare or high-intensity stimuli. (4) Sensory representations that pass salience filters are subjected to competitive processes that select the strongest signals for access to working memory.
Figure 14–5. Cognitive building blocks of attention. These cognitive modules can have distributed implementations in the brain. The processes that contribute to attention are in red type. Bottom-up processing occurs when sensory information is permitted to pass by salience filters tuned to innate or learned survival-relevant stimuli and to novel or highly salient stimuli. The neural representations of these stimuli are processed in circuits relevant to the type of information they contain (eg, different sensory modalities, interoceptive information). These neural representations then enter a competitive process that selects the one with the highest signal strength for access to working memory. Working memory, in turn, directs gaze and other orienting behaviors and generates signals that modulate the sensitivity of representations competing for access to working memory. (Adapted with permission from Knudsen EI. Fundamental components of attention. Annu Rev Neurosci. 2007;30:57–78.)
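The four building blocks above can be sketched as a minimal pipeline. This is our toy illustration of the Knudsen-style scheme, not a model from the source: the threshold, capacity value, and stimulus names are all hypothetical, and top-down sensitivity control is reduced to a simple multiplicative gain supplied by working memory.

```python
# Toy sketch of attention as salience filtering, top-down sensitivity control,
# and competitive selection for access to working memory. All parameters and
# stimulus names are illustrative assumptions.

SALIENCE_THRESHOLD = 0.5   # assumed cutoff for the salience filter

def attend(stimuli, top_down_bias=None):
    """Select the representation that gains access to working memory.

    stimuli: dict mapping stimulus name -> bottom-up signal strength in [0, 1].
    top_down_bias: dict mapping stimulus name -> multiplicative gain supplied
    by working memory (sensitivity control). Returns the winning stimulus name,
    or None if nothing passes the salience filter.
    """
    bias = top_down_bias or {}
    # (3) Salience filter: weak, irrelevant signals never reach competition.
    salient = {s: v for s, v in stimuli.items() if v >= SALIENCE_THRESHOLD}
    if not salient:
        return None
    # (1)+(2) Working memory biases the effective strength of representations.
    biased = {s: v * bias.get(s, 1.0) for s, v in salient.items()}
    # (4) Competitive selection: the strongest signal wins access.
    return max(biased, key=biased.get)

stimuli = {"loud noise": 0.9, "relevant word": 0.6, "hum of fan": 0.2}
print(attend(stimuli))                                        # → loud noise
print(attend(stimuli, top_down_bias={"relevant word": 2.0}))  # → relevant word
```

Without top-down bias, the loud noise captures attention bottom-up; with a working-memory gain on the task-relevant stimulus, top-down control wins the competition, while the fan hum never passes the salience filter at all.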
The monoamine neurotransmitters and orexin have a critical permissive role in attention by regulating arousal (Chapters 6 and 13). Performance on cognitive tasks has different optimal levels of arousal depending on the degree of effort required. This is captured in simplistic form by the Yerkes–Dodson principle (Figure 14–6), which expresses the relationship of arousal to performance for specific tasks as an inverted U-shaped curve. Such a curve also captures the effects of pharmacologic agents that influence arousal, ranging from psychostimulants and caffeine (Chapter 13), which increase arousal, to β-adrenergic antagonists and benzodiazepines, which might decrease maladaptive arousal in an extremely anxious person. Beyond these general permissive effects, dopamine (acting primarily via D1 receptors) and norepinephrine (acting at several receptors) can, at optimal levels, enhance working memory and aspects of attention. Drugs used for this purpose include, as stated above, methylphenidate, amphetamines, atomoxetine, and desipramine. Modafinil is effective in improving both arousal and attention; it is believed to act (like methylphenidate) by inhibiting the dopamine transporter, although this is not established with certainty (Chapter 13).
Figure 14–6. Yerkes–Dodson principle. This principle (dating from 1908) captures the inverted U-shaped relationship between arousal and performance. Performance on diverse cognitive tasks improves with arousal, but only up to a point; when arousal becomes too great, performance declines. Shown here are two Yerkes–Dodson curves illustrating that lower levels of arousal are optimal for hard tasks (eg, tasks that demand greater cognitive resources) and higher levels for easy tasks or tasks that require greater persistence.
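The two inverted-U curves can be made concrete with a small numerical sketch. The Gaussian shape and the specific optima below are illustrative assumptions (the Yerkes–Dodson principle is qualitative, not a specific equation); the point is only that each task has an optimum, and that the optimum sits at lower arousal for hard tasks than for easy ones.

```python
import math

# Toy Yerkes-Dodson curves: performance as an inverted U of arousal.
# The Gaussian form and the optima (0.35 for hard tasks, 0.65 for easy
# tasks) are illustrative assumptions, not values from the source.

def performance(arousal, optimum, width=0.25):
    """Performance in (0, 1] as an inverted-U function of arousal (0-1 scale)."""
    return math.exp(-0.5 * ((arousal - optimum) / width) ** 2)

def hard_task(arousal):
    """Hard tasks (greater cognitive demand): optimum at lower arousal."""
    return performance(arousal, optimum=0.35)

def easy_task(arousal):
    """Easy or persistence-demanding tasks: optimum at higher arousal."""
    return performance(arousal, optimum=0.65)

for a in (0.2, 0.35, 0.5, 0.65, 0.8):
    print(f"arousal={a:.2f}  hard={hard_task(a):.2f}  easy={easy_task(a):.2f}")
```

Scanning the printed table shows each curve rising to its own peak and then falling, so a level of arousal (or a dose of an arousal-modifying drug) that is optimal for an easy, tedious task already sits on the descending limb for a hard one.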
Attention deficit hyperactivity disorder
ADHD is characterized by symptoms in three dimensions of behavior: inattention, impulsiveness, and hyperactivity. Hyperactivity may be absent, in which case the term attention deficit disorder (ADD) may be used. These dimensions are continuous with normality, but, when severe, ADHD produces significant impairment. ADHD can be a profound obstacle to success in school or work, despite normal intelligence. It also increases the risk for substance abuse and accidents, as well as for comorbid depression, anxiety disorders, and conduct disorders.
ADHD begins in childhood, often very early, but is typically diagnosed when children enter school. While symptoms often remit during teen years, a substantial fraction of individuals with ADHD remain symptomatic in adulthood. Based on current diagnostic criteria, worldwide prevalence ranges between 3% and 5%, but widely divergent diagnostic practices in different regions of the world produce varying local estimates. Indeed, the diagnosis and treatment of ADHD varies widely in the United States, with far higher rates in affluent, suburban communities compared with those in poorer, inner city areas.
ADHD is highly heritable, with genetic factors accounting for ~75% of the risk. However, like all major psychiatric disorders, including depression, bipolar disorder, and schizophrenia, ADHD is genetically complex, making the identification of risk alleles very challenging. Elucidating the genetic factors that contribute to ADHD risk awaits large-scale genome-wide studies.
ADHD can be conceptualized as a disorder of executive function; specifically, ADHD is characterized by reduced ability to exert and maintain cognitive control of behavior. Compared with healthy individuals, those with ADHD have diminished ability to suppress inappropriate prepotent responses to stimuli (impaired response inhibition) and diminished ability to inhibit responses to irrelevant stimuli (impaired interference suppression). Such deficits have been documented as well in functional MRI studies.
The ability to suppress prepotent responses is thought to require the action of frontal-striatal-thalamic circuits (Figure 14–7). A series of parallel loops connect the prefrontal cortex with specific regions of the basal ganglia and, via the thalamus, project back to prefrontal cortex. These loops are thought to be involved in the initiation and control of motor behavior, attention, cognition, and reward responses. Functional neuroimaging in humans demonstrates activation of the prefrontal cortex and caudate nucleus (part of the dorsal striatum) in tasks that demand inhibitory control of behavior. Subjects with ADHD exhibit less activation of the medial prefrontal cortex than healthy controls, even when they succeed in such tasks, and they appear to utilize different circuits.
Figure 14–7. Cortical-striatal-thalamic loops. The basal ganglia are a set of related subcortical nuclei consisting of the striatum (composed of the caudate nucleus, the putamen, and the ventral striatum), the globus pallidus, the substantia nigra (composed of the dopaminergic pars compacta and the pars reticulata), and the subthalamic nucleus. The striatum receives major inputs from the cerebral cortex, thalamus, and brainstem (including dopamine from the substantia nigra) and projects via the globus pallidus and substantia nigra pars reticulata to the thalamus and thence back to the prefrontal cortex. As shown, this arrangement gives rise to a series of parallel loops. Different sets of cortical-striatal-thalamic loops have specialized functions depending on the cortical areas that give rise to them and that receive their output. A motor loop that passes through the putamen may be involved in Tourette syndrome, while a cognitive loop through the caudate nucleus may be involved in the obsessions that characterize obsessive–compulsive disorder. See Figure 18–10 for more detail on striatal circuits.
Early results with structural MRI show a thinner cerebral cortex, across much of the cerebrum, in ADHD subjects compared with age-matched controls, including areas of prefrontal cortex involved in working memory and attention. Longitudinal studies comparing ADHD and control populations find that cortical thickness tends to normalize during adolescence and young adulthood at least in some patients. These results, while still preliminary, suggest that ADHD might involve an abnormal rate of cortical maturation rather than a unique pattern of cortical development. It is hypothesized that residual symptoms in adulthood might result from incomplete normalization, but at this point the data are not available. Volumetric differences have also been reported for several subcortical structures.
As suggested by the previous discussions of the pharmacology of working memory and attention, ADHD can be treated with low doses of psychostimulants, such as methylphenidate and amphetamine, which remain by far the most effective treatments available. The most widely used treatments are sustained-release preparations of methylphenidate that compensate for its short half-life, or mixtures of amphetamine derivatives with different half-lives to provide both early and extended treatment during the day. The main side effects of stimulants are appetite suppression, growth delay, and insomnia. Atomoxetine or other NRI antidepressants such as desipramine represent an alternative for patients who do not tolerate stimulants, although these drugs are not as effective as stimulants in most cases. The α2-adrenergic agonists, clonidine and guanfacine, show beneficial effects in a subset of patients, but their use is not well supported by clinical trial data. Children and adults with ADHD also benefit from behavioral treatments aimed at enhancing top-down control of behavior.
Stimulant prescriptions have engendered controversy because these drugs are potentially abusable (when the drugs are used at higher than therapeutic doses), because they are used even in young children (as early as age 3), and because they may be used with benefit by individuals with only mild or even absent symptoms, in which case they are being used for cognitive enhancement rather than treatment. Controversy notwithstanding, there are extensive clinical trial data that demonstrate that stimulants effectively reduce ADHD symptoms, albeit with little data bearing on long-term academic and employment outcomes. Several longitudinal studies suggest that untreated ADHD is associated with elevated risk of substance abuse and conduct disorders, as stated earlier, and that stimulant treatment decreases that risk. Two factors may be at play. First, supervised use of stimulants at therapeutic doses may decrease risk of experimentation with drugs to self-medicate symptoms. Second, untreated ADHD may lead to school failure, peer rejection, and subsequent association with deviant peer groups that encourage drug misuse.
Obsessive–Compulsive Disorder and Spectrum Disorders
Obsessive–compulsive disorder (OCD) is characterized by obsessions (intrusive, unwanted thoughts) and compulsions (highly ritualized behaviors intended to neutralize the anxiety and negative thoughts resulting from the obsessions). Individuals with OCD experience the obsessions as unwanted and even nonsensical, but powerfully insistent. Attempts to resist the performance of compulsions result in high levels of anxiety. Typical symptom patterns are remarkably similar across cultures, such as repetitive hand washing, sometimes for hours a day to the point of skin damage, to neutralize fears of contamination, or repeatedly checking the front door because the affected individual cannot be sure the door is locked. OCD often begins in childhood or teen years, although later onset can occur. It is estimated to have a heritability of 40% to 50%, although specific genes that contribute to this risk have not yet been identified.
There is a high degree of comorbidity between OCD and Tourette syndrome and related tic disorders: about 20% of OCD patients have a chronic tic disorder, while about 50% of Tourette patients have or will develop OCD. OCD is also often accompanied by depression or by body dysmorphic disorder (relentless obsessions about one’s supposed bodily deformities). Tourette syndrome and body dysmorphic disorder are thought to be related etiologically or pathophysiologically to OCD and are sometimes described as part of an OCD spectrum.
Tourette syndrome is characterized by motor and phonic tics that wax and wane in severity over time. Tics are habitual movements or vocalizations that appear suddenly and that mimic fragments of normal behaviors. They usually begin during childhood, and may start as early as age 3. Patients with Tourette syndrome may also have OCD, ADHD, and other behavioral disorders characterized by poor impulse control. While stress can exacerbate the symptoms of Tourette syndrome, as seen for most neurologic and psychiatric disorders, there is no evidence that stress per se contributes to the pathogenesis of the illness. Indeed, Tourette syndrome is estimated to be highly heritable (~60%), with no bona fide risk genes yet identified.
Frontal-striatal-thalamic circuits are thought to be involved in implicit learning that consolidates fragments of motor behavior, speech, or thought into smooth but flexible routines, for example, the ability to drive one’s car home without effortful attention, or the ability of an actor to deliver his or her lines without thinking. These well-learned, assembled routines are often described as habits. The striatum and other nuclei of the basal ganglia are thought to recode both cortical (eg, cognitive and motor) and subcortical (eg, emotional and motivational) inputs into sequences, sometimes called “chunks,” that permit appropriate cues to release automatic responses.
Structural MRI studies of OCD patients show reduced gray matter volume in the medial frontal gyrus, medial orbitofrontal cortex, and other regions. Functional imaging studies of OCD have reported increased activity in components of cortical-striatal-thalamic circuitry including the orbitofrontal cortex and the caudate nucleus. Similarly, in Tourette syndrome, increased activity has been observed in other regions of the prefrontal cortex as well as in the caudate. These and similar observations, combined with increasing basic knowledge of the functioning of basal ganglia circuits (see below), have led to the hypothesis that OCD and Tourette syndrome result from abnormal activity in different cortical-striatal-thalamic loops inappropriately releasing unwanted intrusive thoughts (OCD) or fragments of motor behavior (Tourette). Consistent with this hypothesis, genetic deletion of certain key synaptic proteins (eg, SLITRK5 and SAPAP3) in the striatum of mice elicits dramatic excessive grooming, which at least phenocopies symptoms of OCD. Whether such mechanisms contribute to OCD in humans remains unknown.
Despite the hypothesized similarities in pathophysiology and the frequent co-occurrence of OCD in Tourette patients, the pharmacologic treatments differ. OCD is treated with selective serotonin reuptake inhibitors (SSRIs) such as fluoxetine or sertraline, or with tricyclic antidepressants that have relative selectivity for the serotonin transporter such as clomipramine (Chapter 15). OCD is also responsive to cognitive behavioral therapies focused on stopping unwanted thoughts and rituals, presumably acting to facilitate top-down cognitive control. The doses of SSRIs required for OCD are usually higher than those for depression, and the onset of therapeutic benefit is often even slower (up to 10 weeks). SSRIs are also effective in the treatment of body dysmorphic disorder. Deep brain stimulation, targeting the anterior limb of the internal capsule (which includes the nucleus accumbens), is also approved for the treatment of severe OCD.
In contrast to OCD, Tourette syndrome is most often treated with both first- and second-generation D2 dopamine receptor antagonist antipsychotic drugs at lower doses than those used to treat schizophrenia. α2-Adrenergic agonists (eg, clonidine and guanfacine) are also effective in many patients. These drugs are not efficacious in treating OCD. Such differences in treatment response suggest that ascending serotonin and dopamine projections may modulate parallel but separate cortical-striatal-thalamic loops quite differently.