Synaptic Plasticity I: Facilitation and Depression


Introduction

  • In the last two units, we have focused our attention on synaptic transmission, with a particular focus on chemical synaptic transmission.
  • The rationale for doing this, as was mentioned earlier, is that changes in the strength of chemical synapses form one important locus of plasticity, i.e., the ability of the nervous system to manifest long-term changes as a function of experience, which forms an important basis for learning and memory.
  • In this unit, we will begin by discussing different forms of learning and memory. We will then discuss some of the neuronal mechanisms that may underlie learning and memory. You will then do virtual experiments on a model plastic synapse that utilizes one of these neuronal mechanisms.

Forms of Learning and Memory

  • The rationale for attending classes is that, ideally, the experience will lead to long-term changes, so that if you are asked to do something that shows mastery of the material you learned in class, you will do a much better job after taking the class than you did before. In other words, the assumption is that you will learn something as you take the class, and then demonstrate afterwards that you have formed memories of the material that was presented.
  • We take this capability for granted, though we often do not exercise it to its fullest extent. Indeed, we may have a better memory for television trivia, tricks and techniques for video games that we like to play, names and food preferences of close friends, and obscure sports statistics, than we have for any material presented in any of the classes that we take. This illustrates that learning is intimately connected with our motivational state, and with our use of the material we've learned. If we regard a new piece of information as very important, we are far more likely to remember it later. In contrast, if we find the new piece of information no more or less interesting than anything else that is happening around us at the time, we may be unable to remember it even relatively shortly after it was presented to us. Similarly, if we are very sleep deprived, our ability to assimilate new information is greatly reduced. Of course, students often impose such a state on themselves, especially prior to midterms and finals, and so it is no wonder that they may not perform as well as they would like!
  • We tend to think of memory and learning as primarily about verbal and symbolic material: for example, dates, names, equations, or ideas. However, a large part of our brain is devoted to other memories that are equally important, such as our ability to walk (which we learned as young children), to ride a bicycle or drive a car, to play piano or tennis, and many other complex motor skills that we can do "without thinking" (though much of our nervous system is very intensely engaged as we perform these skilled behaviors).
  • Although we tend to think of all the different forms of learning and memory as essentially the same, clinical studies have shown that lesions of specific brain regions can cause marked deficits in one kind of memory, while sparing other kinds completely. This has provided a neurological basis for classifying memory and learning.
  • For example, a famous patient (H.M.) had a bilateral medial temporal lobectomy, and although this surgery was reasonably successful in reducing his epileptic seizures, he was essentially unable to form new verbal memories. In contrast, he was able to master complex motor skills, although he had no recollection of ever having been trained to do them previously, and was surprised by his skill at them whenever the tasks were again presented to him.
  • These and other clinical studies led to the division of memory into two major areas: the first is referred to as explicit or declarative memory, which consists of semantic memory (memory for facts, such as the dates of the American Revolution), and episodic memory (memory for specific events in an individual's life, such as the time and date of their first kiss).
  • The second kind of memory is referred to as implicit or nondeclarative memory. Examples of implicit memory are priming, in which seeing a partial set of stimuli (such as part of a picture, or the first few letters of a word) increases the likelihood that a subject will correctly guess the rest of the image or the word; and procedural learning, i.e., learning of skills or habits (such as walking, biking, or biting your fingernails).
  • Two forms of implicit learning that have been the focus of a great deal of attention, because they can be studied both in humans and in animals, are associative and nonassociative learning.
  • Associative learning is usually classified further using the different ways in which a sensory stimulus and a motor output can be linked to one another: classical conditioning and operant conditioning.
  • In classical conditioning, an initially neutral stimulus is paired repeatedly with a salient stimulus that, on its own, can evoke a particular response. Over time, the previously neutral stimulus comes to evoke this response as well.
    • To make this concrete, consider the original example of classical conditioning. Pavlov was studying gastric secretion in dogs. He had set up cannulas so he could measure the secretion of saliva in response to food. To his surprise, he found that the dogs' salivation increased when they saw him come into the lab, and he realized that his arrival was associated with feeding. Rather than figuring out a way to enter so the dogs wouldn't sense him, he decided to study this phenomenon.
    • The sight of food (the unconditioned stimulus or UCS) would evoke increased salivation on the part of the dogs (the unconditioned response, or UCR). Now, if he paired the sound of a bell (the conditioned stimulus, or CS) with the sight of food, after repeated pairings, ringing the bell alone would induce salivation (the conditioned response, or CR).
  • In operant conditioning, an animal's behavior is paired with a reward or a punishment, and this alters the frequency with which the animal performs that behavior.
    • For example, suppose a cat is placed in a wire frame cage whose door is operated by a latch, and on the other side of the cage (opposite the door) there is a dish outside the cage filled with delicious food. The cat will initially try to scratch its way through the side of the cage closest to the food. Once it finds that this is futile, it will explore the remainder of the cage, and it may bat at the latch. If doing this a few times causes the cage door to open, the cat will be out of the cage and at the food in a flash. If the cat is now placed back in the cage with the door closed, it will take almost no time to work the latch and get to the food.
    • These behaviors were studied extensively by B. F. Skinner, and others, using cages that had levers that (for example) a rat could press to obtain food or avoid a shock to its feet. If the reward occurred after a certain number of bar presses (fixed-ratio reinforcement), or after a fixed amount of time (fixed-interval reinforcement), the pattern of bar presses changed to reflect the reward contingencies.
  • Nonassociative learning involves long-term changes in behavior as a consequence of experience, but does not involve the association of a particular stimulus with a particular response. Examples of nonassociative learning are habituation and sensitization.
    • A stimulus that is continuously present but does not change, and is not relevant to the immediate needs of the organism, will be ignored. This process is referred to as habituation. To give three familiar examples:
      • (1) You are wearing clothing. If it is not uncomfortable, you cease to be aware of it unless this is pointed out, and then you notice that sensory input is coming from the entire surface of your body, input which you had been filtering out;
      • (2) Your tongue is in your mouth. Once your attention is drawn to it, you definitely sense all of the things that it is sensing; but unless these change, you will again filter this information out;
      • (3) Most rooms have heating/cooling/ventilation systems that generate a steady background noise. As long as this does not change, you can filter this sound out.
    • A stimulus that is painful or startling may increase the response of an organism to subsequent stimuli. This is known as sensitization.
      • (1) If you hear the sound of a gunshot, then you will be extremely alert to any other sound immediately afterward, such as the buzzing of a fly, even if it is unrelated to the original stimulus.
      • (2) If someone gives you a painful pinch, then an unexpected light touch (from someone else, and on a different part of your body) may cause a much stronger withdrawal response than you would ordinarily have given to that light touch.
  • Another important distinction that is often made is between short-term and long-term memory.
  • Short-term memory or working memory allows us to hold a relatively small number of items (5 to 7 for most people) in mind for a relatively short period of time. Before the advent of smartphones, the phone company made every effort to keep phone numbers to seven digits, because longer numbers could not be retained in memory during the time it took to enter them into the phone (in the old days, by dialing; now, by tapping them on a keypad).
  • Long-term memory allows us to remember material for days, weeks or years. The name of our best friend (or worst enemy) from grade school, or the address of our home at the age of ten, is likely to stay with us for many, many years.
  • Transferring memories from short-term memory to long-term memory is not easy, as anyone who has studied for an exam will attest.
  • Studies have shown that being re-exposed to training materials over multiple trials separated by breaks (including time for sleep), known as spaced trials, leads to far better retention of material, even if there are distractions between trials. In contrast, applying all the learning trials in one large training sequence (massed trials) is less effective.
  • Thus, we have made an attempt to repeat materials from earlier units in later units, and then build upon this material. We have also made an effort to provide you with multiple ways of assimilating and reinforcing the material: by re-reading it multiple times on NeuroWiki, by discussing it with your teammates and table mates in the context of working through problems, by discussing it with your instructors, and by writing it down yourselves in your answers to the questions. We hope that some of this material will last in your long-term memories beyond the semester!
  • The other aspect of learning we are incorporating into the class is that we are encouraging you to develop a dynamic understanding of the processes that occur within the nervous system, rather than memorizing words that simply describe a static process. Thus, many high school students know that during the action potential, sodium ions move in and potassium ions move out. If you've mastered the material so far, you have an understanding of the electrochemical forces that induce the ions to move; and rather than just knowing words like "threshold", "rising phase", "falling phase", "afterhyperpolarization", and "refractory period", you have an understanding of the underlying biophysical processes (the changing configurations of the m, n and h gates), and how they vary over time.
  • Creating an internalized "model" of a process is one of the best ways to really learn it; in addition, combining this with an actual quantitative model and with experimental manipulations is the best way to determine both the strengths and the weaknesses of your internal model. If you learn some of these skills in this class, they will serve you in good stead for the rest of your career.

Behavioral versus Cellular Aspects of Memory

  • Although many aspects of the nervous system can give rise to changes that last from milliseconds to many years, it is important to distinguish between behavioral observations and neural mechanisms of plasticity, and not to assume that a behavioral change implies a specific change in the properties of a circuit or of individual neurons.
  • The reason is that the underlying mechanisms for a behavioral change may involve many neuronal circuits and many different cellular properties within those circuits.
  • It is also possible that identical cellular mechanisms could contribute to very different changes at the behavioral level.
  • Finally, although it is often implicitly assumed that plastic mechanisms are somehow distinct from other parts of the nervous system, this may simply not be true, and learning and memory may simply reflect slower dynamical processes throughout the nervous system.
  • A major unsolved problem in neuroscience, to which some of you might contribute, is the relationship between local changes in plasticity at the level of a single synapse, and global changes in the overall dynamics of the nervous system that lead to the behavioral changes we have described. Perhaps the system in which the most progress has been made so far has been in the nematode Caenorhabditis elegans (C. elegans). Recent work on this system is reviewed here.

Cellular and Circuit Mechanisms of Plasticity

  • Many of the properties of the nervous system that you have (in theory!) already learned about in earlier units could serve as the basis for different forms of memory.
  • Properties of the passive membrane: the capacitance and the resistance (conductance) of the passive membrane define the time constant of that region of the nerve cell, and this determines the time it takes for an input signal to decay away. Thus, the rate at which inputs arrive at the passive membrane, relative to the time it takes for signals to decay away, determines whether the membrane will act as if nothing has happened previously, or whether it responds differently to the new inputs (and thus shows a very simple form of memory; a small numerical sketch of this appears at the end of this list).
  • The action potential: Because the sodium inactivation gates take a long time to return to their initial values, inputs to a neuron shortly after an action potential will have a different effect than they would if the neuron had not fired for a long time previously. This also constitutes a form of memory, and can last many hundreds of milliseconds.
  • Other conductances: As we saw, other conductances also constitute a form of memory. We explored how multiple conductances could be combined to generate bursting behavior, which can maintain a memory of inputs that interrupt bursts for an extended period of time (i.e., by changing the phase of the burst in response to a perturbation). In some cells, depending on the input to the cell, the cell can be put into a quiescent, non-firing state, or, with a different input, be switched into a bursting state, and these two different forms of behavior constitute another kind of long-term memory. Switching a system between two qualitatively different output states, in each of which it can remain indefinitely, is called bistability. Note that this form of memory could persist for very long periods of time.
  • Chemical networks: Within nerve cells, other potential forms of memory are changes in chemical compounds and networks. We have already seen that a buildup of calcium ions within a nerve cell can serve as the basis for longer-term changes in the cell's behavior (for example, by activating calcium-dependent potassium channels). There are many second-messenger networks within nerve cells that can also serve to form a longer-term memory, because they may alter a nerve cell's excitability.
  • Synaptic changes: Many investigators naturally assume that when one describes plasticity in the nervous system, one is describing changes at chemical synapses. These are clearly important, but as this section illustrates by listing many other possibilities, they are only one of a variety of potential changes that may maintain memory. Some of the synaptic changes that have been studied are:
    • Synaptic depression: When some neurons are repeatedly activated, their synaptic connection may weaken. Recent investigations have suggested several mechanisms for synaptic depression:
      • Reduction in the size of the readily releasable pool of vesicles (i.e., those docked at release sites on the presynaptic membrane);
      • Inactivation of release sites, which may be due to the time needed to recycle the vesicular membrane that fused with the presynaptic terminal when a vesicle released its contents;
      • Calcium-dependent calcium channel inactivation: Influx of calcium can be sensed by calcium-binding proteins (e.g., calmodulin) that then act to inhibit the voltage-dependent calcium channels, thus reducing further calcium influx and suppressing transmitter release.
    • Synaptic facilitation: When some neurons are repeatedly activated, their synaptic connection may become stronger, and this effect may last for tens to hundreds of milliseconds. Recent investigations have suggested several mechanisms for synaptic facilitation:
      • Increased residual calcium after each spike, which could increase release probability;
      • The influx of calcium saturates the molecules that bind and buffer calcium. Calcium buffer saturation in turn leads to increased levels of free calcium in the presynaptic terminal, and thus increased release;
      • Calcium-dependent calcium channel activation: Binding of calcium to proteins could lead to enhancement of the activity of voltage-dependent calcium channels, and thus an increase in calcium influx;
      • Increasing action potential duration in the presynaptic terminal could lead to a larger calcium influx, and thus a larger release of transmitter. This could be due to alterations in some of the other conductances that affect spike duration (such as the fast potassium current).
    • Note that facilitation and depression are distinct from the sign of the synapse: you can have a facilitating inhibitory synapse or a depressing excitatory synapse. (A simple quantitative sketch that combines facilitation and depression appears at the end of this list.)
    • Augmentation and post-tetanic potentiation: After repeated stimulation, a synaptic connection may be enhanced for a short period of time (augmentation, 5 - 10 seconds), or, if the presynaptic neuron has been subjected to a long and intense burst of activity (a tetanus), for a much longer period of time (post-tetanic potentiation, minutes to hours). Recent investigations suggest several mechanisms may contribute to these forms of plasticity:
      • Residual calcium: The time course of decay of residual calcium in some synapses parallels the enhancement of release;
      • Increased quantal size: In the calyx of Held, increased calcium influx induces vesicles to fuse with one another, and as a consequence, the actual miniature postsynaptic potential size increases;
      • Changes in the readily releasable pool: Some vesicles that were located farther from the release sites become releasable by action potentials, thus effectively increasing the readily releasable pool.
        • A recent review of short-term synaptic changes is here.
    • Spike-timing dependent plasticity: If a presynaptic neuron fires and rapidly induces the postsynaptic neuron to fire, the connection between the two neurons is enhanced for a long period of time. If the postsynaptic neuron fires just before the presynaptic neuron, the connection is depressed. This observation confirms a hypothesis proposed by D. O. Hebb in the middle of the twentieth century, often summarized more colorfully as "neurons that fire together wire together." Initially, many of these phenomena were observed by imposing tetanic stimuli on the presynaptic neuron; more recently, they have been observed using much more physiological patterns of activation, and they depend critically on positive or negative correlations between activity in the presynaptic and postsynaptic neurons. (A minimal sketch of a spike-timing dependent learning rule appears at the end of this list.)
      • Positive correlations that enhance synaptic transmission are referred to as long-term potentiation (LTP).
      • Negative correlations that reduce synaptic transmission are referred to as long-term depression (LTD).
      • These effects act at both the presynaptic and postsynaptic terminals, and may also have long-lasting effects on the excitability of the neurons. A recent review of this extensive literature is here.
    • Homeostatic synaptic plasticity: Each of the synaptic mechanisms described so far leads to a potential problem: they could create a positive feedback loop that ends up saturating the synapse at its maximum possible value, potentially creating pathologically excited circuitry, or conversely could push the synapse down to a minimum value at which it effectively no longer functions. Under some circumstances, this could be an appropriate outcome. However, it would imply that synapses are essentially "used up" as a consequence of plasticity. Alternatively, mechanisms could operate that restore synaptic strengths over time to their previous values, allowing them to be used again and again for plastic changes, or that scale down synaptic strengths, or that weaken some synapses to allow others to be stronger. This process is referred to as homeostatic synaptic plasticity, and has recently become an area of intensive investigation. A variety of molecular mechanisms could allow synapses to maintain their relative efficacy even if the circuits of which they are a part become more or less active. A recent review of studies of this important phenomenon is here. (A toy sketch of multiplicative synaptic scaling appears at the end of this list.)
  • Morphological changes: As we learned in the units on cable properties, the shapes of neurons affect how they transform inputs in time and space. Since neurons are living cells, their processes and connections can change over time as a function of their experience, so that their connections and shape can serve as a form of long-term memory.
  • Reverberating circuits: Although we have not dealt with neuronal circuits larger than two neurons, if neurons excite one another in a closed cycle, or even inhibit one another in a closed cycle under conditions in which the inhibition can wear off and the inhibited neuron can begin to fire (as we saw, conductances like I_h and low-threshold calcium channels make this kind of behavior possible), it is possible to create reverberating circuits that maintain a memory through their ongoing pattern of activity. This form of memory could also persist for long periods of time. (A toy reverberating loop is sketched at the end of this list.)
  • Combinations: It is unlikely that any of the changes we have described occur in isolation from one another. If they are properly coordinated, then (for example) the changes in synaptic weights and morphologies can alter how sensory inputs are processed, and make certain dynamic patterns of activity (like a reverberating circuit) more or less likely to occur in response to an input.
  • An excellent overview of the development of our understanding of synaptic plasticity over the last forty years, authored by one of the pioneers in this area (Eric Kandel) can be found here.
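
As a companion to the bullet on passive membrane properties above, here is a minimal numerical sketch of how the membrane time constant sets the duration of this simplest form of "memory." The resistance and capacitance values are arbitrary illustrations chosen for round numbers; they are not parameters of the simulation used in this unit.

```python
import numpy as np

# Illustrative values (not taken from the simulation): 100 megohm input
# resistance and 100 pF capacitance give a membrane time constant of 10 ms.
R_m = 100e6        # ohms
C_m = 100e-12      # farads
tau = R_m * C_m    # seconds; tau = R * C = 10 ms here

# A brief input deflects the membrane potential by V0; it then decays
# exponentially back toward rest with time constant tau.
V0 = 10e-3                       # 10 mV initial deflection
t = np.arange(0.0, 0.05, 1e-4)   # 50 ms of time at 0.1 ms resolution
V = V0 * np.exp(-t / tau)

# How much of the first input is still "remembered" when a second input
# arrives 5 ms later?
print(f"tau = {tau * 1e3:.1f} ms; V remaining after 5 ms = "
      f"{V0 * np.exp(-0.005 / tau) * 1e3:.2f} mV")
```

If a second input arrives well within one time constant of the first, the two responses summate; if it arrives many time constants later, the membrane has effectively "forgotten" the earlier input.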
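
The depression and facilitation mechanisms listed above (depletion of the readily releasable pool, residual calcium, and so on) are often summarized by a phenomenological model in which each presynaptic spike releases a fraction u of an available resource x: depletion of x produces depression, while a spike-triggered, calcium-like boost of u produces facilitation. The sketch below follows the general form of the Tsodyks-Markram model; the parameter values are illustrative and are not drawn from the simulation in this unit.

```python
import numpy as np

def short_term_synapse(spike_times, U=0.2, tau_rec=0.5, tau_facil=0.1):
    """Phenomenological model of facilitation and depression.

    x: fraction of the readily releasable pool still available (depression).
    u: release probability, boosted by each spike and decaying between
       spikes (facilitation; a stand-in for residual presynaptic calcium).
    Returns the relative amplitude of the response to each spike."""
    x, u = 1.0, 0.0
    last_t = None
    amplitudes = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # pool recovers
            u = u * np.exp(-dt / tau_facil)              # facilitation decays
        u = u + U * (1.0 - u)   # spike-triggered boost in release probability
        release = u * x         # fraction of the pool released by this spike
        x = x - release         # the pool is depleted by what was released
        amplitudes.append(release)
        last_t = t
    return amplitudes

# A 20 Hz train: with these illustrative parameters the response first grows
# (facilitation) and then shrinks as the pool runs down (depression).
print([round(a, 3) for a in short_term_synapse(np.arange(0.0, 0.5, 0.05))])
```

With a small baseline release probability and a slow pool recovery, the first few responses in a train grow (facilitation dominates) and later responses shrink as the pool runs down (depression dominates); changing U, tau_rec, and tau_facil shifts the balance between the two.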
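
The spike-timing dependent plasticity bullet above can be made concrete with the standard pair-based rule, in which the weight change falls off exponentially with the interval between the presynaptic and postsynaptic spikes. The amplitudes and time constants below are typical illustrative choices, not values measured in any particular preparation.

```python
import numpy as np

def stdp_dw(delta_t, A_plus=0.01, A_minus=0.012, tau_plus=0.02, tau_minus=0.02):
    """Pair-based STDP rule: fractional weight change for one pre/post spike
    pair separated by delta_t = t_post - t_pre (in seconds).

    Pre before post (delta_t > 0): potentiation (LTP-like).
    Post before pre (delta_t < 0): depression (LTD-like)."""
    if delta_t > 0:
        return A_plus * np.exp(-delta_t / tau_plus)
    if delta_t < 0:
        return -A_minus * np.exp(delta_t / tau_minus)
    return 0.0

# Pre leads post by 10 ms -> strengthen; post leads pre by 10 ms -> weaken.
print(stdp_dw(0.010), stdp_dw(-0.010))
```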
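
One simple way to picture the homeostatic synaptic plasticity described above is multiplicative synaptic scaling: all of a neuron's incoming weights are slowly scaled toward a target activity level, so that the relative differences created by LTP and LTD are preserved while overall excitability stays bounded. The target rate and learning rate below are hypothetical.

```python
def scale_weights(weights, measured_rate, target_rate=5.0, eta=0.01):
    """Multiplicative homeostatic scaling (illustrative only).

    If the neuron fires above its target rate, all incoming weights are
    scaled down slightly; if it fires below target, they are scaled up.
    Relative differences between weights (e.g., those produced by LTP
    and LTD) are preserved."""
    factor = 1.0 + eta * (target_rate - measured_rate) / target_rate
    return [w * factor for w in weights]

# A neuron firing at 10 Hz against a 5 Hz target has all weights nudged down.
print(scale_weights([0.5, 1.0, 2.0], measured_rate=10.0))
```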
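
Finally, the reverberating-circuit idea above can be illustrated with a toy pair of rate units that excite each other in a closed loop. This is a deliberately simplified caricature (two units, a tanh nonlinearity, a hand-picked coupling strength), not a model of any real circuit.

```python
import numpy as np

def reverberate(w=1.05, steps=40, pulse=1.0):
    """Two rate units that excite each other in a closed loop (a toy model).

    A brief input pulse to unit 1 at step 5 can leave the loop active long
    after the input ends; with weaker coupling (w < 1) the activity decays."""
    r1 = r2 = 0.0
    history = []
    for t in range(steps):
        inp = pulse if t == 5 else 0.0      # transient input at step 5 only
        r1 = np.tanh(w * r2 + inp)          # unit 1 is driven by unit 2
        r2 = np.tanh(w * r1)                # unit 2 is driven by unit 1
        history.append(round(float(r1), 2))
    return history

# With w = 1.05 activity settles at a persistent nonzero level (a "memory" of
# the pulse); rerun with w = 0.8 and the activity dies away instead.
print(reverberate(w=1.05))
print(reverberate(w=0.8))
```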

Exploring Synaptic Plasticity

In the simulation you will study in this unit, two cells are connected by a single chemical synapse. We have intracellular electrodes in both the presynaptic and postsynaptic cells, in addition to calcium-sensitive dyes that allow us to track the level of intracellular calcium in both the presynaptic and the postsynaptic neurons. We will use these to explore one type of synaptic plasticity.
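
Questions 3 through 6 below manipulate the presynaptic and postsynaptic calcium time constants. The internal equations of the simulation are not reproduced here, but a common way to model intracellular calcium handling is as a single pool that is incremented by each action potential and then decays exponentially back toward baseline with that time constant. The sketch below uses that assumption, with made-up increment and spike-train values, simply to show why shortening or lengthening the time constant changes how much residual calcium accumulates during a train of spikes.

```python
import numpy as np

def calcium_trace(spike_times, tau=0.1, dt=1e-3, t_max=1.0, increment=1.0):
    """Single-pool calcium model: every spike adds a fixed increment, and
    calcium then decays exponentially toward zero with time constant tau (s).

    A small tau means calcium is cleared quickly, so little residual calcium
    accumulates between spikes; a large tau lets it build up over a train."""
    t = np.arange(0.0, t_max, dt)
    ca = np.zeros_like(t)
    spike_steps = set(int(round(s / dt)) for s in spike_times)
    for i in range(1, len(t)):
        ca[i] = ca[i - 1] * np.exp(-dt / tau)   # exponential clearance
        if i in spike_steps:
            ca[i] += increment                  # spike-triggered influx
    return t, ca

# Peak calcium reached during a 20 Hz train, for a fast vs. a slow time constant.
for tau in (0.010, 0.150):
    _, ca = calcium_trace(np.arange(0.1, 0.6, 0.05), tau=tau)
    print(f"tau = {tau * 1e3:.0f} ms -> peak [Ca] (arbitrary units) = {ca.max():.2f}")
```

In this toy model, a 10 ms time constant leaves almost no residual calcium between spikes at 20 Hz, whereas a 150 ms time constant lets calcium build up several-fold over the train.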


Question 1: What do you see when running the simulation with the default parameters? Give three different mechanisms that could cause the initial failure to generate an action potential in the postsynaptic neuron, followed by successful induction of action potentials. Hint: there are presynaptic and postsynaptic mechanisms that you could propose from this unit, and from previous units.

Question 2: How could you use quantal analysis to support or refute some of these explanations? Relate the results of quantal analysis (i.e., measures of m and q) to the mechanisms you suggested in Question 1.

Question 3: Let us explore the plasticity of the synapse by studying the effects of presynaptic and postsynaptic calcium levels on the strength of the synapse. First, note the changes in presynaptic calcium concentration in the second panel of the simulation, and describe what is happening to the levels over time. In the lab, you can inject calcium buffering agents (such as BAPTA) into a cell to cause the calcium levels to drop more quickly. To do an equivalent manipulation on the simulation, please lower the presynaptic calcium time constant to 10 ms. What effect does this have on the calcium levels and on the strength of the synapse?

Question 4: Press the Reset button. What are the effects of lowering the postsynaptic calcium time constant to 10 ms? Please describe the effects on the postsynaptic calcium levels (fourth panel of the simulation) and on the effectiveness of the synapse between the two neurons.

Question 5: One can also reduce the ability of a cell to get rid of calcium by interfering with calcium pumps and exchangers (e.g., phospholamban inhibits the SERCA calcium pump, and digoxin indirectly slows calcium extrusion by blocking the Na+/K+ pump that drives the Na+/Ca2+ exchanger). Press the Reset button. What effect does raising the presynaptic calcium time constant to 150 ms have on the calcium levels in the presynaptic neuron and on the strength of the synapse?

Question 6: What effect does raising the postsynaptic calcium time constant to 500 ms have on the calcium levels in the postsynaptic neuron and on the strength of the synapse? (Again, make sure to press the Reset button before you make the change).

Question 7: Based on what you've seen, is the plasticity in this synapse postsynaptic or presynaptic? Would this be an example of synaptic facilitation or synaptic depression? Explain.