A New Model of Synaptic Plasticity: Neurons Depend on Their Neighbors for Learning

Post by Laura Maile

The takeaway

Plasticity, the strengthening or weakening of synapses over time, depends on a complex interaction between excitatory and inhibitory inputs from a network of nearby neurons. When we learn, we rely not on single inputs and outputs, but on dynamic communication between networks of neurons.

What's the science?

Synaptic plasticity is the collection of changes to both excitatory and inhibitory connections between neurons that occur when we learn. Historically, the understanding has been that this plasticity functions at the level of a single synapse, relying on the activity of a single presynaptic neuron and the response of its partner across the synapse. Hebb's theory of learning states that when a presynaptic neuron repeatedly fires and helps activate its postsynaptic partner, the connection between them is strengthened. More recent evidence has shown that learning and plasticity are more complex, integrating both excitatory and inhibitory information from neighboring synapses and depending on the circuitry of nearby networks. Scientists have not yet agreed upon a framework to explain this interdependent synaptic plasticity in biological models. This week in Nature Neuroscience, Agnes and colleagues describe a new model of synaptic plasticity that relies on the activity of multiple neighboring synapses.
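For intuition, Hebb's rule is often written as a weight change proportional to the product of pre- and postsynaptic activity. Here is a minimal sketch in Python; the learning rate and activity values are purely illustrative:

```python
def hebbian_update(w, pre, post, lr=0.01):
    """Classic Hebbian rule: the weight grows when pre- and
    postsynaptic activity coincide. Illustrative only."""
    return w + lr * pre * post

# Repeated coincident firing strengthens the connection.
w = 0.5
for _ in range(10):
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 2))  # 0.6
```

The new model described below goes beyond this single-synapse picture by making each synapse's update depend on its neighbors.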

How did they do it?

The authors created a theoretical model consisting of a set of rules integrating the timing, strength, distance, and identity of excitatory and inhibitory inputs, to describe the interaction between sets of neurons during learning. They first modeled two excitatory neurons isolated from other inputs and presented different stimulation patterns to the pre- and postsynaptic neurons. To increase the system's complexity, they introduced neighboring synapses onto the same excitatory postsynaptic neuron. Next, to determine the influence of multiple neighboring inputs, they modeled a single postsynaptic neuron with several presynaptic inputs spaced uniformly apart. The authors then sought to understand how synaptic plasticity shapes the receptive fields of neurons, that is, their ability to respond to different stimuli. To do this, they simulated a neuron receiving eight different inputs, mimicking the input of eight different frequencies of sound. They modeled the learning period by using inhibitory inputs to gate, or limit, the time during which the postsynaptic neuron could be influenced by excitatory inputs. Finally, they tested their rules on a more spatially and structurally complex dendritic tree, mimicking the tree-like organization of synapses onto a single neuron, by connecting dendritic compartments that could be independently activated to a single neuron.
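While the paper's full learning rules are more detailed, the flavor of a "co-dependent" update can be sketched in a few lines of Python. In this toy version, each synapse's Hebbian change is scaled by a distance-weighted sum of its neighbors' activity and gated by local inhibition; the Gaussian kernel, the gating term, and every parameter here are illustrative assumptions, not the authors' equations:

```python
import numpy as np

def codependent_update(w, pre, post, positions, inhib, lr=0.01, sigma=5.0):
    """Toy co-dependent plasticity step (illustrative, not the
    authors' exact equations). Each synapse's Hebbian change is
    boosted by nearby co-active synapses and gated by inhibition."""
    # Distance-weighted influence of neighbors: a Gaussian kernel
    # over pairwise dendritic distances (our assumption).
    d = np.abs(positions[:, None] - positions[None, :])
    kernel = np.exp(-d**2 / (2 * sigma**2))
    np.fill_diagonal(kernel, 0.0)          # a synapse is not its own neighbor
    neighbor_drive = kernel @ pre

    # Inhibition acts as a gate: strong local inhibition suppresses
    # plasticity, weak inhibition permits it.
    gate = 1.0 / (1.0 + inhib)

    return w + lr * gate * pre * post * (1.0 + neighbor_drive)

w = np.full(4, 0.5)
positions = np.array([0.0, 1.0, 2.0, 20.0])   # the last synapse is far away
pre = np.ones(4)                              # all inputs equally active
w = codependent_update(w, pre, post=1.0, positions=positions, inhib=0.2)
print(w.round(3))  # clustered synapses potentiate more than the distant one
```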

What did they find?

Their model demonstrated that long-term potentiation (LTP), the process of synaptic strengthening upon repeated activation of a presynaptic neuron, can be initiated by the presynaptic neuron and increases when the pre- and postsynaptic neurons fire in synchrony. When neighboring synapses also increased their firing, the postsynaptic neuron showed increased LTP, in a fashion dependent on the timing of activity and on distance from the postsynaptic neuron. When multiple neighboring inputs targeted the same postsynaptic neuron at equal distances, their ability to influence one another drove competition between them.
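One intuitive way to see how equally placed inputs end up competing: if the postsynaptic neuron keeps its total input strength near a fixed budget, a synapse that potentiates does so partly at its neighbors' expense. The normalization scheme below is our own caricature of that competition, not the paper's mechanism:

```python
import numpy as np

def competitive_step(w, pre, post, lr=0.05, total=2.0):
    """Hebbian growth followed by normalization to a fixed total
    weight: strongly driven synapses gain weight while equally
    placed neighbors lose it. Illustrative only."""
    w = w + lr * pre * post
    return w * (total / w.sum())

w = np.array([0.5, 0.5, 0.5, 0.5])
pre = np.array([1.0, 0.2, 0.2, 0.2])  # one input fires much more
for _ in range(200):
    w = competitive_step(w, pre, post=1.0)
print(w.round(2))  # ~[1.25 0.25 0.25 0.25]: the active input dominates
```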

In the experiments modeling receptive fields, the authors demonstrated that learning occurred when the gating inhibitory neurons were shut down, allowing the postsynaptic neuron to experience strong stimulation by excitatory inputs. With the dendritic tree model, they showed that plasticity of the excitatory synapses depended on inhibitory gating, on distance from the cell body, and on the co-activity of surrounding inputs. Inhibition directly influenced excitatory synaptic plasticity: inhibitory plasticity is slower than excitatory plasticity and exerts strong control over it, preventing excessive change in excitatory weights and stabilizing learning. Finally, they created a model of a neuronal network in which setpoints are used to balance LTP with synaptic weakening, creating a stable network that allows learning without runaway excitation.
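That stabilizing loop can be caricatured in a few lines: a fast excitatory weight is pushed toward an activity setpoint while a roughly ten-times-slower inhibitory weight chases the same target, so learning settles instead of running away. The specific rule, rates, and setpoint below are illustrative assumptions, not the authors' equations:

```python
# Toy setpoint dynamics (illustrative): excitation and slower
# inhibition both track a target activity level.
lr_exc, lr_inh = 0.05, 0.005   # inhibitory plasticity ~10x slower
target = 1.0                   # activity setpoint
w_exc, w_inh = 1.5, 0.1        # start with excess excitation
pre = 1.0                      # constant presynaptic drive

for _ in range(5000):
    post = max(0.0, pre * w_exc - w_inh)      # net postsynaptic activity
    w_exc += lr_exc * pre * (target - post)   # LTP below setpoint, LTD above
    w_inh += lr_inh * (post - target)         # inhibition also chases target

print(round(max(0.0, w_exc - w_inh), 2))  # activity settles near 1.0
```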

What's the impact?

This study found that synaptic plasticity depends on a network of nearby synapses rather than on single synapses acting in isolation. The model the authors developed can help explain how clusters of synapses develop and strengthen into stable systems. This work helps neuroscientists better represent and understand the complex dynamics of neural connections that change over time as we learn.

Access the original scientific publication here.