Presynaptic inhibition rapidly stabilises recurrent excitation in the face of plasticity.
PLoS Comput Biol 2020;16:e1008118. [PMID: 32764742] [PMCID: PMC7439813] [DOI: 10.1371/journal.pcbi.1008118]
[Received: 03/25/2020] [Revised: 08/19/2020] [Accepted: 07/01/2020]
Abstract
Hebbian plasticity, a mechanism believed to be the substrate of learning and memory, detects and further enhances correlated neural activity. Because this constitutes an unstable positive feedback loop, it requires additional homeostatic control. Computational work suggests that in recurrent networks, the homeostatic mechanisms observed in experiments are too slow to compensate instabilities arising from Hebbian plasticity and need to be complemented by rapid compensatory processes. We suggest presynaptic inhibition as a candidate that rapidly provides stability by compensating recurrent excitation induced by Hebbian changes. Presynaptic inhibition is mediated by presynaptic GABA receptors that effectively and reversibly attenuate transmitter release. Activation of these receptors can be triggered by excess network activity, hence providing a stabilising negative feedback loop that weakens recurrent interactions on sub-second timescales. We study the stabilising effect of presynaptic inhibition in recurrent networks, in which presynaptic inhibition is implemented as a multiplicative reduction of recurrent synaptic weights in response to increasing inhibitory activity. We show that networks with presynaptic inhibition display a gradual increase of firing rates with growing excitatory weights, in contrast to traditional excitatory-inhibitory networks. This alleviates the positive feedback loop between Hebbian plasticity and network activity and thereby allows homeostasis to act on timescales similar to those observed in experiments. Our results generalise to spiking networks with a biophysically more detailed implementation of the presynaptic inhibition mechanism. In conclusion, presynaptic inhibition provides a powerful compensatory mechanism that rapidly reduces effective recurrent interactions and thereby stabilises Hebbian learning.
Author summary
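The multiplicative weight reduction described in the abstract can be illustrated with a minimal threshold-linear rate model. This is a sketch under assumed parameters: the coupling strengths, time constants, external drives, and the attenuation form 1/(1 + b·r_I) are illustrative choices, not the paper's actual model or fitted values.

```python
def simulate(w_ee, b=0.0, steps=20000, dt=1e-4):
    """Two-population (E-I) rate model with presynaptic inhibition.

    Presynaptic inhibition scales recurrent excitation by 1/(1 + b*r_i);
    b=0 recovers a standard excitatory-inhibitory network.
    All parameter values are illustrative assumptions.
    """
    w_ei, w_ie, w_ii = 1.0, 1.0, 0.5   # assumed coupling strengths
    ext_e, ext_i = 1.0, 0.8            # assumed external drive
    tau_e, tau_i = 20e-3, 10e-3        # assumed time constants (s)
    r_e = r_i = 0.0
    for _ in range(steps):
        g = 1.0 / (1.0 + b * r_i)      # presynaptic attenuation factor
        drive_e = g * w_ee * r_e - w_ei * r_i + ext_e
        drive_i = w_ie * r_e - w_ii * r_i + ext_i
        r_e += dt / tau_e * (-r_e + max(drive_e, 0.0))  # threshold-linear
        r_i += dt / tau_i * (-r_i + max(drive_i, 0.0))
        if r_e > 1e6:                  # runaway: network destabilised
            return float('inf')
    return r_e
```

With b = 0 the network runs away once recurrent excitation is strong (e.g. `simulate(2.0, b=0.0)` diverges), whereas with b > 0 the steady-state excitatory rate grows only gradually as `w_ee` increases, matching the behaviour the abstract describes.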
Synapses between neurons change during learning and memory formation, a process termed synaptic plasticity. Established models of plasticity rely on strengthening the synapses of co-active neurons. In recurrent networks, mutually connected neurons tend to be co-active. The resulting positive feedback loop is believed to be counteracted by homeostatic mechanisms that aim to keep neural activity at a given set point. However, theoretical work indicates that experimentally observed forms of homeostasis are too slow to maintain stable network activity. In this article, we suggest that presynaptic inhibition can alleviate this problem. Presynaptic inhibition is an inhibitory mechanism that weakens synapses rather than suppressing neural activity. Using mathematical analyses and computer simulations, we show that presynaptic inhibition can compensate for the strengthening of recurrent connections and thus stabilise neural networks subject to synaptic plasticity, even if homeostasis acts on biologically plausible timescales.
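The timescale argument, that Hebbian potentiation outpaces homeostasis unless presynaptic inhibition bounds the rates, can likewise be sketched in a toy simulation. Everything here is an illustrative assumption rather than the paper's model: the Hebbian rule (growth proportional to the squared excitatory rate), the multiplicative homeostatic scaling toward a target rate, and all parameter values.

```python
def run_plastic(b, eta=0.5, eps=0.5, r_target=1.0, steps=200000, dt=1e-4):
    """E-I rate model whose recurrent excitatory weight w_ee is plastic.

    Hebbian potentiation (rate eta) and homeostatic scaling toward
    r_target (rate eps) act on w_ee; presynaptic inhibition attenuates
    recurrent excitation by 1/(1 + b*r_i). Parameters are illustrative.
    """
    w_ee, w_ei, w_ie, w_ii = 1.0, 1.0, 1.0, 0.5    # assumed couplings
    ext_e, ext_i = 1.0, 0.8                        # assumed external drive
    tau_e, tau_i = 20e-3, 10e-3                    # assumed time constants
    r_e = r_i = 0.0
    for _ in range(steps):
        g = 1.0 / (1.0 + b * r_i)                  # presynaptic attenuation
        r_e += dt / tau_e * (-r_e + max(g * w_ee * r_e - w_ei * r_i + ext_e, 0.0))
        r_i += dt / tau_i * (-r_i + max(w_ie * r_e - w_ii * r_i + ext_i, 0.0))
        w_ee += dt * eta * r_e * r_e               # Hebbian potentiation
        w_ee += dt * eps * (r_target - r_e) * w_ee # homeostatic scaling
        if r_e > 1e6:                              # runaway activity
            return float('inf')
    return r_e
```

With identical plasticity and homeostasis rates, the network without presynaptic inhibition (`b=0`) destabilises, while with presynaptic inhibition (`b=1`) the excitatory rate settles at a finite value: bounding the rates gives homeostasis time to act.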