1. Liang Y, Krotov D, Zaki MJ. Modern Hopfield Networks for graph embedding. Front Big Data 2022; 5:1044709. PMID: 36466714; PMCID: PMC9713410; DOI: 10.3389/fdata.2022.1044709.
Abstract
The network embedding task is to represent a node in a network as a low-dimensional vector while incorporating the topological and structural information. Most existing approaches solve this problem by factorizing a proximity matrix, either directly or implicitly. In this work, we introduce a network embedding method from a new perspective, which leverages Modern Hopfield Networks (MHN) for associative learning. Our network learns associations between the content of each node and that node's neighbors. These associations serve as memories in the MHN. The recurrent dynamics of the network make it possible to recover the masked node, given that node's neighbors. Our proposed method is evaluated on different benchmark datasets for downstream tasks such as node classification, link prediction, and graph coarsening. The results show competitive performance compared to the common matrix factorization techniques and deep learning based methods.
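
The associative recall described here can be illustrated with the standard modern Hopfield update rule (softmax attention over stored patterns). The sketch below is a generic illustration under that assumption, not the authors' implementation; the function name, dimensions, and the inverse temperature `beta` are made up for the example.

```python
# Minimal sketch of modern-Hopfield retrieval: repeatedly attend over the
# stored memories and project back, so a noisy/masked cue converges to the
# closest stored pattern. Illustrative only.
import numpy as np

def mhn_retrieve(memories, query, beta=8.0, n_steps=3):
    """memories: (num_patterns, dim) stored associations; query: (dim,) partial cue."""
    xi = query.copy()
    for _ in range(n_steps):
        scores = beta * memories @ xi          # similarity to each stored pattern
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()               # softmax over memories
        xi = memories.T @ weights              # updated state, shape (dim,)
    return xi

# toy usage: store random "neighbour-content" vectors and recover a corrupted one
rng = np.random.default_rng(0)
memories = rng.standard_normal((50, 16))
cue = memories[7] + 0.3 * rng.standard_normal(16)   # noisy version of pattern 7
recovered = mhn_retrieve(memories, cue)
print(np.argmax(memories @ recovered))              # expected: 7
```
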
Affiliations
- Yuchen Liang, Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY, United States
- Dmitry Krotov, MIT-IBM Watson AI Lab, IBM Research, Cambridge, MA, United States
- Mohammed J. Zaki, Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY, United States

2. Zeng G, Huang X, Jiang T, Yu S. Short-term synaptic plasticity expands the operational range of long-term synaptic changes in neural networks. Neural Netw 2019; 118:140-147. DOI: 10.1016/j.neunet.2019.06.002.

3. Sase T, Katori Y, Komuro M, Aihara K. Bifurcation Analysis on Phase-Amplitude Cross-Frequency Coupling in Neural Networks with Dynamic Synapses. Front Comput Neurosci 2017; 11:18. PMID: 28424606; PMCID: PMC5371682; DOI: 10.3389/fncom.2017.00018.
Abstract
We investigate a discrete-time network model composed of excitatory and inhibitory neurons and dynamic synapses, with the aim of revealing dynamical properties behind oscillatory phenomena possibly related to brain functions. We use a stochastic neural network model to derive the corresponding macroscopic mean field dynamics, and subsequently analyze the dynamical properties of the network. In addition to slow and fast oscillations arising from excitatory and inhibitory networks, respectively, we show that the interaction between these two networks generates phase-amplitude cross-frequency coupling (CFC), in which multiple different frequency components coexist and the amplitude of the fast oscillation is modulated by the phase of the slow oscillation. Furthermore, we clarify the detailed properties of the oscillatory phenomena by applying bifurcation analysis to the mean field model, and accordingly show that intermittent and continuous CFC can be characterized by an aperiodic orbit on a closed curve and one on a torus, respectively. These two CFC modes switch depending on the coupling strength from the excitatory to the inhibitory network, via the saddle-node cycle bifurcation of a one-dimensional torus in map (MT1SNC), and may be associated with the function of multi-item representation. We believe that the present model might have potential for studying possible functional roles of phase-amplitude CFC in the cerebral cortex.
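
For readers who want to check for this kind of coupling in simulated or recorded signals, a common measure is the mean vector length of the fast-oscillation amplitude taken at the slow-oscillation phase. The snippet below is a minimal, generic sketch of that measure, not the bifurcation analysis performed in the paper; the signals and parameter values are illustrative.

```python
# Quantify phase-amplitude CFC between a slow and a fast signal with the
# mean-vector-length measure, using the Hilbert transform for phase/amplitude.
import numpy as np
from scipy.signal import hilbert

def cfc_mean_vector_length(slow, fast):
    """Return |<A_fast * exp(i*phi_slow)>| normalised by the mean fast amplitude."""
    phi_slow = np.angle(hilbert(slow))   # phase of the slow oscillation
    a_fast = np.abs(hilbert(fast))       # amplitude envelope of the fast oscillation
    mvl = np.abs(np.mean(a_fast * np.exp(1j * phi_slow)))
    return mvl / np.mean(a_fast)

# toy example: fast amplitude modulated by the slow phase -> clearly nonzero coupling
t = np.linspace(0, 10, 5000)
slow = np.sin(2 * np.pi * 2 * t)                        # 2 Hz
fast = (1 + 0.8 * slow) * np.sin(2 * np.pi * 40 * t)    # 40 Hz, phase-modulated
print(cfc_mean_vector_length(slow, fast))
```
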
Affiliations
- Takumi Sase, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
- Yuichi Katori, Institute of Industrial Science, The University of Tokyo, Tokyo, Japan; School of Systems Information Science, Future University Hakodate, Hokkaido, Japan
- Motomasa Komuro, Center for Fundamental Education, Teikyo University of Science, Yamanashi, Japan
- Kazuyuki Aihara, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan; Institute of Industrial Science, The University of Tokyo, Tokyo, Japan

4. Uzuntarla M, Torres JJ, So P, Ozer M, Barreto E. Double inverse stochastic resonance with dynamic synapses. Phys Rev E 2017; 95:012404. PMID: 28208458; DOI: 10.1103/physreve.95.012404.
Abstract
We investigate the behavior of a model neuron that receives a biophysically realistic noisy postsynaptic current based on uncorrelated spiking activity from a large number of afferents. We show that, with static synapses, such noise can give rise to inverse stochastic resonance (ISR) as a function of the presynaptic firing rate. We compare this to the case with dynamic synapses that feature short-term synaptic plasticity and show that the interval of presynaptic firing rate over which ISR exists can be extended or diminished. We consider both short-term depression and facilitation. Interestingly, we find that a double inverse stochastic resonance (DISR), with two distinct wells centered at different presynaptic firing rates, can appear.
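
The short-term depression and facilitation referred to here are commonly described by the Tsodyks-Markram model, in which a utilization variable u facilitates and a resource variable x depresses at each presynaptic spike. The event-driven sketch below illustrates that standard model; it is not the paper's own implementation, and the parameter values are illustrative.

```python
# Tsodyks-Markram style dynamic synapse: between spikes x recovers (tau_d) and
# u decays back to U (tau_f); at each spike u facilitates and x is consumed.
import numpy as np

def tm_synapse(spike_times, U=0.2, tau_d=0.5, tau_f=1.0):
    """Return the relative efficacy u*x at each presynaptic spike time (seconds)."""
    u, x = U, 1.0
    last_t = None
    efficacies = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)   # resource recovery
            u = U + (u - U) * np.exp(-dt / tau_f)       # utilisation decay
        u = u + U * (1.0 - u)                            # facilitation at the spike
        efficacies.append(u * x)
        x = x * (1.0 - u)                                # depression at the spike
        last_t = t
    return np.array(efficacies)

# a regular 20 Hz train: efficacy first facilitates, then depresses
print(tm_synapse(np.arange(0, 1.0, 0.05)))
```
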
Affiliations
- Muhammet Uzuntarla, Department of Biomedical Engineering, Bulent Ecevit University, 67100 Zonguldak, Turkey
- Joaquin J Torres, Department of Electromagnetism and Physics of the Matter and Institute Carlos I for Theoretical and Computational Physics, University of Granada, E-18071 Granada, Spain
- Paul So, Department of Physics and Astronomy and the Krasnow Institute for Advanced Study, George Mason University, Fairfax, Virginia 22030, USA
- Mahmut Ozer, Department of Electrical and Electronics Engineering, Bulent Ecevit University, 67100 Zonguldak, Turkey
- Ernest Barreto, Department of Physics and Astronomy and the Krasnow Institute for Advanced Study, George Mason University, Fairfax, Virginia 22030, USA

5. Uzuntarla M, Ozer M, Ileri U, Calim A, Torres JJ. Effects of dynamic synapses on noise-delayed response latency of a single neuron. Phys Rev E 2015; 92:062710. PMID: 26764730; DOI: 10.1103/physreve.92.062710.
Abstract
The noise-delayed decay (NDD) phenomenon emerges when the first-spike latency of a periodically forced stochastic neuron exhibits a maximum for a particular range of noise intensity. Here, we investigate the latency response dynamics of a single Hodgkin-Huxley neuron that is subject to both a suprathreshold periodic stimulus and background activity arriving through dynamic synapses. We study the first-spike latency response as a function of the presynaptic firing rate f. This constitutes a more realistic scenario than in previous works, since f provides a suitable, biophysically realistic parameter to control the level of activity in actual neural systems. We first report on the emergence of classical NDD behavior as a function of f in the limit of static synapses. Second, we show that when short-term depression and facilitation mechanisms are included at the synapses, different NDD features can be found due to their modulatory effect on synaptic current fluctuations. For example, an intriguing double NDD (DNDD) behavior occurs for different sets of relevant synaptic parameters. Moreover, depending on the balance between synaptic depression and synaptic facilitation, single NDD or DNDD can prevail, in such a way that synaptic facilitation favors the emergence of DNDD whereas synaptic depression favors the existence of single NDD. In summary, we report the existence of a DNDD effect in the response latency dynamics of a neuron.
Affiliations
- M Uzuntarla, Department of Biomedical Engineering, Bulent Ecevit University, Engineering Faculty, 67100 Zonguldak, Turkey; The Krasnow Institute for Advanced Study, George Mason University, Fairfax, Virginia 22030, USA
- M Ozer, Department of Electrical and Electronics Engineering, Bulent Ecevit University, Engineering Faculty, 67100 Zonguldak, Turkey
- U Ileri, Department of Biomedical Engineering, Bulent Ecevit University, Engineering Faculty, 67100 Zonguldak, Turkey
- A Calim, Department of Biomedical Engineering, Bulent Ecevit University, Engineering Faculty, 67100 Zonguldak, Turkey
- J J Torres, Department of Electromagnetism and Physics of the Matter and Institute "Carlos I" for Theoretical and Computational Physics, University of Granada, Granada E-18071, Spain

6. Dao Duc K, Parutto P, Chen X, Epsztein J, Konnerth A, Holcman D. Synaptic dynamics and neuronal network connectivity are reflected in the distribution of times in Up states. Front Comput Neurosci 2015; 9:96. PMID: 26283956; PMCID: PMC4518200; DOI: 10.3389/fncom.2015.00096.
Abstract
Neuronal networks coupled by dynamic synapses can sustain long periods of depolarization lasting hundreds of milliseconds, such as the Up states recorded during sleep or anesthesia, yet the underlying mechanism driving these periods remains unclear. We show here, within a mean-field model, that the residence time of the neuronal membrane potential in cortical Up states does not follow a Poissonian law but presents several peaks. Furthermore, the present modeling approach allows some information about the neuronal network connectivity to be extracted from the time-distribution histogram. Based on a synaptic-depression model, we find that these peaks, which can be observed in histograms of patch-clamp recordings, are not artifacts of electrophysiological measurements but rather an inherent property of the network dynamics. Analysis of the equations reveals a stable focus located close to the unstable limit cycle, delimiting a region that defines the Up state. The model further shows that the peaks observed in the Up-state time distribution are due to winding around the focus before escaping from the basin of attraction. Finally, we use in vivo recordings of intracellular membrane potential and recover from the peak distribution some information about the network connectivity. We conclude that it is possible to recover the network connectivity from the distribution of times that the neuronal membrane voltage spends in Up states.
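
The residence-time histograms discussed here can be built from a membrane-potential trace by thresholding and measuring the lengths of contiguous supra-threshold segments. The following sketch shows one simple way to do this; the threshold, sampling step, and variable names are placeholders, and the authors' actual analysis pipeline may differ.

```python
# Extract Up-state durations from a voltage trace by simple thresholding.
import numpy as np

def up_state_durations(v, dt, threshold=-55.0):
    """Return durations (seconds) of contiguous segments where v exceeds threshold."""
    up = v > threshold
    edges = np.diff(up.astype(int))          # +1 at Up-state onsets, -1 at offsets
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if up[0]:
        starts = np.insert(starts, 0, 0)     # trace begins inside an Up state
    if up[-1]:
        ends = np.append(ends, len(v))       # trace ends inside an Up state
    return (ends - starts) * dt

# usage: histogram the durations to look for the multi-peaked structure
# durations = up_state_durations(voltage_trace, dt=1e-3)
# hist, bin_edges = np.histogram(durations, bins=50)
```
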
Affiliations
- Khanh Dao Duc, IBENS, Ecole Normale Supérieure, Applied Mathematics and Computational Biology, Paris, France
- Pierre Parutto, IBENS, Ecole Normale Supérieure, Applied Mathematics and Computational Biology, Paris, France
- Xiaowei Chen, Institute of Neuroscience, Technische Universität München, Munich, Germany
- Jérôme Epsztein, Institut de Neurobiologie de la Méditerranée, INSERM U901, Marseille, France
- Arthur Konnerth, Institute of Neuroscience, Technische Universität München, Munich, Germany
- David Holcman, IBENS, Ecole Normale Supérieure, Applied Mathematics and Computational Biology, Paris, France

7. Torres JJ, Kappen HJ. Emerging phenomena in neural networks with dynamic synapses and their computational implications. Front Comput Neurosci 2013; 7:30. PMID: 23637657; PMCID: PMC3617396; DOI: 10.3389/fncom.2013.00030.
Abstract
In this paper we review our research on the effect and computational role of dynamic synapses in feed-forward and recurrent neural networks. Among other results, we report the appearance of a new class of dynamical memories that result from the destabilization of learned memory attractors. This has important consequences for dynamic information processing, allowing the system to sequentially access the information stored in the memories under changing stimuli. Although the storage capacity of stable memories also decreases, our study demonstrated that synaptic facilitation can recover the maximum storage capacity and enlarge the system's capacity for memory recall under noisy conditions. Possibly, the new dynamical behavior can be associated with the voltage transitions between up and down states observed in cortical areas of the brain. We investigated the conditions under which the permanence times in the up state are power-law distributed, which is a signature of criticality, and concluded that the experimentally observed large variability of permanence times could be explained as the result of noisy dynamic synapses with large recovery times. Finally, we report how short-term synaptic processes can transmit weak signals across more than one frequency range in noisy neural networks, displaying a kind of stochastic multi-resonance. This effect is due to competition between activity-dependent synaptic fluctuations (due to dynamic synapses) and a neuron firing threshold that adapts to the mean incoming synaptic input.
Affiliations
- Joaquin J. Torres, Granada Neurophysics Group at Institute “Carlos I” for Theoretical and Computational Physics, University of Granada, Granada, Spain
- Hilbert J. Kappen, Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen, Nijmegen, Netherlands

8. Stroffek J, Marsalek P. Short-term potentiation effect on pattern recall in sparsely coded neural network. Neurocomputing 2012. DOI: 10.1016/j.neucom.2011.08.021.

9. Huber DE, O'Reilly RC. Persistence and accommodation in short-term priming and other perceptual paradigms: temporal segregation through synaptic depression. Cogn Sci 2010. DOI: 10.1207/s15516709cog2703_4.

10. Igarashi Y, Oizumi M, Otsubo Y, Nagata K, Okada M. Statistical mechanics of attractor neural network models with synaptic depression. J Phys Conf Ser 2009. DOI: 10.1088/1742-6596/197/1/012018.

11. Mejias JF, Torres JJ. Maximum Memory Capacity on Neural Networks with Short-Term Synaptic Depression and Facilitation. Neural Comput 2009; 21:851-71. DOI: 10.1162/neco.2008.02-08-719.
Abstract
In this work, we study, analytically and employing Monte Carlo simulations, the influence of the competition between several activity-dependent synaptic processes, such as short-term synaptic facilitation and depression, on the maximum memory storage capacity in a neural network. In contrast to the case of synaptic depression, which drastically reduces the capacity of the network to store and retrieve “static” activity patterns, synaptic facilitation enhances the storage capacity in different contexts. In particular, we found optimal values of the relevant synaptic parameters (such as the neurotransmitter release probability or the characteristic facilitation time constant) for which the storage capacity can be maximal and similar to the one obtained with static synapses, that is, without activity-dependent processes. We conclude that depressing synapses with a certain level of facilitation allow recovering the good retrieval properties of networks with static synapses while maintaining the nonlinear characteristics of dynamic synapses, convenient for information processing and coding.
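
As a point of reference for this storage-capacity comparison, the static-synapse baseline can be probed with a small Monte Carlo experiment on a Hebbian Hopfield network: store P random patterns in N units and measure the retrieval overlap as the load P/N grows. The sketch below illustrates only that baseline, not the paper's dynamic-synapse model; all sizes and parameters are toy values.

```python
# Hebbian Hopfield network with static synapses: retrieval overlap vs. load.
import numpy as np

def retrieval_overlap(n_neurons, n_patterns, n_sweeps=20, seed=0):
    rng = np.random.default_rng(seed)
    patterns = rng.choice([-1, 1], size=(n_patterns, n_neurons))
    w = (patterns.T @ patterns) / n_neurons      # Hebbian couplings
    np.fill_diagonal(w, 0.0)
    s = patterns[0].copy()                       # start exactly on pattern 0
    for _ in range(n_sweeps):                    # asynchronous zero-temperature updates
        for i in rng.permutation(n_neurons):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return (s @ patterns[0]) / n_neurons         # overlap with the stored pattern

for p in (10, 20, 40, 80):                       # overlap degrades as the load P/N grows
    print(p, retrieval_overlap(n_neurons=500, n_patterns=p))
```
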
Affiliations
- Jorge F. Mejias, Department of Electromagnetism and Matter Physics and Institute Carlos I for Theoretical and Computational Physics, University of Granada, E-18071 Granada, Spain
- Joaquín J. Torres, Department of Electromagnetism and Matter Physics and Institute Carlos I for Theoretical and Computational Physics, University of Granada, E-18071 Granada, Spain

12. Torres JJ, Marro J, Cortes JM, Wemmenhove B. Instabilities in attractor networks with fast synaptic fluctuations and partial updating of the neurons activity. Neural Netw 2008; 21:1272-7. PMID: 18701255; DOI: 10.1016/j.neunet.2008.07.002.
Abstract
We present and study a probabilistic neural automaton in which the fraction of simultaneously updated neurons is a parameter, ρ ∈ (0, 1). For small ρ, there is relaxation towards one of the attractors and great sensitivity to external stimuli, whereas for ρ ≥ ρ_c there is itinerancy among attractors. Tuning ρ in this regime, the oscillations may change abruptly from regular to chaotic and vice versa, which allows one to control the efficiency of the searching process. We discuss the similarity of the model behavior to recent observations and the possible role of chaos in neurobiology.
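
A partial-update rule of the kind described here can be written generically as follows: at each time step only a randomly chosen fraction ρ of the units recomputes its state from the current local fields. This is a hedged, generic sketch in the zero-noise limit, not the authors' probabilistic automaton; the function name and arguments are made up.

```python
# One step of partial (fraction-rho) updating in a +/-1 recurrent network.
import numpy as np

def partial_update_step(s, w, rho, rng):
    """Update a randomly chosen fraction rho of the +/-1 units in state s."""
    n = len(s)
    idx = rng.choice(n, size=max(1, int(rho * n)), replace=False)
    fields = w[idx] @ s                          # local fields of the chosen units
    s = s.copy()
    s[idx] = np.where(fields >= 0, 1, -1)        # deterministic (zero-noise) limit
    return s

# usage: iterate partial_update_step to observe relaxation (small rho) or
# itinerant switching between attractors (large rho), as described above.
```
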
Affiliations
- J J Torres, Institute "Carlos I" for Theoretical and Computational Physics, and Departamento de Electromagnetismo y Física de la Materia, Universidad de Granada, E-18071 Granada, Spain

13. Wang Z, Fan H. Dynamics of a continuous-valued discrete-time Hopfield neural network with synaptic depression. Neurocomputing 2007. DOI: 10.1016/j.neucom.2007.01.004.

14. Torres JJ, Cortes JM, Marro J, Kappen HJ. Competition between synaptic depression and facilitation in attractor neural networks. Neural Comput 2007; 19:2739-55. PMID: 17716010; DOI: 10.1162/neco.2007.19.10.2739.
Abstract
We study the effect of competition between short-term synaptic depression and facilitation on the dynamic properties of attractor neural networks, using Monte Carlo simulation and a mean-field analysis. Depending on the balance of depression, facilitation, and the underlying noise, the network displays different behaviors, including associative memory and switching of activity between different attractors. We conclude that synaptic facilitation enhances the attractor instability in a way that (1) intensifies the system adaptability to external stimuli, which is in agreement with experiments, and (2) favors the retrieval of information with less error during short time intervals.
Affiliations
- J J Torres, Institute Carlos I for Theoretical and Computational Physics, and Department of Electromagnetism and Matter Physics, University of Granada, Granada E-18071, Spain

15. Cortes JM, Torres JJ, Marro J, Garrido PL, Kappen HJ. Effects of Fast Presynaptic Noise in Attractor Neural Networks. Neural Comput 2006. DOI: 10.1162/neco.2006.18.3.614.
Abstract
We study both analytically and numerically the effect of presynaptic noise on the transmission of information in attractor neural networks. The noise occurs on a very short timescale compared to that of the neuron dynamics, and it produces short-time synaptic depression. This is inspired by recent neurobiological findings that show that synaptic strength may either increase or decrease on a short timescale depending on presynaptic activity. We thus describe a mechanism by which fast presynaptic noise enhances the neural network sensitivity to an external stimulus. The reason is that, in general, presynaptic noise induces nonequilibrium behavior and, consequently, the space of fixed points is qualitatively modified in such a way that the system can easily escape from the attractor. As a result, the model shows, in addition to pattern recognition, class identification and categorization, which may be relevant to the understanding of some of the brain's complex tasks.
Affiliations
- P. L. Garrido, Institute Carlos I for Theoretical and Computational Physics and Department of Electromagnetism and Physics of Matter, University of Granada, 18071 Granada, Spain
- H. J. Kappen, Department of Biophysics, Radboud University of Nijmegen, 6525 EZ Nijmegen, The Netherlands
