1
Gou S, Fu J, Sha Y, Cao Z, Guo Z, Eshraghian JK, Li R, Jiao L. Dynamic spatio-temporal pruning for efficient spiking neural networks. Front Neurosci 2025; 19:1545583. [PMID: 40201191] [PMCID: PMC11975901] [DOI: 10.3389/fnins.2025.1545583] [Received: 12/15/2024] [Accepted: 03/03/2025]
Abstract
Spiking neural networks (SNNs), which draw from biological neuron models, have the potential to improve the computational efficiency of artificial neural networks (ANNs) due to their event-driven nature and sparse data flow. SNNs rely on dynamical sparsity, in that neurons are trained to activate sparsely to minimize data communication. This is critical for hardware, given the bandwidth limitations between memory and processor. Because neurons are sparsely activated, weights are less frequently accessed and can potentially be pruned with less performance degradation in an SNN than in an equivalent ANN. Reducing the number of synaptic connections between neurons also relaxes memory demands for neuromorphic processors. In this paper, we propose a spatio-temporal pruning algorithm that dynamically adapts to reduce the temporal redundancy that often exists in SNNs when processing Dynamic Vision Sensor (DVS) datasets. Spatial pruning is executed based on both global parameter statistics and inter-layer parameter counts, and is shown to reduce model degradation under extreme sparsity. We provide an ablation study that isolates the various components of spatio-temporal pruning, and find that our approach achieves excellent performance across all datasets, with especially high performance on datasets with time-varying features. We achieved a 0.69% improvement on the DVS128 Gesture dataset, despite the common expectation that pruning typically degrades performance. Notably, this enhancement comes with an impressive 98.18% reduction in parameter space and a 50% reduction in temporal redundancy.
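The spatial pruning criterion described in this abstract, thresholding on global parameter statistics rather than per-layer statistics, can be sketched in a few lines. This is an illustrative magnitude-pruning toy under assumed names, not the paper's implementation, which additionally weighs inter-layer parameter counts and temporal redundancy:

```python
import numpy as np

def global_magnitude_prune(layers, sparsity=0.9):
    """Prune the smallest-magnitude weights across ALL layers at once,
    using one global threshold instead of per-layer thresholds."""
    all_w = np.concatenate([w.ravel() for w in layers])
    # Magnitude below which `sparsity` of all weights fall.
    thresh = np.quantile(np.abs(all_w), sparsity)
    masks = [np.abs(w) > thresh for w in layers]
    pruned = [w * m for w, m in zip(layers, masks)]
    return pruned, masks

# Toy example: three "layers" of random weights.
rng = np.random.default_rng(0)
layers = [rng.normal(size=(64, 64)) for _ in range(3)]
pruned, masks = global_magnitude_prune(layers, sparsity=0.9)
kept = sum(m.sum() for m in masks) / sum(m.size for m in masks)
print(f"fraction of weights kept: {kept:.3f}")
```

A global threshold lets sparsity distribute unevenly across layers, so layers with many small weights are pruned harder than layers whose weights are uniformly large.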
Affiliation(s)
- Shuiping Gou
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
- Jiahui Fu
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
- Yu Sha
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
- Zhen Cao
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
- Zhang Guo
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
- Jason K. Eshraghian
- Department of Electrical and Computer Engineering, University of California, Santa Cruz, Santa Cruz, CA, United States
- Ruimin Li
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
- Licheng Jiao
- Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, China
2
Wei H, Li F. The storage capacity of a directed graph and nodewise autonomous, ubiquitous learning. Front Comput Neurosci 2023; 17:1254355. [PMID: 37927548] [PMCID: PMC10620732] [DOI: 10.3389/fncom.2023.1254355] [Received: 07/07/2023] [Accepted: 09/20/2023]
Abstract
The brain, an exceedingly intricate information processing system, poses a constant challenge to memory research, particularly in comprehending how it encodes, stores, and retrieves information. Cognitive psychology studies memory mechanisms at the level of behavioral experiments and fMRI, while neurobiology studies them at the level of anatomy and electrophysiology. Current research findings are insufficient to provide a comprehensive, detailed explanation of memory processes within the brain. Numerous unknown details must be addressed to establish a complete information processing mechanism connecting the micro level of molecules and cells with the macro level of cognition and behavior. Key issues include how memory content is characterized and distributed within biological neural networks, how information with varying content coexists, and how limited resources and storage capacity are shared. By contrast, a computer hard disk is well understood at every level, from the polarity of magnetic particles at the bottom layer, through the division of tracks and sectors in the middle layer, to the directory tree and file management system at the top layer; our understanding of biological memory is far less complete. Here, biological neural networks are abstracted as directed graphs, and the encoding, storage, and retrieval of information within directed graphs at the cellular level are explored. A memory computational model based on active directed graphs and node-adaptive learning is proposed. First, based on neurobiological characteristics such as neurons' local perspectives, autonomous initiative, and competition for limited resources, a resource-based adaptive learning algorithm for directed graph nodes is designed. To minimize the resource consumption of memory content in directed graphs, two resource-occupancy optimization strategies, lateral inhibition and path pruning, are proposed.
Second, this paper introduces a novel memory mechanism grounded in graph theory, which treats connected subgraphs as the physical manifestation of memory content in directed graphs. The encoding, storage, consolidation, and retrieval operations of the brain's memory system correspond, respectively, to forming subgraphs, accommodating multiple subgraphs, strengthening the connections and connectivity of subgraphs, and activating subgraphs. Lastly, a series of experiments was designed to simulate cognitive processes and evaluate the performance of the directed graph model. Experimental results reveal that the proposed adaptive connectivity learning algorithm for directed graphs has four notable features: (1) it is distributed, self-organizing, and self-adaptive, achieving global-level functions through local node interactions; (2) it enables incremental storage and supports continual learning; (3) it exhibits stable memory performance, surpassing the Hopfield network in memory accuracy, capacity, and diversity in experimental comparisons, and maintains high memory performance on large-scale datasets; (4) it shows a degree of generalization ability, in that the algorithm's macroscopic performance is unaffected by the topological structure of the directed graph. Large-scale, decentralized, node-autonomous directed graphs are thus a suitable simulation substrate. Examining storage problems within directed graphs can reveal the essence of the phenomena and uncover fundamental storage rules hidden within complex neuronal mechanisms such as synaptic plasticity, ion channels, neurotransmitters, and electrochemical activities.
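The subgraph view of memory summarized above, where storing a pattern forms a connected subgraph and retrieval activates it from a partial cue, can be illustrated with a toy model. The class and method names below are hypothetical, and the spreading-activation retrieval is a deliberate simplification of the paper's node-adaptive algorithm:

```python
from collections import defaultdict

class DirectedGraphMemory:
    """Toy sketch: memories as connected subgraphs of a directed graph.
    Storing a pattern adds edges among its nodes; retrieval activates
    every node reachable from a partial cue through stored edges."""

    def __init__(self):
        self.edges = defaultdict(set)  # node -> set of successor nodes

    def store(self, pattern):
        # Encode the pattern as a directed cycle so its subgraph is connected.
        nodes = list(pattern)
        for a, b in zip(nodes, nodes[1:] + nodes[:1]):
            self.edges[a].add(b)

    def retrieve(self, cue):
        # Spreading activation: follow stored edges outward from the cue.
        active, frontier = set(cue), set(cue)
        while frontier:
            frontier = {b for a in frontier for b in self.edges[a]} - active
            active |= frontier
        return active

mem = DirectedGraphMemory()
mem.store(["cat", "fur", "purr"])
print(sorted(mem.retrieve(["cat"])))  # the full stored pattern is recovered
```

Multiple patterns can coexist in the same graph as overlapping subgraphs, which is the resource-sharing situation the paper's lateral-inhibition and path-pruning strategies are designed to manage.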
Affiliation(s)
- Hui Wei
- Laboratory of Algorithms for Cognitive Models, School of Computer Science, Shanghai Key Laboratory of Data Science, Fudan University, Shanghai, China
3
Torres JJ, Marro J. Physics Clues on the Mind Substrate and Attributes. Front Comput Neurosci 2022; 16:836532. [PMID: 35465268] [PMCID: PMC9026167] [DOI: 10.3389/fncom.2022.836532] [Received: 12/15/2021] [Accepted: 02/07/2022]
Abstract
The last decade has witnessed remarkable progress in our understanding of the brain, based mainly on the scrutiny and modeling of the transmission of activity among neurons across lively synapses. A main conclusion, thus far, is that essential features of the mind rely on collective phenomena that emerge from the interaction of many neurons which, mediated by other cells, form a complex network whose details constantly adapt to its activity and surroundings. In parallel, theoretical and computational studies developed to understand many natural and artificial complex systems have explained their emergent features and made precise the role of the interaction dynamics and other conditions behind the different collective phenomena they display. Focusing on promising ideas that arise when comparing these neurobiology and physics studies, the present perspective article briefly reviews such fascinating scenarios, looking for clues about how high-level cognitive processes such as consciousness, intelligence, and identity can emerge. We show that basic concepts of physics, such as dynamical phases and non-equilibrium phase transitions, are quite relevant to brain activity, as determined by factors at the subcellular, cellular, and network levels. We also show how these transitions depend on details of the mechanism by which stimuli are processed against a noisy background and, most importantly, that they may be detected in familiar electroencephalogram (EEG) recordings. Thus, we associate the existence of such phases, which reveal a brain operating at (non-equilibrium) criticality, with the emergence of the most interesting phenomena during memory tasks.
4
Namiranian R, Rahimi Malakshan S, Abrishami Moghaddam H, Khadem A, Jafari R. Normal development of the brain: a survey of joint structural-functional brain studies. Rev Neurosci 2022; 33:745-765. [PMID: 35304982] [DOI: 10.1515/revneuro-2022-0017] [Received: 02/15/2022] [Accepted: 02/17/2022]
Abstract
Joint structural-functional (S-F) developmental studies present a novel approach to the complex neuroscience questions of how the human brain works and how it matures. Joint S-F biomarkers have the inherent potential to model the brain's maturation effectively, fill the information gap in temporal brain atlases, and demonstrate how the brain's performance matures across the lifespan. This review presents the current state of knowledge on the heterochronous and heterogeneous development of S-F links during the maturation period. The S-F relationship has been investigated in early-maturing unimodal and prolonged-maturing transmodal regions of the brain using a variety of structural and functional biomarkers and data acquisition modalities. Joint S-F unimodal studies have employed auditory and visual stimuli, while the main focus of joint S-F transmodal studies has been resting-state and cognitive experiments. However, nonsignificant associations between some structural and functional biomarkers and their maturation show that designing and developing effective S-F biomarkers remains a challenge in the field. Maturational characteristics of brain asymmetries have been poorly investigated by joint S-F studies, and the results have been partially inconsistent with previous nonjoint ones. The inherent complexity of brain performance can be modeled using multifactorial and nonlinear techniques, which are promising methods for simulating the impact of age on S-F relations given their analysis challenges.
Affiliation(s)
- Roxana Namiranian
- Department of Biomedical Engineering, Faculty of Electrical Engineering, K. N. Toosi University of Technology, Tehran 16317-14191, Iran
- Sahar Rahimi Malakshan
- Department of Biomedical Engineering, Faculty of Electrical Engineering, K. N. Toosi University of Technology, Tehran 16317-14191, Iran
- Hamid Abrishami Moghaddam
- Department of Biomedical Engineering, Faculty of Electrical Engineering, K. N. Toosi University of Technology, Tehran 16317-14191, Iran; Inserm UMR 1105, Université de Picardie Jules Verne, 80054 Amiens, France
- Ali Khadem
- Department of Biomedical Engineering, Faculty of Electrical Engineering, K. N. Toosi University of Technology, Tehran 16317-14191, Iran
- Reza Jafari
- Department of Electrical and Computer Engineering, Thompson Engineering Building, University of Western Ontario, London, ON N6A 5B9, Canada
5
Millán AP, Torres JJ, Johnson S, Marro J. Growth strategy determines the memory and structural properties of brain networks. Neural Netw 2021; 142:44-56. [PMID: 33984735] [DOI: 10.1016/j.neunet.2021.04.027] [Received: 10/19/2020] [Revised: 03/04/2021] [Accepted: 04/20/2021]
Abstract
The interplay between structure and function affects the emergent properties of many natural systems. Here we use an adaptive neural network model that couples activity and topological dynamics and reproduces the experimental temporal profiles of synaptic density observed in the brain. We prove that the existence of a transient period of relatively high synaptic connectivity is critical for the development of the system in the presence of noise, enabling the resulting network to recover stored memories. Moreover, we show that intermediate synaptic densities provide optimal developmental paths with minimum energy consumption, and that it is ultimately the transient heterogeneity in the network that determines its evolution. These results could explain why the pruning curves observed in actual brain areas exhibit their characteristic temporal profiles, and they also suggest new design strategies for building biologically inspired neural networks with particular information processing capabilities.
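The transient period of high connectivity referenced above, overgrowth followed by pruning, can be sketched qualitatively as a rise-then-decay density curve with an interior peak. The functional form and constants below are illustrative assumptions, not the paper's adaptive model:

```python
import numpy as np

def synaptic_density_profile(t, rho0=0.3, peak=1.0, tau_growth=2.0, tau_prune=10.0):
    """Toy rise-then-decay curve: fast synaptogenesis (tau_growth)
    followed by slower pruning (tau_prune) toward a baseline rho0.
    All constants here are illustrative, not fitted to data."""
    growth = 1.0 - np.exp(-t / tau_growth)
    decay = np.exp(-t / tau_prune)
    return rho0 + (peak - rho0) * growth * decay

t = np.linspace(0, 50, 501)
rho = synaptic_density_profile(t)
t_peak = t[np.argmax(rho)]
print(f"density peaks at t ≈ {t_peak:.1f}, then declines toward baseline")
```

The interior maximum is the "transient period of relatively high synaptic connectivity": density overshoots its final value before pruning brings it back down.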
Affiliation(s)
- Ana P Millán
- Amsterdam UMC, Vrije Universiteit Amsterdam, Department of Clinical Neurophysiology and MEG Center, Amsterdam Neuroscience, De Boelelaan 1117, Amsterdam, The Netherlands
- Joaquín J Torres
- Institute 'Carlos I' for Theoretical and Computational Physics, University of Granada, Spain
- Samuel Johnson
- School of Mathematics, University of Birmingham, Edgbaston B15 2TT, UK; Alan Turing Institute, London NW1 2DB, UK
- J Marro
- Institute 'Carlos I' for Theoretical and Computational Physics, University of Granada, Spain
6
Stevenson R, Samokhina E, Rossetti I, Morley JW, Buskila Y. Neuromodulation of Glial Function During Neurodegeneration. Front Cell Neurosci 2020; 14:278. [PMID: 32973460] [PMCID: PMC7473408] [DOI: 10.3389/fncel.2020.00278] [Received: 05/24/2020] [Accepted: 08/05/2020]
Abstract
Glia, non-excitable cells once considered merely the connective tissue between neurons, are now acknowledged for their essential contribution to multiple physiological processes, including learning, memory formation, excitability, synaptic plasticity, ion homeostasis, and energy metabolism. Moreover, as glia are key players in the brain's immune system and provide structural and nutritional support for neurons, they are intimately involved in multiple neurological disorders. Recent advances have demonstrated that glial cells, specifically microglia and astroglia, are involved in several neurodegenerative diseases, including amyotrophic lateral sclerosis (ALS), epilepsy, Parkinson's disease (PD), Alzheimer's disease (AD), and frontotemporal dementia (FTD). While there is compelling evidence for glial modulation of synaptic formation and regulation that affects neuronal signal processing and activity, in this manuscript we review recent findings on neuronal activity that affects glial function, specifically during neurodegenerative disorders. We discuss the nature of each glial malfunction, its specificity to each disorder, and its overall contribution to disease progression, and assess its potential as a future therapeutic target.
Affiliation(s)
- Rebecca Stevenson
- School of Medicine, Western Sydney University, Campbelltown, NSW, Australia
- Evgeniia Samokhina
- School of Medicine, Western Sydney University, Campbelltown, NSW, Australia
- Ilaria Rossetti
- School of Medicine, Western Sydney University, Campbelltown, NSW, Australia
- John W. Morley
- School of Medicine, Western Sydney University, Campbelltown, NSW, Australia
- Yossi Buskila
- School of Medicine, Western Sydney University, Campbelltown, NSW, Australia
- International Centre for Neuromorphic Systems, The MARCS Institute for Brain, Behaviour and Development, Penrith, NSW, Australia
7
Yuan Y, Liu J, Zhao P, Xing F, Huo H, Fang T. Structural Insights Into the Dynamic Evolution of Neuronal Networks as Synaptic Density Decreases. Front Neurosci 2019; 13:892. [PMID: 31507365] [PMCID: PMC6714520] [DOI: 10.3389/fnins.2019.00892] [Received: 06/08/2019] [Accepted: 08/08/2019]
Abstract
The human brain is thought to be an extremely complex but efficient computing engine, processing vast amounts of information from a changing world. The decline in the synaptic density of neuronal networks is one of the most important characteristics of brain development, and is closely related to synaptic pruning, synaptic growth, synaptic plasticity, and energy metabolism. However, because of technical limitations in observing large-scale neuronal networks dynamically connected through synapses, how neuronal networks are organized and evolve as their synaptic density declines remains unclear. Here, by establishing a biologically plausible neuronal network model, we show that the connectivity and efficiency of neuronal networks can improve even as synaptic density declines. Importantly, by analyzing the degree distribution, we also find that both the scale-free characteristic of neuronal networks and the emergence of hub neurons rely on the spatial distance between neurons. These findings may promote our understanding of neuronal networks in the brain and offer guidance for the design of neuronal network models.
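The reported dependence of hub emergence on spatial distance can be illustrated with a toy distance-dependent wiring rule; the exponential kernel and all constants here are assumptions for illustration, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(42)
n, lam = 300, 0.1  # number of neurons, assumed distance scale

# Random neuron positions in the unit square.
pos = rng.random((n, 2))
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)

# Distance-dependent wiring: closer neurons connect more often.
p = np.exp(-d / lam)
np.fill_diagonal(p, 0.0)  # no self-connections
adj = rng.random((n, n)) < p

deg = adj.sum(axis=1)
print(f"mean degree {deg.mean():.1f}, max degree {deg.max()} (candidate hub)")
```

Neurons that happen to sit in densely populated regions accumulate far more connections than the average, which is the distance-driven hub formation the abstract points to; a flat (distance-independent) connection probability would instead yield a narrow binomial degree distribution without hubs.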
Affiliation(s)
- Ye Yuan
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China; Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Jian Liu
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China; Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Peng Zhao
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China; Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Fu Xing
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China; Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Hong Huo
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China; Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China
- Tao Fang
- Department of Automation, Shanghai Jiao Tong University, Shanghai, China; Key Laboratory of System Control and Information Processing, Ministry of Education, Shanghai, China