1. Pilzak A, Calderini M, Berberian N, Thivierge JP. Role of short-term plasticity and slow temporal dynamics in enhancing time series prediction with a brain-inspired recurrent neural network. Chaos 2025;35:023153. PMID: 39977307. DOI: 10.1063/5.0233158.
Abstract
Typical reservoir networks are based on random connectivity patterns that differ from brain circuits in two important ways. First, traditional reservoir networks lack synaptic plasticity among recurrent units, whereas cortical networks exhibit plasticity across all neuronal types and cortical layers. Second, reservoir networks utilize random Gaussian connectivity, while cortical networks feature a heavy-tailed distribution of synaptic strengths. It remains unclear what computational advantages these features confer for predicting complex time series. In this study, we integrated short-term plasticity (STP) and lognormal connectivity into a novel recurrent neural network (RNN) framework. The model exhibited rich patterns of population activity characterized by slow coordinated fluctuations. Using graph spectral decomposition, we show that weighted networks with lognormal connectivity and STP yield higher complexity than several other graph types. When tested on tasks involving the prediction of complex time series, the RNN model outperformed both a baseline model with random connectivity and several other network architectures. Overall, our results underscore the potential of incorporating brain-inspired features such as STP and heavy-tailed connectivity to enhance the robustness and performance of artificial neural networks in complex data prediction and signal processing tasks.
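To make the architecture concrete, the following is a minimal sketch of a reservoir with heavy-tailed (lognormal) recurrent weights and Tsodyks-Markram-style short-term plasticity. All constants, shapes, and the exact STP form are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 300  # reservoir size (illustrative)

# Heavy-tailed (lognormal) recurrent weights with random signs,
# rescaled to a spectral radius just below 1 for the echo-state property.
W = rng.lognormal(mean=0.0, sigma=1.0, size=(N, N))
W *= rng.choice([-1.0, 1.0], size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

W_in = rng.normal(0.0, 1.0, size=(N, 1))  # input weights

# Tsodyks-Markram-style STP state (constants illustrative)
U, tau_f, tau_d, dt = 0.2, 0.6, 0.3, 0.01
u = np.full(N, U)   # facilitation variables
x = np.ones(N)      # synaptic resources (depression)

def step(r, s):
    """One Euler step; recurrent drive is scaled by the STP gain u*x.
    Rates are kept nonnegative so the STP equations stay well behaved."""
    global u, x
    drive = W @ (u * x * r) + W_in @ s
    r_new = np.maximum(np.tanh(drive), 0.0)
    u += dt * ((U - u) / tau_f + U * (1.0 - u) * r)
    x += dt * ((1.0 - x) / tau_d - u * x * r)
    return r_new

# Toy usage: drive the reservoir with a constant input
r, s = np.zeros(N), np.array([0.5])
for _ in range(100):
    r = step(r, s)
```

A linear readout fit by ridge regression over the collected reservoir states would complete the standard reservoir-computing pipeline.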
Affiliation(s)
- Artem Pilzak
- School of Psychology, University of Ottawa, 156 Jean-Jacques Lussier, Ottawa, Ontario K1N 6N5, Canada
- Matias Calderini
- School of Psychology, University of Ottawa, 156 Jean-Jacques Lussier, Ottawa, Ontario K1N 6N5, Canada
- Nareg Berberian
- School of Psychology, University of Ottawa, 156 Jean-Jacques Lussier, Ottawa, Ontario K1N 6N5, Canada
- Jean-Philippe Thivierge
- School of Psychology, University of Ottawa, 156 Jean-Jacques Lussier, Ottawa, Ontario K1N 6N5, Canada
- Brain and Mind Research Institute, University of Ottawa, 451 Smyth Rd., Ottawa, Ontario K1H 8M5, Canada
2. Xiao Y, Adegoke M, Leung CS, Leung KW. Robust noise-aware algorithm for randomized neural network and its convergence properties. Neural Netw 2024;173:106202. PMID: 38422835. DOI: 10.1016/j.neunet.2024.106202.
Abstract
Randomized neural networks (RNNs), such as the random vector functional link (RVFL) network and the extreme learning machine (ELM), are a widely accepted and efficient approach to constructing single-hidden-layer feedforward networks (SLFNs). Owing to their exceptional approximation capabilities, RNNs are used extensively in many fields. While the RNN concept has shown great promise, its performance can be unpredictable under imperfect conditions such as weight noise and outliers. There is therefore a need for more reliable and robust RNN algorithms. To address this issue, this paper proposes a new objective function that accounts for the combined effect of weight noise and training-data outliers in RVFL networks. Based on the half-quadratic optimization method, we then propose a novel algorithm, named noise-aware RNN (NARNN), to optimize the proposed objective function. The convergence of the NARNN is also theoretically validated. We also discuss how the NARNN can be applied to ensemble deep RVFL (edRVFL) networks. Finally, we present an extension of the NARNN that concurrently addresses weight noise, stuck-at faults, and outliers. The experimental results demonstrate that the proposed algorithm outperforms a number of state-of-the-art robust RNN algorithms.
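The NARNN objective and its half-quadratic updates are specific to the paper; as a rough illustration of the general idea, the sketch below fits an RVFL readout with an iteratively reweighted ridge solver that down-weights outlying samples. The weighting scheme, hyperparameters, and function name are hypothetical stand-ins, not the paper's algorithm:

```python
import numpy as np

def narnn_style_fit(X, y, n_hidden=100, lam=1e-2, n_iters=5, seed=0):
    """RVFL with an iteratively reweighted ridge readout -- a generic
    robust-regression stand-in, not the exact NARNN objective."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)        # random (untrained) hidden features
    D = np.hstack([X, H])         # RVFL: direct links + hidden features
    w = np.ones(len(y))           # per-sample robustness weights
    for _ in range(n_iters):
        Dw = D * w[:, None]       # weighted design matrix
        beta = np.linalg.solve(Dw.T @ D + lam * np.eye(D.shape[1]),
                               Dw.T @ y)
        r = y - D @ beta          # residuals
        scale = 1.4826 * np.median(np.abs(r)) + 1e-12
        w = 1.0 / (1.0 + (r / scale) ** 2)  # down-weight outliers
    return W, b, beta

# Toy usage: noisy linear target with a few gross outliers
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
y[:5] += 20.0                     # inject outliers
W, b, beta = narnn_style_fit(X, y)
```

The reweighting loop plays the role that the half-quadratic auxiliary variables play in the paper: each pass solves a weighted ridge problem whose weights shrink the influence of large residuals.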
Affiliation(s)
- Yuqi Xiao
- Department of Electrical Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, HKSAR, China; State Key Laboratory of Terahertz and Millimeter Waves, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, HKSAR, China; Shenzhen Key Laboratory of Millimeter Wave and Wideband Wireless Communications, CityU Shenzhen Research Institute, Shenzhen, 518057, China.
- Muideen Adegoke
- Department of Electrical Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, HKSAR, China.
- Chi-Sing Leung
- Department of Electrical Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, HKSAR, China.
- Kwok Wa Leung
- Department of Electrical Engineering, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, HKSAR, China; State Key Laboratory of Terahertz and Millimeter Waves, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, HKSAR, China; Shenzhen Key Laboratory of Millimeter Wave and Wideband Wireless Communications, CityU Shenzhen Research Institute, Shenzhen, 518057, China.
3. Thivierge JP, Giraud É, Lynn M. Toward a brain-inspired theory of artificial learning. Cognit Comput 2023. DOI: 10.1007/s12559-023-10121-y.
4. Thivierge JP, Giraud É, Lynn M, Théberge Charbonneau A. Key role of neuronal diversity in structured reservoir computing. Chaos 2022;32:113130. PMID: 36456321. DOI: 10.1063/5.0111131.
Abstract
Chaotic time series have been captured by reservoir computing models, in which the output weights of a recurrent neural network are trained in a supervised manner. These models, however, are typically limited to randomly connected networks of homogeneous units. Here, we propose a new class of structured reservoir models that incorporates a diversity of cell types and their known connections. In a first version of the model, the reservoir was composed of mean-rate units separated into pyramidal, parvalbumin, and somatostatin cells. Stability analysis of this model revealed two distinct dynamical regimes: (i) an inhibition-stabilized network (ISN), where strong recurrent excitation is balanced by strong inhibition, and (ii) a non-ISN network with weak excitation. These results were extended to a leaky integrate-and-fire model that captured the different cell types along with their network architecture. ISN and non-ISN reservoir networks were trained to relay and generate a chaotic Lorenz attractor. Despite their superior performance, ISN networks operate near the limits of stability, where external perturbations yield a rapid divergence in output. The proposed framework of structured reservoir computing opens avenues for exploring how neural microcircuits can balance performance and stability when representing time series through distinct dynamical regimes.
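As an illustration of the structured-reservoir idea, the sketch below builds a rate network whose connectivity is organized into pyramidal (E), parvalbumin (PV), and somatostatin (SST) blocks with sign-constrained weights. The population sizes and block gains are invented for illustration and are not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative cell counts: excitatory pyramidal (E), parvalbumin (PV),
# and somatostatin (SST) units.
sizes = {"E": 200, "PV": 40, "SST": 40}
idx, start = {}, 0
for pop, size in sizes.items():
    idx[pop] = slice(start, start + size)
    start += size
N = start

# Block gains (keys are (target, source)); values are illustrative.
# E projections are excitatory (positive), PV/SST inhibitory (negative).
gains = {("E", "E"): 1.2,  ("E", "PV"): -1.0, ("E", "SST"): -0.8,
         ("PV", "E"): 1.0, ("PV", "PV"): -0.5,
         ("SST", "E"): 0.8}

W = np.zeros((N, N))
for (tgt, src), g in gains.items():
    n_src = sizes[src]
    block = np.abs(rng.normal(0.0, 1.0, (sizes[tgt], n_src)))
    W[idx[tgt], idx[src]] = g * block / np.sqrt(n_src)

def step(r, inp, dt=0.01, tau=0.05):
    """Euler step of a rectified-linear rate network. With strong E->E
    recurrence balanced by inhibition, the network sits in the
    inhibition-stabilized (ISN) regime described in the abstract."""
    return r + (dt / tau) * (-r + np.maximum(W @ r + inp, 0.0))

# Toy usage: drive the excitatory population with constant input
r = np.zeros(N)
inp = np.zeros(N)
inp[idx["E"]] = 1.0
for _ in range(500):
    r = step(r, inp)
```

With the E-to-E gain strong enough that stability depends on feedback inhibition, the network would operate in the ISN regime; a readout over its states could then be trained to relay or generate a Lorenz trajectory, as in the paper's tasks.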
Affiliation(s)
- Jean-Philippe Thivierge
- University of Ottawa Brain and Mind Research Institute, 451 Smyth Rd., Ottawa, Ontario K1H 8M5, Canada
- Eloïse Giraud
- School of Psychology, University of Ottawa, 156 Jean-Jacques Lussier, Ottawa, Ontario K1N 6N5, Canada
- Michael Lynn
- University of Ottawa Brain and Mind Research Institute, 451 Smyth Rd., Ottawa, Ontario K1H 8M5, Canada