An adaptive threshold neuron for recurrent spiking neural networks with nanodevice hardware implementation. Nat Commun 2021;12:4234. [PMID: 34244491; PMCID: PMC8270926; DOI: 10.1038/s41467-021-24427-8]
Abstract
We propose a Double EXponential Adaptive Threshold (DEXAT) neuron model that improves the performance of neuromorphic Recurrent Spiking Neural Networks (RSNNs) by providing faster convergence, higher accuracy, and a flexible long short-term memory. We present a hardware-efficient methodology to realize DEXAT neurons using tightly coupled circuit-device interactions and experimentally demonstrate the DEXAT neuron block using oxide-based non-filamentary resistive switching devices. Using experimentally extracted parameters, we simulate a full RSNN that achieves a classification accuracy of 96.1% on the SMNIST dataset and 91% on the Google Speech Commands (GSC) dataset. We also demonstrate full end-to-end real-time inference for speech recognition using DEXAT neurons built from real fabricated resistive memory circuits. Finally, we investigate the impact of nanodevice variability and endurance, illustrating the robustness of DEXAT-based RSNNs.
Recurrent spiking neural networks have garnered interest due to their energy efficiency; however, they suffer from lower accuracy compared to conventional neural networks. Here, the authors present an alternative neuron model and its efficient hardware implementation, demonstrating high classification accuracy across a range of datasets.
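The abstract describes a neuron whose firing threshold adapts through two exponentially decaying traces with different time constants (hence "double exponential"). The following minimal Python sketch illustrates that general idea only; the parameter values, the soft-reset rule, and the helper name dexat_step are illustrative assumptions and not the paper's actual implementation or hardware mapping.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper).
dt = 1.0          # time step (ms)
tau_m = 20.0      # membrane time constant (ms)
tau_b1 = 30.0     # fast threshold-adaptation time constant (ms)
tau_b2 = 300.0    # slow threshold-adaptation time constant (ms)
beta1, beta2 = 0.5, 1.0   # weights of the two adaptation traces
b0 = 1.0          # baseline firing threshold

def dexat_step(v, b1, b2, i_in):
    """One Euler step of a DEXAT-style neuron: the firing threshold is the
    baseline b0 plus a fast and a slow exponentially decaying trace."""
    v = v + dt / tau_m * (-v + i_in)          # leaky membrane integration
    threshold = b0 + beta1 * b1 + beta2 * b2  # adaptive threshold
    spike = float(v >= threshold)
    v = v - spike * threshold                 # soft reset on spike (assumed)
    b1 = b1 * np.exp(-dt / tau_b1) + spike    # fast trace decays, jumps on spike
    b2 = b2 * np.exp(-dt / tau_b2) + spike    # slow trace decays, jumps on spike
    return v, b1, b2, spike

# Usage: drive the neuron with a constant input current and count spikes.
v, b1, b2 = 0.0, 0.0, 0.0
spikes = []
for _ in range(200):
    v, b1, b2, s = dexat_step(v, b1, b2, i_in=1.5)
    spikes.append(s)
print("spike count:", int(sum(spikes)))
```

Because the slow trace persists long after the fast one has decayed, the effective threshold retains information about recent firing over two distinct timescales, which is the mechanism the abstract credits for the flexible long short-term memory in the RSNN.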