1
Han J, Zhou L. Fixed-time synchronization of proportional delay memristive complex-valued competitive neural networks. Neural Netw 2025; 188:107411. [PMID: 40153880] [DOI: 10.1016/j.neunet.2025.107411]
Abstract
Fixed-time synchronization (FXS) is considered for memristive complex-valued competitive neural networks (MCVCNNs) with proportional delays. Two less conservative criteria guaranteeing FXS of MCVCNNs are established by means of the Lyapunov method and inequality techniques. Suitable switching controllers are designed by defining different norms of complex numbers, instead of treating complex-valued neural networks as two real-valued systems. Furthermore, the settling time (ST) is estimated. Finally, two simulations confirm the effectiveness of the proposed criteria and illustrate a practical application to image protection.
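Criteria of this kind typically rest on a classical fixed-time stability lemma; as a hedged sketch (the paper's own ST estimate may be sharper and may depend on the proportional delays), if a Lyapunov function V along the synchronization error satisfies a two-power differential inequality, the settling time is bounded independently of the initial state:

```latex
% Polyakov-type fixed-time stability lemma (illustrative; not the paper's exact bound)
\dot{V}(t) \le -\alpha V^{p}(t) - \beta V^{q}(t),
\qquad \alpha, \beta > 0,\; 0 < p < 1 < q
\;\Longrightarrow\;
T(x_0) \le \frac{1}{\alpha (1 - p)} + \frac{1}{\beta (q - 1)}
\quad \text{for all initial states } x_0 .
```

The uniform bound over all initial states is what distinguishes fixed-time from finite-time synchronization.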
Affiliation(s)
- Jiapeng Han
- School of Mathematics Science and Institute of Mathematics and Interdisciplinary Sciences, Tianjin Normal University, Tianjin, 300387, China
- Liqun Zhou
- School of Mathematics Science and Institute of Mathematics and Interdisciplinary Sciences, Tianjin Normal University, Tianjin, 300387, China
2
Cao B, Nie X, Zheng WX, Cao J. Multistability of State-Dependent Switched Fractional-Order Hopfield Neural Networks With Mexican-Hat Activation Function and Its Application in Associative Memories. IEEE Trans Neural Netw Learn Syst 2025; 36:1213-1227. [PMID: 38048243] [DOI: 10.1109/tnnls.2023.3334871]
Abstract
The multistability of state-dependent switched fractional-order Hopfield neural networks (FOHNNs) with a Mexican-hat activation function (AF), and its application in associative memories, are investigated in this article. Based on Brouwer's fixed-point theorem, the contraction mapping principle and the theory of fractional-order differential equations, some sufficient conditions are established to ensure the existence, exact existence and local stability of multiple equilibrium points (EPs) in the sense of Filippov, and the positively invariant sets are also estimated. In particular, the analysis of the existence and stability of EPs is quite different from that in the literature because the considered system involves both a fractional-order derivative and state-dependent switching. It should be pointed out that, compared with the results in the literature, both the total number of EPs and the number of stable EPs are increased, with counts that depend on the system dimension. Besides, a new method is designed to realize associative memories for grayscale and color images by introducing a deviation vector, which, in comparison with existing works, not only improves the utilization efficiency of EPs but also reduces the system dimension and computational burden. Finally, the effectiveness of the theoretical results is illustrated by four numerical simulations.
3
Abernot M, Azemard N, Todri-Sanial A. Oscillatory neural network learning for pattern recognition: an on-chip learning perspective and implementation. Front Neurosci 2023; 17:1196796. [PMID: 37397448] [PMCID: PMC10308018] [DOI: 10.3389/fnins.2023.1196796]
Abstract
In the human brain, learning is continuous, whereas current AI learning algorithms are pre-trained, making the model static and predetermined. However, even in AI models, the environment and input data change over time, so there is a need to study continual learning algorithms, and in particular to investigate how to implement them on-chip. In this work, we focus on Oscillatory Neural Networks (ONNs), a neuromorphic computing paradigm that performs auto-associative memory tasks, like Hopfield Neural Networks (HNNs). We study the adaptability of HNN unsupervised learning rules to on-chip learning with ONNs. In addition, we propose a first solution for implementing unsupervised on-chip learning using a digital ONN design. We show that the architecture enables efficient ONN on-chip learning with the Hebbian and Storkey learning rules in hundreds of microseconds for networks with up to 35 fully connected digital oscillators.
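The Hebbian rule studied here can be sketched in its classical Hopfield form (rather than the paper's phase-coupled digital ONN); the pattern values and network size below are illustrative, not those of the on-chip design:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = (1/N) * sum_k x_k x_k^T, with zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous sign-threshold recall, iterated until a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0
        if np.array_equal(new, state):
            break
        state = new
    return state

# store two +/-1 patterns of length 8 and recover one from a corrupted probe
pats = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                 [1, 1, 1, 1, -1, -1, -1, -1]], dtype=float)
W = hebbian_weights(pats)
probe = pats[0].copy()
probe[0] = -probe[0]            # flip one bit
print(recall(W, probe))         # recovers pats[0]
```

In the ONN setting, the same weight matrix would instead set coupling strengths between oscillators, and the +/-1 states would be encoded as 0/180-degree phase relationships.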
Affiliation(s)
- Madeleine Abernot
- Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier (LIRMM), Department of Microelectronics, University of Montpellier, CNRS, Montpellier, France
- Nadine Azemard
- Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier (LIRMM), Department of Microelectronics, University of Montpellier, CNRS, Montpellier, France
- Aida Todri-Sanial
- Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier (LIRMM), Department of Microelectronics, University of Montpellier, CNRS, Montpellier, France
- Electrical Engineering Department, Eindhoven University of Technology, Eindhoven, Netherlands
4
Gao S, Zhou M, Wang Z, Sugiyama D, Cheng J, Wang J, Todo Y. Fully Complex-Valued Dendritic Neuron Model. IEEE Trans Neural Netw Learn Syst 2023; 34:2105-2118. [PMID: 34487498] [DOI: 10.1109/tnnls.2021.3105901]
Abstract
A single dendritic neuron model (DNM) that captures the nonlinear information-processing ability of dendrites has been widely used for classification and prediction. Complex-valued neural networks consisting of a number of multiple/deep-layer McCulloch-Pitts neurons have achieved great success since neural computing was first applied to signal processing. Yet complex-valued representations have not appeared in single-neuron architectures. In this article, we first extend the DNM from the real-valued domain to the complex-valued one. The performance of the complex-valued DNM (CDNM) is evaluated on a complex XOR problem, a non-minimum-phase equalization problem, and a real-world wind prediction task. A comparative analysis of a set of elementary transcendental functions as activation functions is also carried out, along with preparatory experiments for determining hyperparameters. The experimental results indicate that the proposed CDNM significantly outperforms the real-valued DNM, a complex-valued multi-layer perceptron, and other complex-valued neuron models.
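A minimal sketch of a dendritic neuron forward pass extended to complex values, assuming the usual four-layer DNM structure (synaptic, dendritic, membrane, soma); the split-type activation and all sizes and weights here are assumptions for illustration, not the paper's chosen activation or trained parameters:

```python
import numpy as np

def c_sigmoid(z, k=1.0):
    # split-type activation: a real sigmoid applied separately to the real
    # and imaginary parts (one common choice; the paper compares several
    # elementary transcendental functions)
    s = lambda t: 1.0 / (1.0 + np.exp(-k * t))
    return s(z.real) + 1j * s(z.imag)

def cdnm_forward(x, W, Q):
    """Dendritic neuron forward pass with complex values.
    x: (n,) complex inputs; W, Q: (m, n) complex weights and thresholds
    for m dendritic branches."""
    Y = c_sigmoid(W * x - Q)     # synaptic layer, per branch and input
    Z = np.prod(Y, axis=1)       # dendritic layer: multiplicative gating
    V = np.sum(Z)                # membrane layer: sum over branches
    return c_sigmoid(V)          # soma output

rng = np.random.default_rng(0)
n, m = 4, 3
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
W = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
Q = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
out = cdnm_forward(x, W, Q)
print(out)   # one complex number; real and imaginary parts lie in (0, 1)
```

The multiplicative dendritic layer is what gives the single neuron its nonlinear capacity; training would adjust W and Q by gradient descent on a task loss.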
5
Synchronization analysis and parameters identification of uncertain delayed fractional-order BAM neural networks. Neural Comput Appl 2022. [DOI: 10.1007/s00521-022-07791-4]
6
Sajjan M, Li J, Selvarajan R, Sureshbabu SH, Kale SS, Gupta R, Singh V, Kais S. Quantum machine learning for chemistry and physics. Chem Soc Rev 2022; 51:6475-6573. [PMID: 35849066] [DOI: 10.1039/d2cs00203e]
Abstract
Machine learning (ML) has emerged as a formidable force for identifying hidden but pertinent patterns within a given data set, with the objective of subsequently generating automated predictive behavior. In recent years, ML and its close cousin, deep learning (DL), have ushered in unprecedented developments in all areas of the physical sciences, especially chemistry. Not only classical variants of ML but also those trainable on near-term quantum hardware have been developed, with promising outcomes. Such algorithms have revolutionized materials design and the performance of photovoltaics, electronic structure calculations of ground and excited states of correlated matter, computation of force fields and potential energy surfaces informing chemical reaction dynamics, reactivity-inspired rational strategies for drug design, and even the classification of phases of matter with accurate identification of emergent criticality. In this review we explicate a subset of such topics and delineate the contributions made by both classical and quantum-computing-enhanced machine learning algorithms over the past few years. We not only present a brief overview of the well-known techniques but also highlight their learning strategies using statistical-physical insight. The objective of the review is not only to foster exposition of the aforesaid techniques but also to empower and promote cross-pollination among future research in all areas of chemistry that can benefit from ML and, in turn, potentially accelerate the growth of such algorithms.
Affiliation(s)
- Manas Sajjan
- Department of Chemistry, Purdue University, West Lafayette, IN 47907, USA; Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA
- Junxu Li
- Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA; Department of Physics and Astronomy, Purdue University, West Lafayette, IN 47907, USA
- Raja Selvarajan
- Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA; Department of Physics and Astronomy, Purdue University, West Lafayette, IN 47907, USA
- Shree Hari Sureshbabu
- Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA; Elmore Family School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907, USA
- Sumit Suresh Kale
- Department of Chemistry, Purdue University, West Lafayette, IN 47907, USA; Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA
- Rishabh Gupta
- Department of Chemistry, Purdue University, West Lafayette, IN 47907, USA; Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA
- Vinit Singh
- Department of Chemistry, Purdue University, West Lafayette, IN 47907, USA; Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA
- Sabre Kais
- Department of Chemistry, Purdue University, West Lafayette, IN 47907, USA; Purdue Quantum Science and Engineering Institute, Purdue University, West Lafayette, Indiana 47907, USA; Department of Physics and Astronomy, Purdue University, West Lafayette, IN 47907, USA; Elmore Family School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907, USA
7
Adaptive synchronization of fractional-order complex-valued coupled neural networks via direct error method. Neurocomputing 2022. [DOI: 10.1016/j.neucom.2021.11.015]
8
Liu Y, Shen B, Sun J. Stubborn state estimation for complex-valued neural networks with mixed time delays: the discrete time case. Neural Comput Appl 2022. [DOI: 10.1007/s00521-021-06707-y]
9
Kobayashi M. Noise-Robust Projection Rule for Rotor and Matrix-Valued Hopfield Neural Networks. IEEE Trans Neural Netw Learn Syst 2022; 33:567-576. [PMID: 33048772] [DOI: 10.1109/tnnls.2020.3028091]
Abstract
A complex-valued Hopfield neural network (CHNN) has weak noise tolerance due to rotational invariance. Some alternatives to the CHNN, such as the rotor Hopfield neural network (RHNN) and the matrix-valued Hopfield neural network (MHNN), resolve rotational invariance and improve the noise tolerance. However, RHNNs and MHNNs with projection rules suffer from a different problem: self-feedback. If the self-feedback is reduced, the noise tolerance is expected to improve further. To reduce the self-feedback, noise-robust projection rules are introduced: the stability conditions are extended, and the self-feedback is reduced based on the extended conditions. Computer simulations support that the noise tolerance is improved; in particular, it becomes more robust against an increase in the number of training patterns.
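The standard projection rule that such noise-robust variants build on can be sketched as follows, here for complex phasor patterns (the rotor/matrix-valued versions and the self-feedback reduction itself are not reproduced; the pattern values are illustrative):

```python
import numpy as np

def projection_weights(X):
    """Projection rule: W = X (X^H X)^{-1} X^H, where the columns of X
    are the training patterns. Each stored pattern is an exact fixed
    point: W @ x = x."""
    return X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T

# two 4-state phasor patterns of length 6 (illustrative values)
states = np.array([[0, 1], [1, 3], [2, 0], [3, 2], [0, 0], [2, 1]])
X = np.exp(2j * np.pi * states / 4)
W = projection_weights(X)

print(np.allclose(W @ X, X))   # True: stored patterns are fixed points
print(np.diag(W).real)         # the self-feedback terms such rules aim to shrink
```

As the number of training patterns grows, the diagonal (self-feedback) entries of W grow, which is exactly the degradation the paper's extended stability conditions are used to counteract.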
10
Novel global polynomial stability criteria of impulsive complex-valued neural networks with multi-proportional delays. Neural Comput Appl 2021. [DOI: 10.1007/s00521-021-06555-w]
11
Abernot M, Gil T, Jiménez M, Núñez J, Avellido MJ, Linares-Barranco B, Gonos T, Hardelin T, Todri-Sanial A. Digital Implementation of Oscillatory Neural Network for Image Recognition Applications. Front Neurosci 2021; 15:713054. [PMID: 34512246] [PMCID: PMC8427800] [DOI: 10.3389/fnins.2021.713054]
Abstract
Computing paradigms based on von Neumann architectures cannot keep up with the ever-increasing data growth (also called the “data deluge gap”). This has prompted the investigation of novel computing paradigms and design approaches at all levels, from materials to system-level implementations and applications. Oscillatory Neural Networks (ONNs) are an alternative computing approach, based on artificial neural networks, that uses oscillators to compute. ONNs can perform computations efficiently and can be used to build a more extensive neuromorphic system. Here, we address a fundamental question: can we efficiently perform artificial intelligence applications with ONNs? We present a digital ONN implementation as a proof of concept of the ONN “computing-in-phase” approach for pattern recognition applications. To the best of our knowledge, this is the first attempt to implement an FPGA-based fully digital ONN. We report ONN accuracy, training, inference, memory capacity, operating frequency, and hardware resources based on simulations and implementations of 5 × 3 and 10 × 6 ONNs. We present the digital ONN implementation on FPGA for pattern recognition applications such as recognizing digits from a camera stream, and we discuss practical challenges and future directions in implementing digital ONNs.
Affiliation(s)
- Madeleine Abernot
- Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier, University of Montpellier, CNRS, Montpellier, France
- Thierry Gil
- Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier, University of Montpellier, CNRS, Montpellier, France
- Manuel Jiménez
- Instituto de Microelectronica de Sevilla, IMSE-CNM, CSIC, Universidad de Sevilla, Sevilla, Spain
- Juan Núñez
- Instituto de Microelectronica de Sevilla, IMSE-CNM, CSIC, Universidad de Sevilla, Sevilla, Spain
- María J Avellido
- Instituto de Microelectronica de Sevilla, IMSE-CNM, CSIC, Universidad de Sevilla, Sevilla, Spain
- Bernabé Linares-Barranco
- Instituto de Microelectronica de Sevilla, IMSE-CNM, CSIC, Universidad de Sevilla, Sevilla, Spain
- Aida Todri-Sanial
- Laboratoire d'Informatique, de Robotique et de Microélectronique de Montpellier, University of Montpellier, CNRS, Montpellier, France
12
Kobayashi M. Storage Capacity of Quaternion-Valued Hopfield Neural Networks With Dual Connections. Neural Comput 2021; 33:2226-2240. [PMID: 34310674] [DOI: 10.1162/neco_a_01405]
Abstract
A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of the CHNN. Dual connections (DCs) are introduced to QHNNs to improve the noise tolerance. The DCs take advantage of the noncommutativity of quaternions and consist of two weights between neurons. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half of that for CHNNs and equals that of conventional QHNNs. The small storage capacity of QHNNs with DCs is caused by the projection rule, not the architecture. In this work, the Hebbian rule is introduced, and it is proved by stochastic analysis that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
13
Kobayashi M. Hyperbolic-valued Hopfield neural networks in hybrid mode. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2021.01.121]
14
Kobayashi M. Noise Robust Projection Rule for Klein Hopfield Neural Networks. Neural Comput 2021; 33:1698-1716. [PMID: 34496385] [DOI: 10.1162/neco_a_01385]
Abstract
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. CHNNs and QHNNs have weak noise tolerance due to the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving rotational invariance. However, KHNNs have another disadvantage, self-feedback, which is a major factor in the deterioration of noise tolerance. In this work, the stability conditions of KHNNs are extended, and the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves the noise tolerance by reducing self-feedback. Computer simulations support that the proposed projection rule improves the noise tolerance of KHNNs.
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
15
Kobayashi M. Two-Level Complex-Valued Hopfield Neural Networks. IEEE Trans Neural Netw Learn Syst 2021; 32:2274-2278. [PMID: 32479402] [DOI: 10.1109/tnnls.2020.2995413]
Abstract
In multistate neural associative memories, some neurons have small noise and others have large noise. If we knew which neurons have small noise, the noise tolerance could be improved. In this brief, we provide a novel method to reinforce neurons with small noise and apply it to images with Gaussian noise. A complex-valued multistate neuron is decomposed into two neurons, referred to as high and low neurons. For Gaussian noise, the high neurons are expected to have small noise, and the noise tolerance is improved by reinforcing them. Computer simulations support the efficiency of the reinforced neurons.
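The decomposition can be sketched arithmetically: a multistate value splits into a coarse (high) digit and a fine (low) digit, and small Gaussian perturbations mostly disturb the fine digit. The digit sizes below are illustrative assumptions, not the paper's parameters:

```python
def decompose(k, B):
    """Split a multistate value k into a high (coarse) and low (fine) digit:
    k = high * B + low, with low in {0, ..., B-1}."""
    return k // B, k % B

def recompose(high, low, B):
    return high * B + low

A, B = 4, 4                        # a 16-state neuron -> two 4-state neurons
for k in range(A * B):
    high, low = decompose(k, B)
    assert recompose(high, low, B) == k
print("all", A * B, "states round-trip")   # prints: all 16 states round-trip
```

Because a noise perturbation of magnitude below B only changes the low digit, the high neuron carries the noise-robust information that the reinforcement targets.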
16
17
Information geometry of hyperbolic-valued Boltzmann machines. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.12.048]
18
Kobayashi M. Quaternion Projection Rule for Rotor Hopfield Neural Networks. IEEE Trans Neural Netw Learn Syst 2021; 32:900-908. [PMID: 32287014] [DOI: 10.1109/tnnls.2020.2979920]
Abstract
A rotor Hopfield neural network (RHNN) is an extension of a complex-valued Hopfield neural network (CHNN) and has excellent noise tolerance. The RHNN decomposition theorem says that an RHNN decomposes into a CHNN and a symmetric CHNN. For a large number of training patterns, the projection rule for RHNNs generates large self-feedback, which deteriorates the noise tolerance. To remove the self-feedback, we propose a projection rule using quaternions, based on the decomposition theorem. Using computer simulations, we show that the quaternion projection rule improves the noise tolerance.
19
Kobayashi M. Quaternion-Valued Twin-Multistate Hopfield Neural Networks With Dual Connections. IEEE Trans Neural Netw Learn Syst 2021; 32:892-899. [PMID: 32248127] [DOI: 10.1109/tnnls.2020.2979904]
Abstract
Dual connections (DCs) utilize the noncommutativity of quaternions and improve the noise tolerance of quaternion Hopfield neural networks (QHNNs). In this article, we introduce DCs to twin-multistate QHNNs and conduct computer simulations to investigate the noise tolerance. The QHNNs with DCs were weak against an increase in the number of training patterns but robust against an increased resolution factor. The simulation results can be explained from the standpoints of storage capacity and rotational invariance.
20
Abstract
Hopfield neural networks have been extended using hypercomplex numbers. The algebra of bicomplex numbers, also referred to as commutative quaternions, is a number system of dimension 4. Since the multiplication is commutative, many notions and theories of linear algebra, such as the determinant, are available, unlike for quaternions. A bicomplex-valued Hopfield neural network (BHNN) has been proposed as a multistate neural associative memory. However, the known stability conditions have been insufficient for the projection rule. In this work, the stability conditions are extended and applied to improve the projection rule. Computer simulations suggest improved noise tolerance.
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
21
22
Kobayashi M. Synthesis of complex- and hyperbolic-valued Hopfield neural networks. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.10.002]
23
Xiao J, Jia Y, Jiang X, Wang S. Circular Complex-Valued GMDH-Type Neural Network for Real-Valued Classification Problems. IEEE Trans Neural Netw Learn Syst 2020; 31:5285-5299. [PMID: 32078563] [DOI: 10.1109/tnnls.2020.2966031]
Abstract
Recently, applications of complex-valued neural networks (CVNNs) to real-valued classification problems have attracted significant attention. However, most existing CVNNs are black-box models with poor explainability. This study extends the real-valued group method of data handling (RGMDH)-type neural network to the complex field and constructs a circular complex-valued group method of data handling (C-CGMDH)-type neural network, which is a white-box model. First, a complex least-squares method is proposed for parameter estimation. Second, a new complex-valued symmetric regularity criterion is constructed with a logarithmic function to represent explicitly the magnitude and phase of the actual and predicted complex outputs, in order to evaluate and select the intermediate candidate models; the property of this new complex-valued external criterion is proven to be similar to that of the real external criterion. Before training the model, a circular transformation is used to map the real-valued input features to the complex field. Twenty-five real-valued classification data sets from the UCI Machine Learning Repository are used in the experiments. The results show that both the RGMDH and C-CGMDH models can select the most important features from the complete feature space through a self-organizing modeling process. Compared with RGMDH, the C-CGMDH model converges faster and selects fewer features, and its classification performance is statistically significantly better than that of the benchmark complex-valued and real-valued models. Regarding time complexity, the C-CGMDH model is comparable with other models on data sets that have few features. Finally, we demonstrate that the GMDH-type neural network can be interpretable.
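A circular transformation of the kind the abstract describes maps each bounded real feature onto the unit circle; the exact form used in the paper may differ, so the mapping and its margin parameter below are assumptions for illustration:

```python
import numpy as np

def circular_transform(x, lo, hi, margin=0.1):
    """Map a real feature in [lo, hi] onto an arc of the unit circle.
    The arc is kept strictly inside [0, 2*pi) (controlled by `margin`)
    so the two extremes of the feature range are not identified with
    each other. One common choice; not necessarily the paper's exact map."""
    t = (x - lo) / (hi - lo)                          # normalize to [0, 1]
    theta = (margin + (1.0 - 2.0 * margin) * t) * 2.0 * np.pi
    return np.exp(1j * theta)

x = np.array([0.0, 2.5, 5.0])
z = circular_transform(x, lo=0.0, hi=5.0)
print(np.abs(z))   # all magnitudes are 1: features now live on the unit circle
```

Encoding features as unit phasors is what lets the complex-valued network treat magnitude and phase separately in its regularity criterion.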
24
Kobayashi M. Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks. Neural Comput 2020; 32:2237-2248. [DOI: 10.1162/neco_a_01320]
Abstract
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory, but its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks; since their architectures are much more complicated than that of a CHNN, the architecture should be simplified. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
25
26
Abstract
For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If the networks converged in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we provide stability conditions for hyperbolic Hopfield neural networks (HHNNs) in synchronous mode instead; HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule converge in synchronous mode. By computer simulations, we find that the projection rule for HHNNs in synchronous mode maintains a high noise tolerance.
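The asynchronous/synchronous distinction can be sketched with an ordinary binary Hopfield network standing in for the hyperbolic multistate model (the network, patterns, and update schedule below are illustrative assumptions): asynchronous mode updates one neuron at a time, synchronous mode updates all neurons in parallel.

```python
import numpy as np

def hopfield_recall(W, state, mode="async", sweeps=20, rng=None):
    """Sign-threshold Hopfield recall in synchronous or asynchronous mode."""
    rng = rng or np.random.default_rng(0)
    state = state.copy()
    n = len(state)
    for _ in range(sweeps):
        prev = state.copy()
        if mode == "sync":
            # all neurons updated in parallel (one step per sweep)
            state = np.where(W @ state >= 0, 1.0, -1.0)
        else:
            # one randomly ordered neuron at a time
            for i in rng.permutation(n):
                state[i] = 1.0 if W[i] @ state >= 0 else -1.0
        if np.array_equal(state, prev):
            break
    return state

# a single stored pattern: async and sync recall agree in this simple case
p = np.array([1.0, -1.0, 1.0, -1.0])
W = np.outer(p, p) / 4.0
np.fill_diagonal(W, 0.0)
probe = p.copy()
probe[0] = -1.0
print(hopfield_recall(W, probe, mode="async"))   # recovers p
print(hopfield_recall(W, probe, mode="sync"))    # recovers p
```

In general, synchronous updates can fall into two-cycles where asynchronous updates converge, which is why synchronous-mode stability conditions need a separate proof.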
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
27
28
29
Robust Exponential Stability for Discrete-Time Quaternion-Valued Neural Networks with Time Delays and Parameter Uncertainties. Neural Process Lett 2020. [DOI: 10.1007/s11063-020-10196-w]
30
de Castro FZ, Valle ME. A broad class of discrete-time hypercomplex-valued Hopfield neural networks. Neural Netw 2020; 122:54-67. [DOI: 10.1016/j.neunet.2019.09.040]
31
Exponential and adaptive synchronization of inertial complex-valued neural networks: A non-reduced order and non-separation approach. Neural Netw 2020; 124:50-59. [PMID: 31982673] [DOI: 10.1016/j.neunet.2020.01.002]
Abstract
This paper deals with exponential and adaptive synchronization for a class of inertial complex-valued neural networks by directly constructing Lyapunov functionals, without the standard reduced-order transformation for inertial neural systems or the common separation approach for complex-valued systems. First, a complex-valued feedback control scheme is designed, and a nontrivial Lyapunov functional, composed of the complex-valued state variables and their derivatives, is proposed to analyze exponential synchronization. Some criteria involving multiple parameters are derived, and a feasible method is provided to determine these parameters so as to show clearly how to choose control gains in practice. In addition, an adaptive control strategy in the complex domain is developed to adjust the control gains, and asymptotic synchronization is ensured by applying the method of undetermined coefficients in the construction of the Lyapunov functional and by utilizing Barbalat's lemma. Lastly, a numerical example with simulation results is provided to support the theoretical work.
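Adaptive schemes of this kind typically take the following shape; this is an illustrative sketch of the error system and gain law, with σ > 0 an assumed design constant, not the paper's exact controller:

```latex
% illustrative adaptive synchronization scheme (assumed form, not the paper's exact one)
e(t) = y(t) - x(t), \qquad
u(t) = -k(t)\, e(t), \qquad
\dot{k}(t) = \sigma\, \overline{e(t)}^{\top} e(t) = \sigma \lVert e(t) \rVert^{2},
\qquad \sigma > 0 .
```

Boundedness of the Lyapunov functional then gives a square-integrable error, and Barbalat's lemma concludes that e(t) tends to zero, i.e., asymptotic synchronization.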
32
Wang X, Wang Z, Song Q, Shen H, Huang X. A waiting-time-based event-triggered scheme for stabilization of complex-valued neural networks. Neural Netw 2020; 121:329-338. [DOI: 10.1016/j.neunet.2019.09.032]
33
Tanaka G, Nakane R, Takeuchi T, Yamane T, Nakano D, Katayama Y, Hirose A. Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory. IEEE Trans Neural Netw Learn Syst 2020; 31:24-38. [PMID: 30892239] [DOI: 10.1109/tnnls.2019.2899344]
Abstract
The development of hardware neural networks, including neuromorphic hardware, has accelerated over the past few years. However, it is challenging to operate very large-scale neural networks with low-power hardware devices, partly due to signal transmissions through a massive number of interconnections. Our aim is to deal with the issue of communication cost from an algorithmic viewpoint and study learning algorithms for energy-efficient information processing. Here, we consider two approaches to finding spatially arranged sparse recurrent neural networks with a high cost-performance ratio for associative memory. In the first approach, following classical methods, we focus on sparse modular network structures inspired by biological brain networks and examine their storage capacity under an iterative learning rule. We show that incorporating long-range intermodule connections into purely modular networks can enhance the cost-performance ratio. In the second approach, we formulate for the first time an optimization problem in which network sparsity is maximized under the constraints imposed by a pattern-embedding condition. We show that there is a tradeoff between the interconnection cost and the computational performance in the optimized networks. We demonstrate that the optimized networks can achieve a better cost-performance ratio than those considered in the first approach. We show the effectiveness of the optimization approach mainly using binary patterns and also apply it to gray-scale image restoration. Our results suggest that the presented approaches are useful in seeking sparser and less costly connectivity of neural networks for the enhancement of energy efficiency in hardware neural networks.
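As a concrete illustration of the associative-memory setting studied above, the sketch below stores a few binary patterns in a Hopfield network with the Hebbian rule and then prunes the weakest weights to mimic a sparse, low-interconnection-cost topology. The sizes, sparsity level, and prune-by-magnitude heuristic are invented for illustration; they are not the paper's modular structures or its sparsity-optimization method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_patterns = 64, 4
patterns = rng.choice([-1, 1], size=(n_patterns, n))

# Hebbian outer-product rule with zero self-coupling
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)

# Sparsify: keep only the largest-magnitude connections (a toy stand-in
# for the paper's optimized sparse topologies)
threshold = np.quantile(np.abs(W), 0.70)
W[np.abs(W) < threshold] = 0.0

def recall(x, steps=20):
    """Synchronous sign updates until the state stops changing."""
    for _ in range(steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

# Corrupt a stored pattern and attempt to restore it
probe = patterns[0].copy()
flip = rng.choice(n, size=5, replace=False)
probe[flip] *= -1
restored = recall(probe)
print(np.mean(restored == patterns[0]))  # overlap with the stored pattern
```

Varying the pruning quantile makes the cost-performance tradeoff visible: more aggressive pruning cuts the connection count but degrades the recall overlap.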
Collapse
|
34
|
Zheng B, Hu C, Yu J, Jiang H. Finite-time synchronization of fully complex-valued neural networks with fractional-order. Neurocomputing 2020. [DOI: 10.1016/j.neucom.2019.09.048] [Citation(s) in RCA: 34] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
|
35
|
Kobayashi M. O(2) -Valued Hopfield Neural Networks. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2019; 30:3833-3838. [PMID: 30843853 DOI: 10.1109/tnnls.2019.2897994] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
In complex-valued Hopfield neural networks (CHNNs), the neuron states are complex numbers whose amplitudes are 1; they can also be described as special orthogonal matrices of order 2. Here, we propose a new Hopfield model, the O(2)-valued Hopfield neural network (O(2)-HNN), whose neuron states are extended to orthogonal matrices. Its neuron states are embedded in 4-D space, while those of CHNNs are embedded in 2-D space. Computer simulations were conducted to compare the noise tolerance (NT) and storage capacity (SC) of CHNNs, O(2)-HNNs, and rotor Hopfield neural networks. In terms of SC, O(2)-HNNs outperformed the others, while in NT, they outdid CHNNs.
Collapse
|
36
|
|
37
|
Li Y, Meng X. Almost Automorphic Solutions in Distribution Sense of Quaternion-Valued Stochastic Recurrent Neural Networks with Mixed Time-Varying Delays. Neural Process Lett 2019. [DOI: 10.1007/s11063-019-10151-4] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
38
|
Finite Time Stability Analysis of Fractional-Order Complex-Valued Memristive Neural Networks with Proportional Delays. Neural Process Lett 2019. [DOI: 10.1007/s11063-019-10097-7] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
|
39
|
Yang X, Li C, Song Q, Li H, Huang J. Effects of State-Dependent Impulses on Robust Exponential Stability of Quaternion-Valued Neural Networks Under Parametric Uncertainty. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2019; 30:2197-2211. [PMID: 30507516 DOI: 10.1109/tnnls.2018.2877152] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
This paper addresses state-dependent impulsive effects on the robust exponential stability of quaternion-valued neural networks (QVNNs) with parametric uncertainties. In view of the noncommutativity of quaternion multiplication, the concerned quaternion-valued models are separated into four real-valued parts. Then, several assumptions are proposed ensuring that every solution of the separated state-dependent impulsive neural networks intersects each discontinuity surface exactly once. In the meantime, by applying the B-equivalent method, the addressed state-dependent impulsive models are reduced to fixed-time ones, and the latter can be regarded as comparison systems of the former. For the subsequent analysis, a novel norm inequality for block matrices is proposed, which can be used to show efficiently that the separated state-dependent impulsive models and the reduced ones share the same stability properties. Afterward, several sufficient conditions are presented to guarantee the robust exponential stability of the origin of the considered models. It is worth mentioning that two cases of the addressed models are analyzed concretely: models with exponentially stable continuous subsystems and destabilizing impulses, and models with unstable continuous subsystems and stabilizing impulses. In addition, an application of the latter case, state-dependent impulse control for robust exponential synchronization of QVNNs, is considered. Finally, some numerical examples are proffered to illustrate the effectiveness and correctness of the obtained results.
Collapse
|
40
|
Exponential synchronization of time-varying delayed complex-valued neural networks under hybrid impulsive controllers. Neural Netw 2019; 114:157-163. [DOI: 10.1016/j.neunet.2019.02.006] [Citation(s) in RCA: 37] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/02/2018] [Revised: 01/03/2019] [Accepted: 02/22/2019] [Indexed: 12/14/2022]
|
41
|
Wang Z, Cao J, Guo Z, Huang L. Generalized stability for discontinuous complex-valued Hopfield neural networks via differential inclusions. Proc Math Phys Eng Sci 2018; 474:20180507. [PMID: 30602935 PMCID: PMC6304032 DOI: 10.1098/rspa.2018.0507] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/31/2018] [Accepted: 10/24/2018] [Indexed: 11/12/2022] Open
Abstract
Some dynamical behaviours of discontinuous complex-valued Hopfield neural networks are discussed in this paper. First, we introduce a method to construct the complex-valued set-valued mapping and give some basic definitions for discontinuous complex-valued differential equations. In addition, the Leray-Schauder alternative theorem is used to analyse the existence of equilibria of the networks. Lastly, we present the dynamical behaviours, including global stability and convergence in measure, for discontinuous complex-valued neural networks (CVNNs) via differential inclusions. The main contribution of this paper is that we extend previous studies on continuous CVNNs to discontinuous ones. Several simulations are given to substantiate the correctness of the proposed results.
Collapse
Affiliation(s)
- Zengyun Wang
- The Jiangsu Provincial Key Laboratory of Networked Collective Intelligence, Southeast University, Nanjing 210096, People's Republic of China
- School of Mathematics, Southeast University, Nanjing 210096, People's Republic of China
- Department of Mathematics, Hunan First Normal University, Changsha 410205, People's Republic of China
| | - Jinde Cao
- The Jiangsu Provincial Key Laboratory of Networked Collective Intelligence, Southeast University, Nanjing 210096, People's Republic of China
- School of Mathematics, Southeast University, Nanjing 210096, People's Republic of China
| | - Zhenyuan Guo
- College of Mathematics and Econometrics, Hunan University, Changsha 410082, People's Republic of China
| | - Lihong Huang
- Changsha University of Science and Technology, Changsha 410114, People's Republic of China
| |
Collapse
|
42
|
|
43
|
Kobayashi M. Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE 2018; 2018:1275290. [PMID: 30515194 PMCID: PMC6236997 DOI: 10.1155/2018/1275290] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/15/2018] [Accepted: 09/18/2018] [Indexed: 11/17/2022]
Abstract
A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important problem for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.
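The storage-capacity question discussed above can be made concrete with a small experiment on a plain complex-valued (phasor) Hopfield network, the setting of Jankowski et al. on which the paper builds. The network size, number of states, and fixed-point stability test below are illustrative choices, not the paper's quaternionic model: as the load (number of stored patterns) grows, crosstalk rotates the local fields past the quantization boundaries and stored patterns stop being fixed points.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 50, 4  # neurons; number of phasor states exp(2*pi*1j*k/K)

def phasor_patterns(p):
    return np.exp(2j * np.pi * rng.integers(0, K, size=(p, n)) / K)

def fraction_stable(p):
    """Fraction of p stored patterns that are fixed points of the network."""
    X = phasor_patterns(p)
    W = X.T @ X.conj() / n        # Hebbian (outer-product) rule
    np.fill_diagonal(W, 0.0)
    H = X @ W.T                   # local field at every stored pattern
    # quantize each field back onto the nearest of the K phasor states
    k = np.round(np.angle(H) * K / (2 * np.pi)) % K
    Xq = np.exp(2j * np.pi * k / K)
    return np.mean(np.all(np.isclose(Xq, X), axis=1))

results = {p: fraction_stable(p) for p in (2, 5, 10, 20)}
print(results)  # fraction of self-stable patterns at each load
```

At light loads virtually every pattern is stable; the load at which the fraction collapses is exactly the storage-capacity quantity that the crosstalk-distribution argument estimates analytically.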
Collapse
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Takeda 4-3-11, Kofu, Yamanashi 400-8511, Japan
| |
Collapse
|
44
|
Song Q, Chen X. Multistability Analysis of Quaternion-Valued Neural Networks With Time Delays. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2018; 29:5430-5440. [PMID: 29994739 DOI: 10.1109/tnnls.2018.2801297] [Citation(s) in RCA: 69] [Impact Index Per Article: 9.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
This paper addresses the multistability issue for quaternion-valued neural networks (QVNNs) with time delays. By using the inequality technique, sufficient conditions are proposed for the boundedness and global attractivity of delayed QVNNs. Based on the geometrical properties of the activation functions, several criteria are obtained to ensure the existence of multiple equilibrium points, a subset of which are locally stable. Two numerical examples are provided to illustrate the effectiveness of the obtained results.
Collapse
|
45
|
|
46
|
Chen X, Song Q, Li Z, Zhao Z, Liu Y. Stability Analysis of Continuous-Time and Discrete-Time Quaternion-Valued Neural Networks With Linear Threshold Neurons. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2018; 29:2769-2781. [PMID: 28600263 DOI: 10.1109/tnnls.2017.2704286] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
This paper addresses the stability problem for continuous-time and discrete-time quaternion-valued neural networks (QVNNs) with linear threshold neurons. Applying the semidiscretization technique to the continuous-time QVNNs, discrete-time analogs are obtained which preserve the dynamical characteristics of their continuous-time counterparts. Via the plural decomposition method of quaternions, the homeomorphic mapping theorem, and the Lyapunov theorem, some sufficient conditions on the existence, uniqueness, and global asymptotic stability of the equilibrium point are derived for the continuous-time QVNNs and their discrete-time analogs, respectively. Furthermore, a uniform sufficient condition on the existence, uniqueness, and global asymptotic stability of the equilibrium point is obtained for both the continuous-time QVNNs and their discrete-time version. Finally, two numerical examples are provided to substantiate the effectiveness of the proposed results.
Collapse
|
47
|
Kobayashi M. Decomposition of Rotor Hopfield Neural Networks Using Complex Numbers. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2018; 29:1366-1370. [PMID: 28182561 DOI: 10.1109/tnnls.2017.2657781] [Citation(s) in RCA: 15] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
A complex-valued Hopfield neural network (CHNN) is a multistate model of a Hopfield neural network. It has the disadvantage of low noise tolerance. Meanwhile, a symmetric CHNN (SCHNN) is a modification of a CHNN that improves noise tolerance. Furthermore, a rotor Hopfield neural network (RHNN) is an extension of a CHNN. It has twice the storage capacity of CHNNs and SCHNNs, and much better noise tolerance than CHNNs, although it requires twice as many connection parameters. In this brief, we investigate the relations between CHNNs, SCHNNs, and RHNNs: an RHNN is uniquely decomposed into a CHNN and an SCHNN. In addition, the Hebbian learning rule for RHNNs is decomposed into those for CHNNs and SCHNNs.
Collapse
|
48
|
Global Exponential Synchronization of Complex-Valued Neural Networks with Time Delays via Matrix Measure Method. Neural Process Lett 2018. [DOI: 10.1007/s11063-018-9805-9] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/17/2022]
|
49
|
Kobayashi M. Stability of Rotor Hopfield Neural Networks With Synchronous Mode. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2018; 29:744-748. [PMID: 28055918 DOI: 10.1109/tnnls.2016.2635140] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
A complex-valued Hopfield neural network (CHNN) is a model of a Hopfield neural network using multistate neurons. The stability conditions of CHNNs have been widely studied. A CHNN in synchronous mode converges to a fixed point or a cycle of length 2. A rotor Hopfield neural network (RHNN) is also a model of a multistate Hopfield neural network. RHNNs have much higher storage capacity and noise tolerance than CHNNs. We extend the theories regarding the stability of CHNNs to RHNNs. In addition, we investigate the stability of RHNNs with the projection rule. Although a CHNN with the projection rule can be trapped in a cycle, an RHNN with the projection rule converges to a fixed point. This is one of the great advantages of RHNNs.
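The synchronous-mode behaviour mentioned above — convergence to a fixed point or a cycle of length 2 — is a classical property of Hopfield networks with symmetric weights, and it can be checked numerically. The sketch below uses an ordinary real-valued network as a simpler stand-in for the CHNN/RHNN setting; the weights and size are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30
A = rng.standard_normal((n, n))
W = (A + A.T) / 2            # symmetric weights, the standard Hopfield setting
np.fill_diagonal(W, 0.0)

def synchronous_orbit(x, max_steps=200):
    """Iterate x <- sign(W x) synchronously; return the cycle length reached."""
    seen = [x.copy()]
    for _ in range(max_steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
        for back, prev in enumerate(reversed(seen)):
            if np.array_equal(prev, x):
                return back + 1   # 1 = fixed point, 2 = two-cycle
        seen.append(x.copy())
    return None                   # no cycle detected within max_steps

x0 = rng.choice([-1.0, 1.0], size=n)
cycle_len = synchronous_orbit(x0)
print(cycle_len)
```

Rerunning from random initial states only ever yields cycle lengths 1 or 2 for symmetric weights; the paper's contribution is extending this kind of stability analysis to rotor networks, where the projection rule additionally rules out the two-cycles.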
Collapse
|
50
|
Zhang W, Cao J, Chen D, Alsaadi FE. Synchronization in Fractional-Order Complex-Valued Delayed Neural Networks. ENTROPY 2018; 20:e20010054. [PMID: 33265140 PMCID: PMC7512252 DOI: 10.3390/e20010054] [Citation(s) in RCA: 19] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/13/2017] [Revised: 01/07/2018] [Accepted: 01/08/2018] [Indexed: 11/16/2022]
Abstract
This paper discusses the synchronization of fractional-order complex-valued neural networks (FOCVNNs) in the presence of time delay. Synchronization criteria are obtained through the employment of linear feedback control and a comparison theorem for fractional-order linear systems with delay. The feasibility and effectiveness of the proposed scheme are validated through numerical simulations.
Collapse
Affiliation(s)
- Weiwei Zhang
- School of Mathematics and Computational Science, Anqing Normal University, Anqing 246011, China
- Correspondence: ; Tel.: +86-152-5566-0785
| | - Jinde Cao
- School of Mathematics, Southeast University, Nanjing 210096, China
- Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589, Saudi Arabia
| | - Dingyuan Chen
- School of Mathematics and Computational Science, Anqing Normal University, Anqing 246011, China
| | - Fuad E. Alsaadi
- Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia
| |
Collapse
|