1. Grassucci E, Zhang A, Comminiello D. PHNNs: Lightweight Neural Networks via Parameterized Hypercomplex Convolutions. IEEE Transactions on Neural Networks and Learning Systems 2024; 35:8293-8305. [PMID: 37015366 DOI: 10.1109/tnnls.2022.3226772]
Abstract
Hypercomplex neural networks have proven to reduce the overall number of parameters while ensuring valuable performance by leveraging the properties of Clifford algebras. Recently, hypercomplex linear layers have been further improved by introducing efficient parameterized Kronecker products. In this article, we define the parameterization of hypercomplex convolutional layers and introduce the family of parameterized hypercomplex neural networks (PHNNs), which are lightweight and efficient large-scale models. Our method grasps the convolution rules and the filter organization directly from data, without requiring a rigidly predefined domain structure to follow. PHNNs can flexibly operate in any user-defined or tuned domain, from 1-D to [Formula: see text], regardless of whether the algebra rules are preset. Such malleability allows processing multidimensional inputs in their natural domain without annexing further dimensions, as done, instead, in quaternion neural networks (QNNs) for 3-D inputs like color images. As a result, the proposed family of PHNNs operates with 1/n of the free parameters of its real-domain analog. We demonstrate the versatility of this approach across multiple domains of application by performing experiments on various image and audio datasets, in which our method outperforms its real- and quaternion-valued counterparts. Full code is available at: https://github.com/eleGAN23/HyperNets.
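As an aside, the parameterized Kronecker-product construction this abstract refers to builds each dense weight as a sum of Kronecker products, W = Σᵢ Aᵢ ⊗ Fᵢ, with the small algebra matrices Aᵢ learned from data instead of fixed by quaternion rules. A minimal NumPy sketch of the idea (our own illustration, not the authors' code; all names are ours):

```python
import numpy as np

def phm_weight(A, F):
    # A: (n, n, n) stack of learnable "algebra" matrices A_i.
    # F: (n, d_out // n, d_in // n) stack of filter matrices F_i.
    # The full weight is the sum of Kronecker products sum_i A_i (x) F_i.
    return sum(np.kron(A[i], F[i]) for i in range(A.shape[0]))

rng = np.random.default_rng(0)
n, d_in, d_out = 4, 8, 8
A = rng.standard_normal((n, n, n))
F = rng.standard_normal((n, d_out // n, d_in // n))
W = phm_weight(A, F)                  # dense (d_out, d_in) weight
x = rng.standard_normal(d_in)
y = W @ x                             # behaves like an ordinary dense layer
```

The layer stores d_in·d_out/n filter parameters plus n³ algebra parameters; for large layers the n³ term is negligible, which is where the claimed 1/n parameter reduction comes from.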
2. Vector-Valued Hopfield Neural Networks and Distributed Synapse Based Convolutional and Linear Time-Variant Associative Memories. Neural Processing Letters 2022. [DOI: 10.1007/s11063-022-11035-w]
3. Kobayashi M. Noise-Robust Projection Rule for Rotor and Matrix-Valued Hopfield Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 2022; 33:567-576. [PMID: 33048772 DOI: 10.1109/tnnls.2020.3028091]
Abstract
A complex-valued Hopfield neural network (CHNN) has weak noise tolerance due to rotational invariance. Some alternatives to the CHNN, such as the rotor Hopfield neural network (RHNN) and the matrix-valued Hopfield neural network (MHNN), resolve rotational invariance and improve the noise tolerance. However, RHNNs and MHNNs with projection rules suffer from a different problem: self-feedback. If the self-feedback is reduced, the noise tolerance is expected to improve further. To reduce the self-feedback, noise-robust projection rules are introduced: the stability conditions are extended, and the self-feedback is reduced based on the extended conditions. Computer simulations confirm that the noise tolerance is improved; in particular, it is more robust against an increase in the number of training patterns.
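The rotational invariance blamed here for the CHNN's weak noise tolerance is easy to see in a toy model: each neuron holds one of K phase states, and the recall dynamics commute with any global rotation by a K-th root of unity, so every stored pattern drags K−1 rotated spurious memories along with it. A small NumPy sketch (illustrative only; the helper names are ours):

```python
import numpy as np

K = 8                                      # resolution factor: K phase states
phases = np.exp(2j * np.pi * np.arange(K) / K)

def csign(z):
    # Quantize each complex activation to the nearest K-th root of unity.
    idx = np.round(np.angle(z) / (2 * np.pi / K)).astype(int) % K
    return phases[idx]

def recall(W, x, iters=20):
    # Synchronous recall dynamics of a multistate CHNN.
    for _ in range(iters):
        x_new = csign(W @ x)
        if np.allclose(x_new, x):
            break
        x = x_new
    return x

# Hebbian storage of a single pattern p (self-feedback diagonal removed).
rng = np.random.default_rng(1)
p = phases[rng.integers(0, K, size=6)]
W = np.outer(p, p.conj()) / 6
np.fill_diagonal(W, 0)
# Rotational invariance: if p is stable, so is w * p for any K-th root w,
# because W @ (w * p) = w * (W @ p) and csign(w * z) = w * csign(z).
```

Here `recall(W, p)` returns `p`, but `recall(W, phases[1] * p)` just as stably returns the rotated copy; this K-fold ambiguity is exactly what the rotor, matrix-valued, and Klein variants in this list set out to remove.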
4. Multiple asymptotic stability of fractional-order quaternion-valued neural networks with time-varying delays. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2021.03.079]
5. Kobayashi M. Storage Capacity of Quaternion-Valued Hopfield Neural Networks With Dual Connections. Neural Computation 2021; 33:2226-2240. [PMID: 34310674 DOI: 10.1162/neco_a_01405]
Abstract
A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of the CHNN. Dual connections (DCs), which take advantage of the noncommutativity of quaternions and consist of two weights between neurons, are introduced to QHNNs to improve the noise tolerance. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half that for CHNNs and equals that of conventional QHNNs. The small storage capacity of QHNNs with DCs is caused by the projection rule, not by the architecture. In this work, the Hebbian rule is introduced, and stochastic analysis proves that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
Affiliation(s)
- Masaki Kobayashi, Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
6. Kobayashi M. Hyperbolic-valued Hopfield neural networks in hybrid mode. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2021.01.121]
7. Kobayashi M. Noise Robust Projection Rule for Klein Hopfield Neural Networks. Neural Computation 2021; 33:1698-1716. [PMID: 34496385 DOI: 10.1162/neco_a_01385]
Abstract
Multistate Hopfield models, such as complex-valued Hopfield neural networks (CHNNs), have been used as multistate neural associative memories. Quaternion-valued Hopfield neural networks (QHNNs) reduce the number of weight parameters of CHNNs. Both CHNNs and QHNNs have weak noise tolerance due to the inherent property of rotational invariance. Klein Hopfield neural networks (KHNNs) improve the noise tolerance by resolving rotational invariance. However, KHNNs have another disadvantage: self-feedback, a major factor in the deterioration of noise tolerance. In this work, the stability conditions of KHNNs are extended, and the projection rule for KHNNs is modified using the extended conditions. The proposed projection rule improves the noise tolerance by reducing self-feedback. Computer simulations confirm that the proposed projection rule improves the noise tolerance of KHNNs.
Affiliation(s)
- Masaki Kobayashi, Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
10. Information geometry of hyperbolic-valued Boltzmann machines. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.12.048]
11. Kobayashi M. Quaternion Projection Rule for Rotor Hopfield Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 2021; 32:900-908. [PMID: 32287014 DOI: 10.1109/tnnls.2020.2979920]
Abstract
A rotor Hopfield neural network (RHNN) is an extension of a complex-valued Hopfield neural network (CHNN) with excellent noise tolerance. The RHNN decomposition theorem states that an RHNN decomposes into a CHNN and a symmetric CHNN. For a large number of training patterns, the projection rule for RHNNs generates large self-feedback, which deteriorates the noise tolerance. To remove the self-feedback, we propose a projection rule using quaternions based on the decomposition theorem. Computer simulations show that the quaternion projection rule improves noise tolerance.
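The projection rule discussed in this and several neighboring abstracts is pseudoinverse learning: the weight matrix is the orthogonal projector onto the span of the stored patterns, and its diagonal entries are the self-feedback terms these papers work to suppress. A minimal sketch under our own naming (not code from any of the papers):

```python
import numpy as np

def projection_rule(X, remove_self_feedback=True):
    # X: (N, P) matrix whose columns are the P stored patterns.
    # W = X (X^H X)^{-1} X^H projects onto span(X), so W @ X == X and
    # every stored pattern is an equilibrium of the recall dynamics.
    W = X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T
    if remove_self_feedback:
        # The diagonal is the self-feedback; zeroing it improves noise
        # tolerance at the cost of the exact-projection property.
        np.fill_diagonal(W, 0)
    return W

rng = np.random.default_rng(2)
K, N, P = 8, 16, 3
X = np.exp(2j * np.pi * rng.integers(0, K, size=(N, P)) / K)
W = projection_rule(X, remove_self_feedback=False)
```

Without diagonal removal, `W` is a projector (`W @ W ≈ W`, `W @ X ≈ X`); zeroing the diagonal trades exact stability of the stored patterns for better noise tolerance, and the extended stability conditions in these papers are what justify that trade.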
12. Kobayashi M. Quaternion-Valued Twin-Multistate Hopfield Neural Networks With Dual Connections. IEEE Transactions on Neural Networks and Learning Systems 2021; 32:892-899. [PMID: 32248127 DOI: 10.1109/tnnls.2020.2979904]
Abstract
Dual connections (DCs) utilize the noncommutativity of quaternions and improve the noise tolerance of quaternion Hopfield neural networks (QHNNs). In this article, we introduce DCs to twin-multistate QHNNs and conduct computer simulations to investigate the noise tolerance. The QHNNs with DCs were weak against an increase in the number of training patterns but robust against an increased resolution factor. The simulation results can be explained from the standpoints of storage capacity and rotational invariance.
13.
Abstract
Hopfield neural networks have been extended using hypercomplex numbers. The algebra of bicomplex numbers, also referred to as commutative quaternions, is a number system of dimension 4. Since its multiplication is commutative, many notions and theories of linear algebra, such as the determinant, are available, unlike with quaternions. A bicomplex-valued Hopfield neural network (BHNN) has been proposed as a multistate neural associative memory. However, the stability conditions have been insufficient for the projection rule. In this work, the stability conditions are extended and applied to improve the projection rule. Computer simulations suggest improved noise tolerance.
Affiliation(s)
- Masaki Kobayashi, Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
15. Kobayashi M. Synthesis of complex- and hyperbolic-valued Hopfield neural networks. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2020.10.002]
16. Kobayashi M. Bicomplex Projection Rule for Complex-Valued Hopfield Neural Networks. Neural Computation 2020; 32:2237-2248. [DOI: 10.1162/neco_a_01320]
Abstract
A complex-valued Hopfield neural network (CHNN) with a multistate activation function is a multistate model of neural associative memory. Its weight parameters require substantial memory resources. Twin-multistate activation functions were introduced to quaternion- and bicomplex-valued Hopfield neural networks; since their architectures are much more complicated than that of the CHNN, simplification is desirable. In this work, the number of weight parameters is reduced by a bicomplex projection rule for CHNNs, which is given by the decomposition of bicomplex-valued Hopfield neural networks. Computer simulations support that the noise tolerance of a CHNN with the bicomplex projection rule is equal to or even better than that of quaternion- and bicomplex-valued Hopfield neural networks.
Affiliation(s)
- Masaki Kobayashi, Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
18.
Abstract
For most multistate Hopfield neural networks, the stability conditions in asynchronous mode are known, whereas those in synchronous mode are not. If the networks converged in synchronous mode, recall would be accelerated by parallel processing. Complex-valued Hopfield neural networks (CHNNs) with a projection rule do not converge in synchronous mode. In this work, we instead provide stability conditions for hyperbolic Hopfield neural networks (HHNNs) in synchronous mode; HHNNs provide better noise tolerance than CHNNs. In addition, the stability conditions are applied to the projection rule, and HHNNs with a projection rule converge in synchronous mode. By computer simulations, we find that the projection rule for HHNNs in synchronous mode maintains a high noise tolerance.
Affiliation(s)
- Masaki Kobayashi, Mathematical Science Center, University of Yamanashi, Kofu, Yamanashi 400-8511, Japan
20. Wang H, Wei G, Wen S, Huang T. Generalized norm for existence, uniqueness and stability of Hopfield neural networks with discrete and distributed delays. Neural Networks 2020; 128:288-293. [PMID: 32454373 DOI: 10.1016/j.neunet.2020.05.014]
Abstract
In this paper, the existence, uniqueness, and stability criteria of solutions for Hopfield neural networks with discrete and distributed delays (DDD HNNs) are investigated via the definitions of three kinds of generalized norm (ξ-norm). A general DDD HNN model is first introduced, where the discrete delays τpq(t) are asynchronous time-varying delays. Then the {ξ,1}-norm, {ξ,2}-norm, and {ξ,∞}-norm are successively used to derive the existence, uniqueness, and stability criteria of solutions for the DDD HNNs. In the proofs of the theorems, special functions and assumptions are given to deal with the discrete and distributed delays. Furthermore, a corollary is derived for the existence and stability criteria of solutions. The methods given in this paper can also be used to study the synchronization and μ-stability of different DDD NNs. Finally, two numerical examples and their simulation figures are given to illustrate the effectiveness of these results.
Affiliation(s)
- Huamin Wang, Department of Mathematics, Luoyang Normal University, Luoyang, Henan 471934, China
- Guoliang Wei, College of Science, University of Shanghai for Science and Technology, Shanghai 200093, China
- Shiping Wen, Centre for Artificial Intelligence, Faculty of Engineering & Information Technology, University of Technology Sydney, Sydney, 2007, Australia
- Tingwen Huang, Department of Science, Texas A&M University at Qatar, Doha 23874, Qatar
22. Li N, Zheng WX. Passivity Analysis for Quaternion-Valued Memristor-Based Neural Networks With Time-Varying Delay. IEEE Transactions on Neural Networks and Learning Systems 2020; 31:639-650. [PMID: 31021808 DOI: 10.1109/tnnls.2019.2908755]
Abstract
This paper is concerned with the problem of global exponential passivity for quaternion-valued memristor-based neural networks (QVMNNs) with time-varying delay. QVMNNs can be seen as switched systems because the memristor parameters switch according to the states of the network. This is the first time that the global exponential passivity of QVMNNs with time-varying delay has been investigated. By means of a nondecomposition method and a novel Lyapunov functional structured in terms of quaternion self-conjugate matrices, delay-dependent passivity criteria are derived in the form of quaternion-valued linear matrix inequalities (LMIs) as well as complex-valued LMIs. Furthermore, asymptotic stability criteria can be obtained from the proposed passivity criteria. Finally, a numerical example is presented to illustrate the effectiveness of the theoretical results.
23. de Castro FZ, Valle ME. A broad class of discrete-time hypercomplex-valued Hopfield neural networks. Neural Networks 2020; 122:54-67. [DOI: 10.1016/j.neunet.2019.09.040]
24. Kobayashi M. Noise Robust Projection Rule for Hyperbolic Hopfield Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 2020; 31:352-356. [PMID: 30892249 DOI: 10.1109/tnnls.2019.2899914]
Abstract
A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model whose main disadvantage is low noise tolerance. The hyperbolic Hopfield neural network (HHNN) is a noise-robust multistate Hopfield model. In HHNNs employing the projection rule, noise tolerance rapidly worsens as the number of training patterns increases; this is caused by the self-loops. The projection rule for CHNNs improves noise tolerance by removing the self-loops; however, the rule for HHNNs cannot remove them. In this brief, we extend the stability condition for the self-loops of HHNNs and modify the projection rule accordingly, giving HHNNs improved noise tolerance.
26. Uykan Z. On the Working Principle of the Hopfield Neural Networks and Its Equivalence to the GADIA in Optimization. IEEE Transactions on Neural Networks and Learning Systems 2019; 31:3294-3304. [PMID: 31603804 DOI: 10.1109/tnnls.2019.2940920]
Abstract
Hopfield neural networks (HNNs) are among the most well-known and widely used kinds of neural networks in optimization. In this article, the author focuses on building a deeper understanding of the working principle of the HNN during an optimization process. The investigation yields several novel results giving important insights into the working principle of both continuous and discrete HNNs. This article shows that what the traditional HNN actually does as its energy function decreases is divide the neurons into two classes in such a way that the sum of biased class volumes is minimized (or maximized), regardless of the type of optimization problem. Introducing neuron-specific class labels, the author concludes that the traditional discrete HNN is actually a special case of the greedy asynchronous distributed interference avoidance algorithm (GADIA) [17] of Babadi and Tarokh for 2-class optimization problems. The computer results confirm the findings.
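The energy-descent behavior this abstract analyzes can be reproduced in a few lines: with a symmetric zero-diagonal weight matrix, each asynchronous sign update aligns one neuron with its local field and can never increase the Hopfield energy. A small self-contained sketch (ours, not the article's code):

```python
import numpy as np

def energy(W, b, x):
    # Discrete Hopfield energy; asynchronous updates never increase it
    # when W is symmetric with zero diagonal.
    return -0.5 * x @ W @ x - b @ x

def async_step(W, b, x, j):
    # Align neuron j with its local field (W has zero diagonal,
    # so x[j] does not influence its own field).
    x = x.copy()
    x[j] = 1 if W[j] @ x + b[j] >= 0 else -1
    return x

rng = np.random.default_rng(3)
N = 10
W = rng.standard_normal((N, N))
W = (W + W.T) / 2                      # symmetric weights
np.fill_diagonal(W, 0)                 # no self-loops
b = rng.standard_normal(N)
x0 = rng.choice([-1, 1], size=N)

x = x0
for j in rng.permutation(N):           # one asynchronous sweep
    x_next = async_step(W, b, x, j)
    assert energy(W, b, x_next) <= energy(W, b, x) + 1e-12
    x = x_next
```

The monotone energy decrease is exactly the mechanism the article reinterprets as a 2-class partitioning of the neurons.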
27. Chen L, Huang T, Tenreiro Machado J, Lopes AM, Chai Y, Wu R. Delay-dependent criterion for asymptotic stability of a class of fractional-order memristive neural networks with time-varying delays. Neural Networks 2019; 118:289-299. [DOI: 10.1016/j.neunet.2019.07.006]
29. Li Y, Shen S. Almost automorphic solution of quaternion-valued BAM neural networks with time-varying delays on time scales. Journal of Intelligent & Fuzzy Systems 2019. [DOI: 10.3233/jifs-181118]
Affiliation(s)
- Yongkun Li, Department of Mathematics, Yunnan University, Kunming, Yunnan, People's Republic of China
- Shiping Shen, Department of Mathematics, Yunnan University, Kunming, Yunnan, People's Republic of China
30. Xiang M, Scalzo Dees B, Mandic DP. Multiple-Model Adaptive Estimation for 3-D and 4-D Signals: A Widely Linear Quaternion Approach. IEEE Transactions on Neural Networks and Learning Systems 2019; 30:72-84. [PMID: 29993725 DOI: 10.1109/tnnls.2018.2829526]
Abstract
Quaternion state estimation techniques have been used in various applications, yet they are only suitable for dynamical systems represented by a single known model. To deal with model uncertainty, this paper proposes a class of widely linear quaternion multiple-model adaptive estimation (WL-QMMAE) algorithms based on widely linear quaternion Kalman filters and Bayesian inference. Augmented second-order quaternion statistics are employed to capture complete second-order statistical information in improper quaternion signals. Within the WL-QMMAE framework, a widely linear quaternion interacting multiple-model algorithm is proposed to track time-variant model uncertainty, while a widely linear quaternion static multiple-model algorithm is proposed for time-invariant model uncertainty. A performance analysis of the proposed algorithms shows that, as expected, WL-QMMAE reduces to semiwidely linear QMMAE for [Formula: see text]-improper signals and further reduces to strictly linear QMMAE for proper signals. Simulation results indicate that for improper signals, the proposed WL-QMMAE algorithms exhibit enhanced performance over their strictly linear counterparts. The effectiveness of the proposed recursive performance analysis algorithm is also validated.
31. The global exponential pseudo almost periodic synchronization of quaternion-valued cellular neural networks with time-varying delays. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.04.044]
32. Li Y, Qin J, Li B. Anti-periodic Solutions for Quaternion-Valued High-Order Hopfield Neural Networks with Time-Varying Delays. Neural Processing Letters 2018. [DOI: 10.1007/s11063-018-9867-8]
33. Existence and global exponential stability of periodic solutions for quaternion-valued cellular neural networks with time-varying delays. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.02.077]