51
52
Fixed points of symmetric complex-valued Hopfield neural networks. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2017.05.006]
53
54
55
Tan Y, Tang S, Yang J, Liu Z. Robust stability analysis of impulsive complex-valued neural networks with time delays and parameter uncertainties. J Inequal Appl 2017; 2017:215. [PMID: 28959116 PMCID: PMC5594064 DOI: 10.1186/s13660-017-1490-0]
Abstract
The present study considers the robust stability of impulsive complex-valued neural networks (CVNNs) with discrete time delays. By applying the homeomorphic mapping theorem and some inequalities in the complex domain, sufficient conditions are obtained for the existence and uniqueness of the equilibrium of the CVNNs. By constructing appropriate Lyapunov-Krasovskii functionals and employing complex-valued matrix inequality techniques, conditions guaranteeing global robust stability are derived. A numerical simulation illustrates the correctness of the proposed theoretical results.
Affiliation(s)
- Yuanshun Tan
- College of Mathematics and Statistics, Chongqing Jiaotong University, Chongqing 400074, China
- Sanyi Tang
- College of Mathematics and Statistics, Shaanxi Normal University, Shaanxi 710062, China
- Jin Yang
- College of Mathematics and Statistics, Chongqing Jiaotong University, Chongqing 400074, China
- Zijian Liu
- College of Mathematics and Statistics, Chongqing Jiaotong University, Chongqing 400074, China
56
Chen X, Li Z, Song Q, Hu J, Tan Y. Robust stability analysis of quaternion-valued neural networks with time delays and parameter uncertainties. Neural Netw 2017; 91:55-65. [DOI: 10.1016/j.neunet.2017.04.006]
57
Robust fixed-time synchronization for uncertain complex-valued neural networks with discontinuous activation functions. Neural Netw 2017; 90:42-55. [PMID: 28388472 DOI: 10.1016/j.neunet.2017.03.006]
58
Subramanian K, Muthukumar P. Global asymptotic stability of complex-valued neural networks with additive time-varying delays. Cogn Neurodyn 2017; 11:293-306. [PMID: 28559957 DOI: 10.1007/s11571-017-9429-1]
Abstract
In this paper, we study the global asymptotic stability of complex-valued neural networks with leakage delay and additive time-varying delays. By constructing a suitable Lyapunov-Krasovskii functional and applying newly developed complex-valued integral inequalities, sufficient conditions for the global asymptotic stability of the proposed neural networks are established in the form of complex-valued linear matrix inequalities. These linear matrix inequalities are efficiently solved using standard numerical packages. Finally, three numerical examples are given to demonstrate the effectiveness of the theoretical results.
Affiliation(s)
- K Subramanian
- Department of Mathematics, The Gandhigram Rural Institute - Deemed University, Gandhigram, Tamilnadu 624 302, India
- P Muthukumar
- Department of Mathematics, The Gandhigram Rural Institute - Deemed University, Gandhigram, Tamilnadu 624 302, India
59
Gong W, Liang J, Kan X, Wang L, Dobaie AM. Robust state estimation for stochastic complex-valued neural networks with sampled-data. Neural Comput Appl 2017. [DOI: 10.1007/s00521-017-3030-8]
60
Kobayashi M. Fast recall for complex-valued Hopfield neural networks with projection rules. Comput Intell Neurosci 2017; 2017:4894278. [PMID: 28553351 PMCID: PMC5434469 DOI: 10.1155/2017/4894278]
Abstract
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is the complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multiple states, so CHNNs are suitable for storing multilevel data, such as gray-scale images. However, CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and escaping local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. Through computer simulations, we show that the proposed recall algorithm not only accelerates recall but also improves noise tolerance.
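As a hedged illustration of the multistate (phasor) neurons underlying a CHNN, the sketch below implements a minimal asynchronous recall loop with K quantized phase states; the quantization function `csign` and the Hebbian-style storage rule are standard textbook choices, not the recall algorithm proposed in this entry.

```python
import numpy as np

K = 4                                         # phase states per neuron
states = np.exp(2j * np.pi * np.arange(K) / K)

def csign(z):
    """Quantize a complex value to the nearest of the K unit phasors."""
    z = np.asarray(z)
    return states[np.argmax((z[..., None] * states.conj()).real, axis=-1)]

def store(patterns):
    """Hebbian-style storage; rows of `patterns` are phasor vectors."""
    P = np.asarray(patterns)
    W = P.T @ P.conj() / P.shape[1]           # W[i, j] = sum_mu p_i conj(p_j)
    np.fill_diagonal(W, 0)                    # no self-feedback
    return W

def recall(W, x, sweeps=10):
    x = x.copy()
    for _ in range(sweeps):
        for i in range(len(x)):               # asynchronous update
            x[i] = csign(W[i] @ x)
    return x

pattern = states[np.array([0, 1, 2, 3, 0, 1])]
W = store([pattern])
noisy = pattern.copy()
noisy[0] = states[2]                          # corrupt one neuron
print(np.allclose(recall(W, noisy), pattern))
```

Asynchronous update of this kind is the setting in which Hermitian-weight CHNNs are known to converge, which is why the corrupted component is restored here.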
Affiliation(s)
- Masaki Kobayashi
- Mathematical Science Center, University of Yamanashi, Takeda 4-3-11, Kofu, Yamanashi 400-8511, Japan
61
62
63
Kobayashi M. Symmetric complex-valued Hopfield neural networks. IEEE Trans Neural Netw Learn Syst 2017; 28:1011-1015. [PMID: 26849875 DOI: 10.1109/tnnls.2016.2518672]
Abstract
Complex-valued neural networks, which are extensions of ordinary neural networks, have been studied as interesting models by many researchers. In particular, complex-valued Hopfield neural networks (CHNNs) have been used to process multilevel data, such as gray-scale images. CHNNs with Hermitian connection weights always converge under asynchronous update. However, the noise tolerance of CHNNs deteriorates sharply as the resolution increases; noise tolerance is one of the most pressing problems for CHNNs, and it is known that rotational invariance reduces it. In this brief, we propose symmetric CHNNs (SCHNNs), which have symmetric connection weights. We define their energy function and prove that SCHNNs always converge. In addition, we show through computer simulations that SCHNNs improve noise tolerance, and we explain this improvement from the standpoint of rotational invariance.
64
Dynamical behavior of complex-valued Hopfield neural networks with discontinuous activation functions. Neural Process Lett 2016. [DOI: 10.1007/s11063-016-9563-5]
65
Liang J, Gong W, Huang T. Multistability of complex-valued neural networks with discontinuous activation functions. Neural Netw 2016; 84:125-142. [PMID: 27718391 DOI: 10.1016/j.neunet.2016.08.008]
Abstract
In this paper, based on the geometrical properties of the discontinuous activation functions and Brouwer's fixed point theorem, the multistability issue is tackled for complex-valued neural networks with discontinuous activation functions and time-varying delays. To handle the discontinuous activations, the Filippov solution of the system is defined. Through rigorous analysis, several sufficient criteria are obtained to ensure the existence of 25^n equilibrium points. Among them, 9^n points are locally stable and 16^n - 9^n equilibrium points are unstable. Furthermore, some mild conditions are imposed to enlarge the attraction basins of the 9^n locally stable equilibrium points. Finally, a numerical example is provided to illustrate the effectiveness of the obtained results.
Affiliation(s)
- Jinling Liang
- Department of Mathematics, Southeast University, Nanjing 210096, China
- Weiqiang Gong
- Department of Mathematics, Southeast University, Nanjing 210096, China
66
Synchronization of fractional-order complex-valued neural networks with time delay. Neural Netw 2016; 81:16-28. [DOI: 10.1016/j.neunet.2016.05.003]
67
68
Gong W, Liang J, Zhang C. Multistability of complex-valued neural networks with distributed delays. Neural Comput Appl 2016. [DOI: 10.1007/s00521-016-2305-9]
69
Chen X, Song Q, Zhao Z, Liu Y. Global μ-stability analysis of discrete-time complex-valued neural networks with leakage delay and mixed delays. Neurocomputing 2016. [DOI: 10.1016/j.neucom.2015.10.120]
70
New stability condition for discrete-time fully coupled neural networks with multivalued neurons. Neurocomputing 2015. [DOI: 10.1016/j.neucom.2015.04.036]
71
Velmurugan G, Rakkiyappan R, Cao J. Further analysis of global μ-stability of complex-valued neural networks with unbounded time-varying delays. Neural Netw 2015; 67:14-27. [DOI: 10.1016/j.neunet.2015.03.007]
72
Rakkiyappan R, Velmurugan G, Cao J. Stability analysis of memristor-based fractional-order neural networks with different memductance functions. Cogn Neurodyn 2015; 9:145-177. [PMID: 25861402 PMCID: PMC4384520 DOI: 10.1007/s11571-014-9312-2]
Abstract
In this paper, the existence, uniqueness and uniform stability of memristor-based fractional-order neural networks (MFNNs) with two different types of memductance functions are extensively investigated. Moreover, we formulate the corresponding complex-valued memristor-based fractional-order neural networks (CVMFNNs) and analyze their existence, uniqueness and uniform stability. By using the Banach contraction principle and analysis techniques, some sufficient conditions are obtained to ensure the existence, uniqueness and uniform stability of the considered MFNNs and CVMFNNs. The analysis builds on the theory of fractional-order differential equations with discontinuous right-hand sides. Finally, four numerical examples are presented to show the effectiveness of our theoretical results.
Affiliation(s)
- R. Rakkiyappan
- Department of Mathematics, Bharathiar University, Coimbatore 641 046, Tamilnadu, India
- G. Velmurugan
- Department of Mathematics, Bharathiar University, Coimbatore 641 046, Tamilnadu, India
- Jinde Cao
- Department of Mathematics, and Research Center for Complex Systems and Network Sciences, Southeast University, Nanjing 210096, Jiangsu, China
- Department of Mathematics, Faculty of Science, King Abdulaziz University, Jeddah 21589, Saudi Arabia
73
Abstract
This letter investigates the characteristics of the complex-valued neuron model whose parameters are represented by polar coordinates (called the polar variable complex-valued neuron). The parameters of the polar variable complex-valued neuron are unidentifiable, and the plateau phenomenon can occur during its learning. Computer simulations suggest that, when trained by the steepest gradient-descent method with the square error, a single polar variable complex-valued neuron has the following characteristics: (1) unidentifiable parameters (singular points) degrade the learning speed, and (2) a plateau can occur during learning; when the weight is attracted to a singular point, learning tends to become stuck. However, computer simulations also show that the steepest gradient-descent method with the amplitude-phase error and the complex-valued natural gradient method can reduce the effects of the singular points. The learning dynamics near singular points thus depend on the error function and the training algorithm used.
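The singular-point effect described above can be seen in a minimal sketch, assuming a single linear neuron with weight parametrized as w = r * exp(1j * theta); all function names below are illustrative, not from the letter. The gradient of the square error with respect to the phase scales with r, so near the singular point r = 0 the phase barely moves and learning plateaus.

```python
import numpy as np

def output(r, theta, x):
    # polar-variable weight: w = r * exp(1j * theta)
    return r * np.exp(1j * theta) * x

def square_error(r, theta, x, t):
    return 0.5 * abs(output(r, theta, x) - t) ** 2

def grad_theta(r, theta, x, t, h=1e-6):
    # numerical dE/dtheta by central differences
    return (square_error(r, theta + h, x, t)
            - square_error(r, theta - h, x, t)) / (2 * h)

x, t = 1.0 + 0.0j, np.exp(1j * 0.9)   # target with phase 0.9
# The phase gradient is r * sin(theta - 0.9); it vanishes as r -> 0,
# so near the singular point theta is effectively frozen.
print(abs(grad_theta(1.0, 0.0, x, t)) > abs(grad_theta(1e-4, 0.0, x, t)))
```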
Affiliation(s)
- Tohru Nitta
- National Institute of Advanced Industrial Science and Technology, Tsukuba, Ibaraki 305-8568, Japan
74
Rakkiyappan R, Velmurugan G, Cao J. Multiple μ-stability analysis of complex-valued neural networks with unbounded time-varying delays. Neurocomputing 2015. [DOI: 10.1016/j.neucom.2014.08.015]
75
76
Valle ME. Complex-valued recurrent correlation neural networks. IEEE Trans Neural Netw Learn Syst 2014; 25:1600-1612. [PMID: 25095268 DOI: 10.1109/tnnls.2014.2341013]
Abstract
In this paper, we generalize the bipolar recurrent correlation neural networks (RCNNs) of Chiueh and Goodman to patterns whose components lie on the complex unit circle. The novel networks, referred to as complex-valued RCNNs (CV-RCNNs), are characterized by a possibly nonlinear function applied to the real part of the scalar product of the current state and the original patterns. We show that CV-RCNNs always converge to a stationary state; thus, they have potential application as associative memories. In this context, we provide sufficient conditions for the retrieval of a memorized vector. Furthermore, computational experiments on the reconstruction of corrupted grayscale images reveal that certain CV-RCNNs exhibit excellent noise tolerance.
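A hedged sketch of the recurrent-correlation idea: each stored pattern is weighted by a nonlinear function of the real part of its scalar product with the current state, and the weighted superposition is projected back onto the unit circle. The exponential choice of the nonlinearity and the unit-circle projection are assumptions in the spirit of the Chiueh-Goodman exponential variant, not the exact network of this entry.

```python
import numpy as np

def cv_rcnn_step(x, P, alpha=4.0):
    """One synchronous step of a complex-valued recurrent correlation
    network (sketch). Rows of P are stored unit-phasor patterns."""
    n = P.shape[1]
    # excitation: nonlinear (here exponential) function of the real part
    # of the scalar product between the state and each stored pattern
    w = np.exp(alpha * (P.conj() @ x).real / n)
    y = w @ P                        # weighted superposition of patterns
    return y / np.abs(y)             # project back onto the unit circle

rng = np.random.default_rng(0)
P = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(3, 64)))   # 3 patterns
x = P[0] * np.exp(1j * rng.normal(0, 0.3, 64))             # noisy copy
for _ in range(10):
    x = cv_rcnn_step(x, P)
print(np.mean(np.abs(x - P[0])) < 0.1)
```

The exponential weighting sharply favors the pattern most aligned with the state, which is what drives the iteration toward the stored phasor vector.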
77
Passivity analysis of memristor-based complex-valued neural networks with time-varying delays. Neural Process Lett 2014. [DOI: 10.1007/s11063-014-9371-8]
78
Mostafa M, Teich WG, Lindner J. Local stability analysis of discrete-time, continuous-state, complex-valued recurrent neural networks with inner state feedback. IEEE Trans Neural Netw Learn Syst 2014; 25:830-836. [PMID: 24807960 DOI: 10.1109/tnnls.2013.2281217]
Abstract
Recurrent neural networks (RNNs) are well known for their capability to minimize suitable cost functions without the need for a training phase. This is possible because they can be Lyapunov stable. Although global stability analysis has attracted a lot of interest, local stability is desirable for specific applications. In this brief, we investigate the local asymptotic stability of two classes of discrete-time, continuous-state, complex-valued RNNs with parallel update and inner state feedback. We show that many already known results are special cases of the results obtained here, and we generalize some known results from the real-valued case to the complex-valued one. Finally, we investigate stability in the presence of time-variant activation functions. The complex-valued activation functions considered in this brief are separable with respect to their real and imaginary parts.
79
Chen X, Song Q. Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales. Neurocomputing 2013. [DOI: 10.1016/j.neucom.2013.04.040]
80
Oku M, Makino T, Aihara K. Pseudo-orthogonalization of memory patterns for associative memory. IEEE Trans Neural Netw Learn Syst 2013; 24:1877-1887. [PMID: 24808619 DOI: 10.1109/tnnls.2013.2268542]
Abstract
A new method for improving the storage capacity of associative memory models on a neural network is proposed. The storage capacity of the network increases in proportion to the network size in the case of random patterns, but, in general, the capacity suffers from correlation among memory patterns. Numerous solutions to this problem have been proposed so far, but their high computational cost limits their scalability. In this paper, we propose a novel and simple solution that is locally computable without any iteration. Our method involves XNOR masking of the original memory patterns with random patterns, and the masked patterns and masks are concatenated. The resulting decorrelated patterns allow higher storage capacity at the cost of the pattern length. Furthermore, the increase in the pattern length can be reduced through blockwise masking, which results in a small amount of capacity loss. Movie replay and image recognition are presented as examples to demonstrate the scalability of the proposed method.
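The masking step described above can be sketched as follows; for ±1-encoded patterns, XNOR is simply elementwise multiplication, and the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def pseudo_orthogonalize(patterns, rng):
    """Decorrelate ±1 memory patterns by XNOR-masking them with random
    patterns and concatenating the masked patterns with their masks."""
    P = np.asarray(patterns)                         # shape (m, n), entries ±1
    masks = rng.choice([-1, 1], size=P.shape)        # one random mask per pattern
    masked = P * masks                               # XNOR under ±1 encoding
    return np.concatenate([masked, masks], axis=1)   # pattern length doubles

rng = np.random.default_rng(1)
n = 200
base = rng.choice([-1, 1], size=n)
# two highly correlated patterns (they differ in 10 of 200 positions)
p1, p2 = base.copy(), base.copy()
p2[:10] *= -1
decorrelated = pseudo_orthogonalize([p1, p2], rng)
print(abs(p1 @ p2) / n)                              # 0.9: strong overlap
print(abs(decorrelated[0] @ decorrelated[1]) / (2 * n))  # near 0 after masking
```

Because the masks are independent random ±1 vectors, both the masked halves and the mask halves of distinct patterns have near-zero overlap, which is exactly the decorrelation that raises storage capacity at the cost of doubled pattern length.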
81
Savitha R, Suresh S, Sundararajan N. Projection-based fast learning fully complex-valued relaxation neural network. IEEE Trans Neural Netw Learn Syst 2013; 24:529-541. [PMID: 24808375 DOI: 10.1109/tnnls.2012.2235460]
Abstract
This paper presents a fully complex-valued relaxation network (FCRN) with its projection-based learning algorithm. The FCRN is a single-hidden-layer network with a Gaussian-like sech activation function in the hidden layer and an exponential activation function in the output layer. For a given number of hidden neurons, the input weights are assigned randomly and the output weights are estimated by minimizing a nonlinear logarithmic function (called an energy function) that explicitly contains both the magnitude and phase errors. A projection-based learning algorithm determines the optimal output weights corresponding to the minimum of the energy function by converting the nonlinear programming problem into one of solving a set of simultaneous linear algebraic equations. The resultant FCRN approximates the desired output more accurately with lower computational effort. The classification ability of the FCRN is evaluated using a set of real-valued benchmark classification problems from the University of California, Irvine machine learning repository, where a circular transformation is used to map the real-valued input features to the complex domain. The FCRN is then used to solve three practical problems: quadrature amplitude modulation channel equalization, adaptive beamforming, and mammogram classification. The performance results clearly indicate the superior classification/approximation performance of the FCRN.
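The projection-based idea of fixing random hidden weights and obtaining output weights from one linear solve can be sketched in miniature. The `sech` hidden activation follows the abstract, while the plain complex least-squares solve (in place of the paper's energy function with explicit magnitude and phase errors) is a simplifying assumption, as are all names below.

```python
import numpy as np

def sech(z):
    return 1.0 / np.cosh(z)

rng = np.random.default_rng(2)
n_in, n_hidden, n_samples = 3, 40, 200

# complex-valued inputs and a target function to approximate
X = rng.normal(size=(n_samples, n_in)) + 1j * rng.normal(size=(n_samples, n_in))
t = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]

# random input weights, never trained
W_in = 0.3 * (rng.normal(size=(n_in, n_hidden))
              + 1j * rng.normal(size=(n_in, n_hidden)))
H = sech(X @ W_in)                            # hidden-layer responses
# output weights from a single least-squares solve, no iterative training
W_out, *_ = np.linalg.lstsq(H, t, rcond=None)
residual = np.mean(np.abs(H @ W_out - t) ** 2)
print(residual < np.mean(np.abs(t) ** 2))     # the fit beats the zero model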
Collapse
|
82
|
Hirose A, Yoshida S. Generalization characteristics of complex-valued feedforward neural networks in relation to signal coherence. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 2012; 23:541-551. [PMID: 24805038 DOI: 10.1109/tnnls.2012.2183613] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
Applications of complex-valued neural networks (CVNNs) have expanded widely in recent years-in particular in radar and coherent imaging systems. In general, the most important merit of neural networks lies in their generalization ability. This paper compares the generalization characteristics of complex-valued and real-valued feedforward neural networks in terms of the coherence of the signals to be dealt with. We assume a task of function approximation such as interpolation of temporal signals. Simulation and real-world experiments demonstrate that CVNNs with amplitude-phase-type activation function show smaller generalization error than real-valued networks, such as bivariate and dual-univariate real-valued neural networks. Based on the results, we discuss how the generalization characteristics are influenced by the coherence of the signals depending on the degree of freedom in the learning and on the circularity in neural dynamics.
Collapse
|
83
|
Suresh S, Savitha R, Sundararajan N. A Sequential Learning Algorithm for Complex-Valued Self-Regulating Resource Allocation Network-CSRAN. ACTA ACUST UNITED AC 2011; 22:1061-72. [DOI: 10.1109/tnn.2011.2144618] [Citation(s) in RCA: 72] [Impact Index Per Article: 5.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
|
84
|
Tripathi BK, Kalra PK. On Efficient Learning Machine With Root-Power Mean Neuron in Complex Domain. ACTA ACUST UNITED AC 2011; 22:727-38. [DOI: 10.1109/tnn.2011.2115251] [Citation(s) in RCA: 47] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
|
85
|
Valle ME. Sparsely connected autoassociative fuzzy implicative memories and their application for the reconstruction of large gray-scale images. Neurocomputing 2010. [DOI: 10.1016/j.neucom.2010.03.017] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
|
86
|
|