1. Murthy GR. Toward Optimal Synthesis of Discrete-Time Hopfield Neural Network. IEEE Transactions on Neural Networks and Learning Systems 2023; 34:9549-9554. [PMID: 35333718] [DOI: 10.1109/tnnls.2022.3156107]
Abstract
In this research brief, the relationship between the eigenvectors (with {+1, -1} components) of a synaptic weight matrix W and the stable/anti-stable states of the discrete-time Hopfield associative memory (HAM) is established. Also, the synthesis of W with desired stable/anti-stable states using the spectral representation of W in even/odd dimension is discussed when the threshold vector is a non-zero vector. Freedom in the choice of eigenvalues is exploited to improve the noise immunity of the Hopfield neural network (HNN). Finally, the problem of optimal synthesis of the Hopfield associative memory is formulated.
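As a quick illustration of the eigenvector/stable-state connection described in this abstract, the following Python sketch (our own minimal construction, using a zero threshold rather than the brief's non-zero threshold vector) synthesizes W spectrally from two orthogonal {+1, -1} vectors and checks that they are fixed points of the discrete-time update:

```python
import numpy as np

# Sketch (not the paper's exact construction): build W by a spectral
# representation so that chosen {+1, -1} vectors are eigenvectors with
# positive eigenvalues. Each such eigenvector v satisfies sgn(Wv) = v,
# so it is a stable state of the zero-threshold discrete-time HAM.
def synthesize_W(patterns, eigenvalues):
    """W = sum_k lam_k * v_k v_k^T / n, with each v_k in {+1, -1}^n."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for v, lam in zip(patterns, eigenvalues):
        v = np.asarray(v, dtype=float)
        W += lam * np.outer(v, v) / n
    return W

def update(W, x):
    """Synchronous Hopfield update with zero threshold."""
    return np.where(W @ x >= 0, 1, -1)

v1 = np.array([1, -1, 1, -1])
v2 = np.array([1, 1, -1, -1])           # orthogonal to v1: both exact eigenvectors
W = synthesize_W([v1, v2], [2.0, 1.0])  # freedom in choosing positive eigenvalues

assert np.array_equal(update(W, v1), v1)  # v1 is a fixed (stable) state
assert np.array_equal(update(W, v2), v2)
```

With a zero threshold the negation of each stored vector is also stable, since W(-v) = -λv and sgn(-λv) = -v.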
2. Zhang J, Zhu S, Bao G, Liu X, Wen S. Analysis and Design of Multivalued High-Capacity Associative Memories Based on Delayed Recurrent Neural Networks. IEEE Transactions on Cybernetics 2022; 52:12989-13000. [PMID: 34347620] [DOI: 10.1109/tcyb.2021.3095499]
Abstract
This article aims at analyzing and designing multivalued high-capacity associative memories based on recurrent neural networks with both asynchronous and distributed delays. In order to increase storage capacities, multivalued activation functions are introduced into the associative memories. The stored patterns are retrieved by external input vectors instead of initial conditions, which guarantees accurate associative memories by avoiding spurious equilibrium points. Some sufficient conditions are proposed to ensure the existence, uniqueness, and global exponential stability of the equilibrium point of neural networks with mixed delays. For neural networks with n neurons, m-dimensional input vectors, and 2k-valued activation functions, the autoassociative memories have (2k)^n storage capacity and the heteroassociative memories have min{(2k)^n, (2k)^m} storage capacity. That is, the storage capacities of the associative memories designed in this article are clearly higher than the 2^n and min{2^n, 2^m} storage capacities of the conventional ones. Three examples are given to support the theoretical results.
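The capacity claim can be made concrete with a small sketch; the staircase activation and the function names below are our own illustration, not the article's exact definitions:

```python
import numpy as np

# Illustrative sketch: a saturating 2k-level staircase activation lets each
# neuron hold one of 2k values, so n neurons can represent (2k)**n distinct
# autoassociative patterns, versus 2**n for conventional bipolar neurons.
def staircase(u, k):
    """Map u to one of the 2k odd levels -(2k-1), ..., -1, 1, ..., 2k-1."""
    levels = np.arange(-(2 * k - 1), 2 * k, 2)                      # 2k levels
    idx = np.clip(np.round((u + 2 * k - 1) / 2), 0, 2 * k - 1).astype(int)
    return levels[idx]

def auto_capacity(n, k):
    return (2 * k) ** n

def hetero_capacity(n, m, k):
    return min((2 * k) ** n, (2 * k) ** m)

assert auto_capacity(4, 1) == 2 ** 4        # k = 1 recovers the bipolar case
assert auto_capacity(4, 2) == 256           # 4 neurons, 4-valued: (2k)^n = 4^4
assert hetero_capacity(4, 3, 2) == 4 ** 3   # limited by the smaller dimension
```

For k = 1 the staircase collapses to the usual sign-type bipolar activation, which is why the conventional 2^n capacity appears as a special case.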
3. Associative Memory Synthesis Based on Region Attractive Recurrent Neural Networks. Neural Process Lett 2022. [DOI: 10.1007/s11063-022-10823-8]
4. Zhou C, Zeng X, Luo C, Zhang H. A New Local Bipolar Autoassociative Memory Based on External Inputs of Discrete Recurrent Neural Networks With Time Delay. IEEE Transactions on Neural Networks and Learning Systems 2017; 28:2479-2489. [PMID: 27529878] [DOI: 10.1109/tnnls.2016.2575925]
Abstract
In this paper, local bipolar auto-associative memories are presented based on discrete recurrent neural networks with a class of gain-type activation function. The weight parameters of the neural networks are acquired by a set of inequalities, without a learning procedure. Global exponential stability criteria are established to ensure the accuracy of the restored patterns by considering time delays and external inputs. The proposed methodology is capable of effectively overcoming spurious memory patterns and achieving high memory capacity. The effectiveness, robustness, and fault-tolerance capability are validated by simulated experiments.
Affiliation(s)
- Caigen Zhou, Institute of Intelligence Science and Technology, Hohai University, Nanjing, China
- Xiaoqin Zeng, Institute of Intelligence Science and Technology, Hohai University, Nanjing, China
- Chaomin Luo, Department of Electrical and Computer Engineering, University of Detroit Mercy, Detroit, MI, USA
- Huaguang Zhang, College of Information Science and Engineering, Northeastern University, Shenyang, China
5. A generalized bipolar auto-associative memory model based on discrete recurrent neural networks. Neurocomputing 2015. [DOI: 10.1016/j.neucom.2015.03.052]
6. Zhang H, Huang Y, Wang B, Wang Z. Design and analysis of associative memories based on external inputs of delayed recurrent neural networks. Neurocomputing 2014. [DOI: 10.1016/j.neucom.2013.12.014]
7. Bao G, Zeng Z. Analysis and design of associative memories based on recurrent neural network with discontinuous activation functions. Neurocomputing 2012. [DOI: 10.1016/j.neucom.2011.08.026]
8. Costea RL, Marinov CA. New accurate and flexible design procedure for a stable KWTA continuous time network. IEEE Transactions on Neural Networks 2011; 22:1357-1367. [PMID: 21768047] [DOI: 10.1109/tnn.2011.2154340]
Abstract
The classical continuous-time recurrent (Hopfield) network is considered and adapted to K-winner-take-all operation. The neurons are of sigmoidal type with a controllable gain G and an amplitude m, and are interconnected by the conductance p. The network is intended to process, one by one, a sequence of lists, each with N distinct elements, each squeezed into the [0, I] admission interval, and each having an imposed minimum separation z_min between elements. The network carries out: 1) a matching dynamic process between the order of list elements and the order of outputs, and 2) a binary-type steady-state separation between the K-th and (K+1)-th outputs, the former surpassing a +ξ threshold and the latter falling under the -ξ threshold. As a result, the machine signals the ranks of the K largest elements of the list. To achieve 1), the initial condition of the processing phase has to be placed in a computable θ-vicinity of the zero state; this requires a resetting procedure after each list. To achieve 2), the bias current M has to lie within a certain interval computable from the circuit parameters. In addition, the steady state should be asymptotically stable. To these ends, we work with high gain and exploit the sigmoid properties and the network symmetry. The various inequality-type constraints between parameters are shown to be compatible, and a neat synthesis procedure, simple and flexible, is given for the tanh sigmoid. It starts with the given parameters N, K, I, z_min, and m, and computes simple bounds on p, G, ξ, θ, and M. Numerical tests and comments reveal the qualities and shortcomings of the method.
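Functionally, the steady state encodes the ranks of the K largest list elements, with winners above +ξ and losers below -ξ. The following sketch (our own code, not the paper's circuit equations or synthesis procedure) expresses that input-output specification:

```python
import numpy as np

# Functional specification of KWTA operation: given a list of N distinct
# values, identify the ranks of the K largest. The trained network's
# steady-state outputs encode the same answer as a +/- separation.
def kwta_ranks(values, K):
    """Indices of the K largest elements of the list."""
    order = np.argsort(values)[::-1]   # indices sorted by descending value
    return set(order[:K].tolist())

def kwta_outputs(values, K):
    """Binary-type separation: +1 for the K winners, -1 for the rest
    (standing in for 'above +xi' / 'below -xi')."""
    winners = kwta_ranks(values, K)
    return np.array([1 if i in winners else -1 for i in range(len(values))])

vals = np.array([0.9, 0.1, 0.7, 0.4, 0.8])
assert kwta_ranks(vals, 3) == {0, 4, 2}
assert list(kwta_outputs(vals, 2)) == [1, -1, -1, -1, 1]
```

The paper's contribution is in realizing this specification with a stable analog circuit and computable parameter bounds; the sketch only fixes what "correct operation" means.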
Affiliation(s)
- Ruxandra L Costea, Department of Electrical Engineering, Polytechnic University of Bucharest, Bucharest, Romania
9. Zheng P, Zhang J, Tang W. Learning Associative Memories by Error Backpropagation. IEEE Transactions on Neural Networks 2011; 22:347-355. [DOI: 10.1109/tnn.2010.2099239]
10. Fornarelli G, Giaquinto A. Adaptive particle swarm optimization for CNN associative memories design. Neurocomputing 2009. [DOI: 10.1016/j.neucom.2009.05.004]
11. Chartier S, Boukadoum M, Amiri M. BAM Learning of Nonlinearly Separable Tasks by Using an Asymmetrical Output Function and Reinforcement Learning. IEEE Transactions on Neural Networks 2009; 20:1281-1292. [DOI: 10.1109/tnn.2009.2023120]
12. Zeng Z, Wang J. Associative memories based on continuous-time cellular neural networks designed using space-invariant cloning templates. Neural Netw 2009; 22:651-657. [DOI: 10.1016/j.neunet.2009.06.031]
14. Ferrari S. Multiobjective algebraic synthesis of neural control systems by implicit model following. IEEE Transactions on Neural Networks 2009; 20:406-419. [PMID: 19203887] [DOI: 10.1109/tnn.2008.2008332]
Abstract
The advantages brought about by using classical linear control theory in conjunction with neural approximators have long been recognized in the literature. In particular, using linear controllers to obtain the starting neural control design has been shown to be a key step for the successful development and implementation of adaptive-critic neural controllers. Despite their adaptive capabilities, neural controllers are often criticized for not providing the same performance and stability guarantees as classical linear designs. Therefore, this paper develops an algebraic synthesis procedure for designing dynamic output-feedback neural controllers that are closed-loop stable and meet the same performance objectives as any classical linear design. The performance synthesis problem is addressed by deriving implicit model-following algebraic relationships between model matrices, obtained from the classical design, and the neural control parameters. Additional linear matrix inequality (LMI) conditions for closed-loop exponential stability of the neural controller are derived using existing integral quadratic constraints (IQCs) for operators with repeated slope-restricted nonlinearities. The approach is demonstrated by designing a recurrent neural network controller for a highly maneuverable tailfin-controlled missile that meets multiple design objectives, including pole placement for transient tuning, H∞ and H2 performance in the presence of parameter uncertainty, and command-input tracking.
Affiliation(s)
- Silvia Ferrari, Department of Mechanical Engineering & Materials Science, Duke University, Durham, NC 27708-0005, USA
15. Zeng Z, Wang J. Design and Analysis of High-Capacity Associative Memories Based on a Class of Discrete-Time Recurrent Neural Networks. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 2008; 38:1525-1536. [DOI: 10.1109/tsmcb.2008.927717]
16. Wu Y, Batalama SN. Improved one-shot learning for feedforward associative memories with application to composite pattern association. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 2001; 31:119-125. [PMID: 18244773] [DOI: 10.1109/3477.907570]
Abstract
The local identical index (LII) associative memory (AM) proposed by the authors in a previous paper is a one-shot feedforward structure designed to exhibit no spurious attractors. In this paper, we relax the latter design constraint in exchange for enlarged basins of attraction, and we develop a family of modified LII AM networks that exhibit improved performance, particularly in memorizing highly correlated patterns. The new algorithm meets the requirement of no spurious attractors only in a local sense. Finally, we show that the modified LII family of networks can accommodate composite patterns of any size by storing (memorizing) only the basic (prime) prototype patterns. The latter property translates to low learning complexity and a simple network structure with significant memory savings. Simulation studies and comparisons illustrate and support the theoretical developments.
Affiliation(s)
- Y Wu, Department of Electrical Engineering, State University of New York, Buffalo, NY, USA
17. Zeng Z, Wang J. Analysis and Design of Associative Memories Based on Recurrent Neural Networks with Linear Saturation Activation Functions and Time-Varying Delays. Neural Comput 2007; 19:2149-2182. [PMID: 17571941] [DOI: 10.1162/neco.2007.19.8.2149]
Abstract
In this letter, some sufficient conditions are obtained to guarantee that recurrent neural networks with linear saturation activation functions and time-varying delays have multiple equilibria located in the saturation region and on the boundaries of the saturation region. These results on pattern characterization are used to analyze and design autoassociative memories that are directly based on the parameters of the neural networks. Moreover, a formula for the number of spurious equilibria is also derived. Four design procedures for recurrent neural networks with linear saturation activation functions and time-varying delays are developed based on the stability results. Two of these procedures allow the neural network to be capable of learning and forgetting. Finally, simulation results demonstrate the validity and characteristics of the proposed approach.
Affiliation(s)
- Zhigang Zeng, School of Automation, Wuhan University of Technology, Wuhan, Hubei 430070, China
18. Casali D, Costantini G, Perfetti R, Ricci E. Associative memory design using support vector machines. IEEE Transactions on Neural Networks 2006; 17:1165-1174. [PMID: 17001978] [DOI: 10.1109/tnn.2006.877539]
Abstract
The relation between support vector machines (SVMs) and recurrent associative memories is investigated. The design of associative memories based on the generalized brain-state-in-a-box (GBSB) neural model is formulated as a set of independent classification tasks that can be efficiently solved by standard software packages for SVM learning. Some properties of the networks designed in this way are highlighted, such as the fact that, surprisingly, they follow a generalized Hebb's law. The performance of the SVM approach is compared to that of existing methods with nonsymmetric connections through some design examples.
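The row-wise formulation can be sketched as follows; here a plain perceptron stands in for the SVM solver, and all names are ours rather than the paper's:

```python
import numpy as np

# Sketch of the "set of independent classification tasks" idea: each row of
# the weight matrix W is learned as an independent linear classifier whose
# target is the corresponding bit of every stored pattern, so that
# sgn(W p) = p holds for each prototype p (making p a fixed point).
def design_by_classification(patterns, epochs=100, lr=0.1):
    P = np.asarray(patterns, dtype=float)
    n = P.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        for p in P:
            pred = np.where(W @ p >= 0, 1.0, -1.0)
            # (p - pred)/2 is 0 where a row already classifies correctly and
            # +/-1 where it errs, giving the standard per-row perceptron step.
            W += lr * np.outer((p - pred) / 2, p)
    return W

v1 = np.array([1, -1, 1, -1])
v2 = np.array([1, 1, -1, -1])
W = design_by_classification([v1, v2])
assert np.array_equal(np.where(W @ v1 >= 0, 1, -1), v1)  # prototypes are fixed points
assert np.array_equal(np.where(W @ v2 >= 0, 1, -1), v2)
```

Replacing the perceptron step with a max-margin (SVM) solution for each row is what gives the paper's designs their enlarged basins of attraction; note that nothing here forces W to be symmetric.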
Affiliation(s)
- Daniele Casali, Department of Electronic Engineering, University of Rome "Tor Vergata", 00100 Rome, Italy
19. Lee DL, Chuang TC. Designing Asymmetric Hopfield-Type Associative Memory With Higher Order Hamming Stability. IEEE Transactions on Neural Networks 2005; 16:1464-1476. [PMID: 16342488] [DOI: 10.1109/tnn.2005.852863]
Abstract
The problem of optimal asymmetric Hopfield-type associative memory (HAM) design based on perceptron-type learning algorithms is considered. It is found that most existing methods treat the design problem as either 1) finding optimal hyperplanes according to the normal distance from the prototype vectors to the hyperplane surface, or 2) obtaining the weight matrix W = [w_ij] by solving a constrained optimization problem. In this paper, we show that since the state space of the HAM consists of only bipolar patterns, i.e., v = (v_1, v_2, ..., v_N)^T ∈ {-1, +1}^N, the basins of attraction around each prototype (training) vector should be expanded using the Hamming distance measure. For this reason, the design problem is considered from a different point of view. Our idea is to systematically increase the size of the training set according to the desired basin of attraction around each prototype vector. We name this concept higher order Hamming stability and show that the conventional minimum-overlap algorithm can be modified to incorporate it. Experimental results show that both the recall capability and the number of spurious memories are improved by the proposed method. Moreover, it is well known that setting all self-connections w_ii to zero has the effect of reducing the number of spurious memories in state space. From the experimental results, we find that the basin width around each prototype vector can be enlarged by allowing nonzero diagonal elements when learning the weight matrix W. If the magnitude of w_ii is small for all i, then the condition w_ii = 0 for all i can be relaxed without seriously affecting the number of spurious memories in the state space. Therefore, the method proposed in this paper can be used to increase the basin width around each prototype vector at the cost of slightly increasing the number of spurious memories in the state space.
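The training-set expansion behind higher order Hamming stability can be sketched as enumerating a Hamming ball around each prototype and training on the enlarged set (function names are ours, not the paper's):

```python
from itertools import combinations

import numpy as np

# Sketch of the training-set expansion: every bipolar vector within a chosen
# Hamming radius of a prototype is added to the training set, so a learning
# algorithm is forced to map the whole ball onto the prototype's basin.
def hamming_ball(prototype, radius):
    """All bipolar vectors within the given Hamming distance of the prototype."""
    v = np.asarray(prototype)
    ball = [v.copy()]
    for r in range(1, radius + 1):
        for flips in combinations(range(len(v)), r):
            u = v.copy()
            u[list(flips)] *= -1      # flip the chosen r components
            ball.append(u)
    return ball

ball = hamming_ball(np.array([1, -1, 1, 1]), 1)
assert len(ball) == 1 + 4   # the prototype plus its 4 one-bit neighbours
```

The ball of radius d around an N-bit prototype contains sum_{r=0..d} C(N, r) vectors, so the training-set size (and hence learning cost) grows quickly with the desired basin radius.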
Affiliation(s)
- Donq-Liang Lee, Department of Computer Science and Information Engineering, Vanung University, Chung-Li 32056, Taiwan, ROC
20. Park J, Kim HY, Park Y, Lee SW. A synthesis procedure for associative memories based on space-varying cellular neural networks. Neural Netw 2001; 14:107-113. [PMID: 11213209] [DOI: 10.1016/s0893-6080(00)00086-1]
Abstract
In this paper, we consider the problem of realizing associative memories via space-varying CNNs (cellular neural networks). Based on some known results and a newly derived theorem for the CNN model, we propose a synthesis procedure for obtaining a space-varying CNN that can store given bipolar vectors with certain desirable properties. The major part of our synthesis procedure consists of solving generalized eigenvalue problems and/or linear matrix inequality problems, which can be efficiently solved by recently developed interior point methods. The validity of the proposed approach is illustrated by a design example.
Affiliation(s)
- J Park, Department of Control and Instrumentation Engineering, Korea University, Chochiwon, Chungnam, South Korea
21. Synthesis approach for bidirectional associative memories based on the perceptron training algorithm. Neurocomputing 2000. [DOI: 10.1016/s0925-2312(00)00313-1]