1. Lin HC, Zeng HB, Zhang XM, Wang W. Stability Analysis for Delayed Neural Networks via a Generalized Reciprocally Convex Inequality. IEEE Transactions on Neural Networks and Learning Systems 2023; 34:7491-7499. [PMID: 35108209] [DOI: 10.1109/tnnls.2022.3144032]
Abstract
This article deals with the stability of neural networks (NNs) with time-varying delay. First, a generalized reciprocally convex inequality (RCI) is presented, providing a tight bound for reciprocally convex combinations. This inequality includes several existing ones as special cases. Second, to make full use of the generalized RCI, a novel Lyapunov-Krasovskii functional (LKF) is constructed, which includes a generalized delay-product term. Third, based on the generalized RCI and the novel LKF, several stability criteria for the delayed NNs under study are put forward. Finally, two numerical examples are given to illustrate the effectiveness and advantages of the proposed stability criteria.
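For orientation, the baseline reciprocally convex combination lemma that such generalized RCIs extend can be sketched as follows (the classical result, not the generalized inequality of this article): for a scalar \alpha \in (0,1), vectors x_1, x_2, and symmetric matrices R_1 > 0, R_2 > 0, if a matrix S exists such that \begin{bmatrix} R_1 & S \\ S^\top & R_2 \end{bmatrix} \ge 0, then

\frac{1}{\alpha} x_1^\top R_1 x_1 + \frac{1}{1-\alpha} x_2^\top R_2 x_2 \;\ge\; \begin{bmatrix} x_1 \\ x_2 \end{bmatrix}^\top \begin{bmatrix} R_1 & S \\ S^\top & R_2 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix},

which is the tight bound for reciprocally convex combinations that generalized RCIs refine.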
2. Kwon OM, Lee SH, Park MJ. Some Novel Results on Stability Analysis of Generalized Neural Networks With Time-Varying Delays via Augmented Approach. IEEE Transactions on Cybernetics 2022; 52:2238-2248. [PMID: 32886616] [DOI: 10.1109/tcyb.2020.3001341]
Abstract
This article proposes three new methods, based on the Lyapunov method, to enlarge the feasible region for guaranteeing the stability of generalized neural networks with time-varying delays. First, two new zero equalities in which three states are augmented are proposed and, for the first time, inserted into the time derivative of the constructed Lyapunov-Krasovskii functionals. Second, inspired by the Wirtinger-based integral inequality, new Lyapunov-Krasovskii functionals are introduced. Finally, by utilizing the relationships among the augmented vectors and the original system equation, newly augmented zero equalities are established and Finsler's lemma is applied. Through three numerical examples, it is verified that the proposed methods enlarge the allowable region of maximum delay bounds.
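For reference, the Wirtinger-based integral inequality invoked above (Seuret and Gouaisbaut) is commonly stated as follows; this is the standard form rather than the augmented construction of this article: for a symmetric matrix R > 0 and a function x differentiable on [a, b],

\int_a^b \dot{x}^\top(s) R \dot{x}(s)\, ds \;\ge\; \frac{1}{b-a}\big(x(b)-x(a)\big)^\top R \big(x(b)-x(a)\big) + \frac{3}{b-a}\, \Omega^\top R\, \Omega,
\qquad \Omega = x(b) + x(a) - \frac{2}{b-a}\int_a^b x(s)\, ds.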
3. Zhang X, Wang D, Ota K, Dong M, Li H. Exponential Stability of Mixed Time-Delay Neural Networks Based on Switching Approaches. IEEE Transactions on Cybernetics 2022; 52:1125-1137. [PMID: 32396121] [DOI: 10.1109/tcyb.2020.2985777]
Abstract
Neural networks (NNs) have been deeply studied due to their wide applicability. Since time delays are unavoidable in reality, it is fundamental and crucial for all applications based on NNs to guarantee system stability under the influence of mixed time delays. To better exploit the variation information of the time delays, we introduce the switching idea and switching approaches into mixed time-delay NNs to solve the stability problem. First, the considered mixed time-delay NNs are modeled as switched NNs by dividing the two classes of time delays, discrete and distributed, into variable intervals and combining these intervals into new switching modes. With the help of mode-dependent average dwell-time switching, Lyapunov theory, and mathematical techniques, several exponential stability criteria for the modeled switched systems containing different modes are obtained. Moreover, by introducing the mathematical condition of the unstable subsystem in the switched system, a less conservative condition on the exponential stability of the modeled NNs is proposed. Three examples are presented to verify the validity of the proposed methods over existing ones.
4. Lee SH, Park MJ, Ji DH, Kwon OM. Stability and dissipativity criteria for neural networks with time-varying delays via an augmented zero equality approach. Neural Netw 2021; 146:141-150. [PMID: 34856528] [DOI: 10.1016/j.neunet.2021.11.007]
Abstract
This work investigates the stability and dissipativity problems for neural networks (NNs) with time-varying delay. By constructing new augmented Lyapunov-Krasovskii functionals based on integral inequalities and using a zero equality approach, three improved results are proposed in the form of linear matrix inequalities. Then, based on the stability results, the dissipativity of NNs with time-varying delays is analyzed. Through several numerical examples, the superiority and effectiveness of the proposed results are shown by comparison with existing works.
Collapse
Affiliation(s)
- S H Lee: School of Electrical Engineering, Chungbuk National University, Cheongju 28644, Republic of Korea
- M J Park: Center for Global Converging Humanities, Kyung Hee University, Yongin 17104, Republic of Korea
- D H Ji: Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 16678, Republic of Korea
- O M Kwon: School of Electrical Engineering, Chungbuk National University, Cheongju 28644, Republic of Korea
5. Stability analysis of delayed neural networks based on a relaxed delay-product-type Lyapunov functional. Neurocomputing 2021. [DOI: 10.1016/j.neucom.2021.01.098]
6. Tian Y, Wang Z. Stability analysis for delayed neural networks based on the augmented Lyapunov-Krasovskii functional with delay-product-type and multiple integral terms. Neurocomputing 2020. [DOI: 10.1016/j.neucom.2020.05.045]
7. Feng Z, Shao H, Shao L. Further improved stability results for generalized neural networks with time-varying delays. Neurocomputing 2019. [DOI: 10.1016/j.neucom.2019.07.019]
8. Chen J, Park JH, Xu S. Stability analysis of discrete-time neural networks with an interval-like time-varying delay. Neurocomputing 2019. [DOI: 10.1016/j.neucom.2018.10.044]
9. Zhang R, Zeng D, Park JH, Liu Y, Zhong S. A New Approach to Stochastic Stability of Markovian Neural Networks With Generalized Transition Rates. IEEE Transactions on Neural Networks and Learning Systems 2019; 30:499-510. [PMID: 29994722] [DOI: 10.1109/tnnls.2018.2843771]
Abstract
This paper investigates the stability problem of Markovian neural networks (MNNs) with time delay. First, to reflect more realistic behaviors, more generalized transition rates are considered for MNNs, where all transition rates of some jumping modes are completely unknown. Second, a new approach, namely the time-delay-dependent-matrix (TDDM) approach, is proposed for the first time. The TDDM approach is associated with both the time delay and its time derivative. Thus, the TDDM approach can fully capture the information of the time delay and plays a key role in deriving less conservative results. Third, based on the TDDM approach and applying Wirtinger's inequality and the improved reciprocally convex inequality, stability criteria are derived. In comparison with some existing results, the proposed results are not only less conservative but also involve lower computational complexity. Finally, numerical examples are provided to show the effectiveness and advantages of the proposed results.
10. Passivity and stability analysis of neural networks with time-varying delays via extended free-weighting matrices integral inequality. Neural Netw 2018; 106:67-78. [DOI: 10.1016/j.neunet.2018.06.010]
11. Ding S, Wang Z, Zhang H. Event-Triggered Stabilization of Neural Networks With Time-Varying Switching Gains and Input Saturation. IEEE Transactions on Neural Networks and Learning Systems 2018; 29:5045-5056. [PMID: 29994184] [DOI: 10.1109/tnnls.2017.2787642]
Abstract
This paper investigates the event-triggered stabilization of neural networks (NNs) subject to input saturation. The core contribution lies in the design of a novel controller with time-varying switching gains and the associated switching event-triggered condition (ETC). The ETC essentially switches between aperiodic sampling and a continuous event trigger. The control gains of the designed controller are composed of an exponentially decaying term and two gain matrices, which are switched whenever the ETC switches between aperiodic sampling and the continuous event trigger. By employing the generalized sector condition and a switching Lyapunov function, several sufficient conditions that ensure the local exponential stability of the NNs are formulated in terms of linear matrix inequalities (LMIs). Both the exponentially decaying term and the switching gains improve the feasible region of the LMIs and thus help enlarge the set of admissible initial conditions, the threshold in the ETC, and the average waiting time. Together with several optimization problems, two numerical examples are employed to validate the effectiveness of the results.
12. Lee TH, Trinh HM, Park JH. Stability Analysis of Neural Networks With Time-Varying Delay by Constructing Novel Lyapunov Functionals. IEEE Transactions on Neural Networks and Learning Systems 2018; 29:4238-4247. [PMID: 29990087] [DOI: 10.1109/tnnls.2017.2760979]
Abstract
This paper presents two novel Lyapunov functionals for analyzing the stability of neural networks with time-varying delay. Based on the newly proposed Lyapunov functionals and a relaxed Wirtinger-based integral inequality, new stability criteria are derived in the form of linear matrix inequalities. A comprehensive comparison of results is given to illustrate the newly proposed stability criteria from both the conservatism and computational complexity points of view.
13. Li Z, Bai Y, Huang C, Yan H, Mu S. Improved Stability Analysis for Delayed Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 2018; 29:4535-4541. [PMID: 29990171] [DOI: 10.1109/tnnls.2017.2743262]
Abstract
In this brief, the stability of delayed neural networks is investigated by constructing an augmented Lyapunov-Krasovskii functional in a triple integral form. To obtain more accurate bounds for the derivatives of the triple integrals, new double integral inequalities are developed, which include some recently introduced estimation techniques as special cases. The information on the activation function is taken fully into consideration. Taking advantage of the proposed inequalities, stability criteria with less conservatism are derived. The improvement over existing approaches is verified by numerical examples.
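As background on how double integral terms are usually bounded, the Jensen-type double integral inequality (a standard estimate that such refined inequalities include as a special case; sketched here for reference) reads: for a symmetric matrix R > 0, a scalar h > 0, and a differentiable function x,

\int_{-h}^{0}\int_{t+\theta}^{t} \dot{x}^\top(s) R \dot{x}(s)\, ds\, d\theta \;\ge\; \frac{2}{h^{2}}\, \omega^\top R\, \omega,
\qquad \omega = h\, x(t) - \int_{t-h}^{t} x(s)\, ds.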
14. Xu Y, Lu R, Shi P, Tao J, Xie S. Robust Estimation for Neural Networks With Randomly Occurring Distributed Delays and Markovian Jump Coupling. IEEE Transactions on Neural Networks and Learning Systems 2018; 29:845-855. [PMID: 28129186] [DOI: 10.1109/tnnls.2016.2636325]
Abstract
This paper studies the issue of robust state estimation for coupled neural networks with parameter uncertainty and randomly occurring distributed delays, where the polytopic model is employed to describe the parameter uncertainty. A set of Bernoulli processes with different stochastic properties is introduced to model the random occurrences of the distributed delays. Novel state estimators based on the local coupling structure are proposed to make full use of the coupling information. The augmented estimation error system is obtained based on the Kronecker product. A new Lyapunov function, which depends on both the polytopic uncertainty and the coupling information, is introduced to reduce conservatism. Sufficient conditions that guarantee the stochastic stability and performance of the augmented estimation error system are established. Then, the estimator gains are obtained on the basis of these conditions. Finally, a numerical example is used to demonstrate the effectiveness of the results.
15. Wan L, Wu A. Multistability in Mittag-Leffler sense of fractional-order neural networks with piecewise constant arguments. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.01.049]
16. Ding S, Wang Z, Zhang H. Dissipativity Analysis for Stochastic Memristive Neural Networks With Time-Varying Delays: A Discrete-Time Case. IEEE Transactions on Neural Networks and Learning Systems 2018; 29:618-630. [PMID: 28055917] [DOI: 10.1109/tnnls.2016.2631624]
Abstract
In this paper, the dissipativity problem of discrete-time memristive neural networks (DMNNs) with time-varying delays and stochastic perturbation is investigated. A class of logical switched functions is put forward to reflect the memristor-based switching property of the connection weights, and the DMNNs are then recast into a tractable model. Based on this model, the robust analysis method and refined Jensen-based inequalities are applied to establish sufficient conditions that ensure the dissipativity of the DMNNs. Two numerical examples are presented to illustrate the effectiveness of the obtained results.
17. Wang Z, Ding S, Shan Q, Zhang H. Stability of Recurrent Neural Networks With Time-Varying Delay via Flexible Terminal Method. IEEE Transactions on Neural Networks and Learning Systems 2017; 28:2456-2463. [PMID: 27448372] [DOI: 10.1109/tnnls.2016.2578309]
Abstract
This brief is concerned with stability criteria for recurrent neural networks with time-varying delay. First, based on a convex combination technique, a delay interval with fixed terminals is changed into one with flexible terminals, which is called the flexible terminal method (FTM). Second, based on the FTM, a novel Lyapunov-Krasovskii functional is constructed, in which the integral interval associated with the delayed variables is not fixed. Thus, the FTM can achieve the same effect as the delay-partitioning method, although their implementations differ. Guided by the FTM, the Wirtinger-based integral inequality and the free-weighting matrix method are each employed to develop stability criteria. Finally, the feasibility and effectiveness of the proposed results are tested by two numerical examples.
18. Zhang CK, He Y, Jiang L, Wang QG, Wu M. Stability Analysis of Discrete-Time Neural Networks With Time-Varying Delay via an Extended Reciprocally Convex Matrix Inequality. IEEE Transactions on Cybernetics 2017; 47:3040-3049. [PMID: 28222008] [DOI: 10.1109/tcyb.2017.2665683]
Abstract
This paper is concerned with the stability analysis of discrete-time neural networks with a time-varying delay. Assessing the effect of time delays on system stability requires suitable delay-dependent stability criteria. This paper aims to develop new stability criteria that reduce conservatism without much increase in computational burden. An extended reciprocally convex matrix inequality is developed to replace the popular reciprocally convex combination lemma (RCCL). It has the potential to reduce the conservatism of RCCL-based criteria without introducing any extra decision variables, since it yields a smaller estimation gap with the same decision variables. Moreover, a delay-product-type term is introduced for the first time into the Lyapunov function candidate, so that a delay-variation-dependent stability criterion involving the bounds of the delay change rate is established. Finally, the advantages of the proposed criteria are demonstrated through two numerical examples.
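To illustrate the idea of a delay-product-type term (an illustrative sketch of the general construction, not the exact functional used in this paper), for a delay d(k) \in [0, h] and augmented vectors \eta_1(k), \eta_2(k), the Lyapunov function candidate is augmented with terms of the form

V_{dp}(k) = d(k)\, \eta_1^\top(k) P_1 \eta_1(k) + \big(h - d(k)\big)\, \eta_2^\top(k) P_2 \eta_2(k),

whose forward difference depends on both d(k) and d(k+1), which is how the bounds on the delay variation rate enter the resulting stability criterion.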
19. Ding S, Wang Z, Wang J, Zhang H. H∞ state estimation for memristive neural networks with time-varying delays: The discrete-time case. Neural Netw 2016; 84:47-56. [DOI: 10.1016/j.neunet.2016.08.002]