Shan Q, Zhang H, Wang Z, Zhang Z. Global Asymptotic Stability and Stabilization of Neural Networks With General Noise. IEEE Transactions on Neural Networks and Learning Systems. 2018;29:597-607. PMID: 28055925. DOI: 10.1109/tnnls.2016.2637567.
Abstract
In the existing literature, neural networks (NNs) in stochastic environments have been widely modeled as stochastic differential equations driven by white noise, such as the Brownian motion (Wiener) process. However, these are not necessarily the best models for describing the dynamic characteristics of NNs disturbed by nonwhite noise in some specific situations. In this paper, a general noise disturbance, which may be nonwhite, is introduced to NNs. Since NNs with nonwhite noise cannot be described by an Itô integral equation, a novel modeling method for stochastic NNs is adopted. Within a framework based on the random field approach and Lyapunov theory, the global asymptotic stability and stabilization of NNs with general noise are analyzed, in probability and in the mean square, respectively. Criteria for the concerned systems based on linear matrix inequalities (LMIs) are proposed. Examples are given to illustrate the effectiveness of the obtained results.
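To give a flavor of how LMI-based stability criteria of this kind are checked numerically, the sketch below tests a standard Lyapunov/LMI sufficient condition for global asymptotic stability of a deterministic Hopfield-type network dx/dt = -Ax + Wf(x) with Lipschitz-bounded activations. It is not the paper's criterion for NNs with general noise; the matrices A, W, L and the use of the cvxpy SDP solver are illustrative assumptions only.

```python
# Illustrative sketch, NOT the paper's criterion: feasibility of the LMI
#   P = P^T > 0, Q diagonal > 0,
#   [ -P A - A^T P + L Q L   P W ]
#   [        W^T P            -Q ] < 0
# certifies global asymptotic stability of dx/dt = -A x + W f(x)
# when each activation satisfies |f_i(s)| <= L_i |s|.
import numpy as np
import cvxpy as cp

n = 3
A = np.diag([1.5, 2.0, 1.8])                 # made-up positive decay rates
W = np.array([[ 0.2, -0.3,  0.1],
              [ 0.1,  0.2, -0.2],
              [-0.1,  0.3,  0.2]])           # made-up connection weights
L = np.eye(n)                                # Lipschitz bounds of activations

P = cp.Variable((n, n), symmetric=True)      # Lyapunov matrix
q = cp.Variable(n)                           # diagonal entries of Q
Q = cp.diag(q)

# Block LMI assembled with bmat; symmetrized explicitly for the PSD constraint.
M = cp.bmat([[-P @ A - A.T @ P + L @ Q @ L, P @ W],
             [W.T @ P,                      -Q]])
M_sym = 0.5 * (M + M.T)

eps = 1e-6
constraints = [P >> eps * np.eye(n),
               q >= eps,
               M_sym << -eps * np.eye(2 * n)]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve(solver=cp.SCS)

print("LMI feasibility status:", prob.status)
```

If the solver reports an optimal (feasible) status, the quadratic Lyapunov function V(x) = x^T P x certifies stability for this example system; the stochastic criteria in the paper are more involved but are checked in the same feasibility-testing spirit.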