1. Kong LW, Brewer GA, Lai YC. Reservoir-computing based associative memory and itinerancy for complex dynamical attractors. Nat Commun 2024; 15:4840. PMID: 38844437. DOI: 10.1038/s41467-024-49190-4.
Abstract
Traditional neural-network models of associative memory store and retrieve static patterns. We develop reservoir-computing based memories for complex dynamical attractors, under two common recall scenarios in neuropsychology: location-addressable with an index channel and content-addressable without such a channel. We demonstrate that, for location-addressable retrieval, a single reservoir computing machine can memorize a large number of periodic and chaotic attractors, each retrievable with a specific index value. We articulate control strategies to achieve successful switching among the attractors, unveil the mechanism behind failed switching, and uncover various scaling behaviors between the number of stored attractors and the reservoir network size. For content-addressable retrieval, we exploit multistability with cue signals, where the stored attractors coexist in the high-dimensional phase space of the reservoir network. As the length of the cue signal increases through a critical value, a high success rate can be achieved. The work provides foundational insights into developing long-term memories and itinerancy for complex dynamical patterns.
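Most of the entries in this list build on the same basic reservoir-computing loop: a fixed random recurrent network is driven by an input signal, and only a linear readout is trained. A minimal sketch in Python with NumPy; the network size, spectral radius, and ridge parameter below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target signal: a simple sine wave standing in for a stored pattern.
t = np.arange(0, 60, 0.02)
u = np.sin(t)

# Fixed random reservoir; spectral radius scaled below 1 (echo state property).
N = 200
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=N)

# Drive the reservoir with the signal and collect its states.
r = np.zeros(N)
states = []
for u_t in u[:-1]:
    r = np.tanh(W @ r + W_in * u_t)
    states.append(r.copy())

# Discard a washout transient, then fit a linear readout by ridge
# regression to predict the next value of the signal.
wash = 100
R = np.array(states)[wash:]
target = u[1:][wash:]
beta = 1e-6
W_out = np.linalg.solve(R.T @ R + beta * np.eye(N), R.T @ target)

# One-step prediction error of the trained readout.
rmse = np.sqrt(np.mean((R @ W_out - target) ** 2))
print(f"one-step training RMSE: {rmse:.4f}")
```

The location-addressable memory studied in this paper would extend such a machine with an additional index input channel; the sketch only shows the shared training mechanics.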
Affiliation(s)
- Ling-Wei Kong, Department of Computational Biology, Cornell University, Ithaca, New York, USA; School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, Arizona, USA
- Gene A Brewer, Department of Psychology, Arizona State University, Tempe, Arizona, USA
- Ying-Cheng Lai, School of Electrical, Computer and Energy Engineering, Arizona State University, Tempe, Arizona, USA; Department of Physics, Arizona State University, Tempe, Arizona, USA
2. Hart JD. Attractor reconstruction with reservoir computers: The effect of the reservoir's conditional Lyapunov exponents on faithful attractor reconstruction. Chaos 2024; 34:043123. PMID: 38579149. DOI: 10.1063/5.0196257.
Abstract
Reservoir computing is a machine learning framework that has been shown to be able to replicate the chaotic attractor, including the fractal dimension and the entire Lyapunov spectrum, of the dynamical system on which it is trained. We quantitatively relate the generalized synchronization dynamics of a driven reservoir during the training stage to the performance of the trained reservoir computer at the attractor reconstruction task. We show that, in order to obtain successful attractor reconstruction and Lyapunov spectrum estimation, the maximal conditional Lyapunov exponent of the driven reservoir must be significantly more negative than the most negative Lyapunov exponent of the target system. We also find that the maximal conditional Lyapunov exponent of the reservoir depends strongly on the spectral radius of the reservoir adjacency matrix; therefore, for attractor reconstruction and Lyapunov spectrum estimation, small spectral radius reservoir computers perform better in general. Our arguments are supported by numerical examples on well-known chaotic systems.
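The maximal conditional Lyapunov exponent of a driven reservoir can be estimated numerically before any training, for example by tracking the separation of two identically driven reservoir copies. A sketch under illustrative assumptions (the drive signal, network size, and spectral radius are not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

N = 100
W = rng.normal(size=(N, N))
rho = 0.4                       # small spectral radius, as the paper recommends
W *= rho / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, size=N)

# Common drive signal (any bounded input works for this numerical test).
u = np.sin(0.1 * np.arange(2000))

# Two reservoir copies, identical drive, slightly perturbed initial states.
r1 = np.zeros(N)
r2 = r1 + 1e-8 * rng.normal(size=N)
log_d = []
for u_t in u:
    r1 = np.tanh(W @ r1 + W_in * u_t)
    r2 = np.tanh(W @ r2 + W_in * u_t)
    log_d.append(np.log(np.linalg.norm(r1 - r2) + 1e-300))

# The slope of log-distance vs. time estimates the maximal conditional
# Lyapunov exponent; it should be negative for a synchronizing reservoir.
steps = np.arange(200)          # fit over the initial decay only
lam = np.polyfit(steps, np.array(log_d[:200]), 1)[0]
print(f"estimated maximal conditional Lyapunov exponent: {lam:.3f} per step")
```

Repeating this estimate while scanning the spectral radius traces out the dependence the paper describes; faithful reconstruction is expected only when this exponent lies well below the most negative Lyapunov exponent of the target system.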
Affiliation(s)
- Joseph D Hart, U.S. Naval Research Laboratory, Code 5675, Washington, DC 20375, USA
3. Whiteaker B, Gerstoft P. Reducing echo state network size with controllability matrices. Chaos 2022; 32:073116. PMID: 35907714. DOI: 10.1063/5.0071926.
Abstract
Echo state networks are a fast-training variant of recurrent neural networks that excel at approximating nonlinear dynamical systems and at time-series prediction. These machine learning models act as nonlinear fading-memory filters. While they benefit from quick training and low complexity, the computational demands of a large reservoir matrix are a bottleneck. Using control theory, we find a reduced-size replacement reservoir matrix. Starting from a large, task-effective reservoir matrix, we form a controllability matrix whose rank indicates the active sub-manifold and a candidate replacement reservoir size. The resulting speed-ups and reduced memory usage come with minimal error increase in chaotic climate reconstruction and short-term prediction. Experiments are performed on simple time-series signals and on the complex chaotic Lorenz-1963 and Mackey-Glass signals. Examining low-error models reveals how the active rank and memory vary along a sequence of predictions.
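For a linearized reservoir update of the form r_{t+1} = A r_t + B u_t, the controllability matrix described above can be formed directly, and its numerical rank suggests a candidate replacement reservoir size. A sketch with illustrative random matrices rather than the paper's task-effective reservoirs:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in reservoir matrix A (kept small here) and single-channel input B.
N = 50
A = rng.normal(size=(N, N))
A *= 0.8 / max(abs(np.linalg.eigvals(A)))
B = rng.normal(size=(N, 1))

# Controllability matrix C = [B, AB, A^2 B, ..., A^{N-1} B].
blocks = [B]
for _ in range(N - 1):
    blocks.append(A @ blocks[-1])
C = np.hstack(blocks)

# Numerical rank suggests the size of the active sub-manifold and hence
# a candidate size for the replacement reservoir.
rank = np.linalg.matrix_rank(C, tol=1e-10 * np.linalg.norm(C))
print(f"controllability rank: {rank} of {N}")
```

Because powers of a contracting A decay quickly, the numerical rank is typically far below N even when the system is formally fully controllable, which is what makes the reduction worthwhile.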
Affiliation(s)
- Brian Whiteaker, Scripps Institution of Oceanography, University of California at San Diego, La Jolla, California 92093-0238, USA
- Peter Gerstoft, Scripps Institution of Oceanography, University of California at San Diego, La Jolla, California 92093-0238, USA
4. Roy M, Senapati A, Poria S, Mishra A, Hens C. Role of assortativity in predicting burst synchronization using echo state network. Phys Rev E 2022; 105:064205. PMID: 35854538. DOI: 10.1103/physreve.105.064205.
Abstract
In this study, we use a reservoir-computing based echo state network (ESN) to predict the collective burst synchronization of neurons. Specifically, we investigate the ability of the ESN to predict the burst synchronization of an ensemble of Rulkov neurons placed on a scale-free network. We show that a limited number of nodal time series used as input to the machine can capture the real trend of burst synchronization in this network. Further, we investigate the proper selection of nodal inputs for degree-degree (positively and negatively) correlated networks. We show that for a disassortative network, degree-based selection of input nodes plays no significant role in the machine's prediction. However, in the case of an assortative network, training the machine with the information (i.e., time series) of low-degree nodes gives better results in predicting the burst synchronization. The results are found to be consistent with an investigation carried out with a continuous-time Hindmarsh-Rose neuron model. Furthermore, the roles of ESN hyperparameters such as the spectral radius and the leaking parameter in the prediction process are examined. Finally, we explain the underlying mechanism responsible for these differences in prediction in degree-correlated networks.
Affiliation(s)
- Mousumi Roy, Department of Applied Mathematics, University of Calcutta, 92, A.P.C. Road, Kolkata 700009, India
- Abhishek Senapati, Center for Advanced Systems Understanding (CASUS), 02826 Görlitz, Germany
- Swarup Poria, Department of Applied Mathematics, University of Calcutta, 92, A.P.C. Road, Kolkata 700009, India
- Arindam Mishra, Division of Dynamics, Lodz University of Technology, Stefanowskiego 1/15, 90924 Lodz, Poland
- Chittaranjan Hens, Physics and Applied Mathematics Unit, Indian Statistical Institute, Kolkata 700108, India
5. Platt JA, Wong A, Clark R, Penny SG, Abarbanel HDI. Robust forecasting using predictive generalized synchronization in reservoir computing. Chaos 2021; 31:123118. PMID: 34972341. DOI: 10.1063/5.0066013.
Abstract
Reservoir computers (RCs) are a class of recurrent neural networks (RNNs) that can be used for forecasting the future of observed time series data. As with all RNNs, selecting the hyperparameters in the network to yield excellent forecasting presents a challenge when training on new inputs. We analyze a method based on predictive generalized synchronization (PGS) that gives direction in designing and evaluating the architecture and hyperparameters of an RC. To determine the occurrences of PGS, we rely on the auxiliary method to provide a computationally efficient pre-training test that guides hyperparameter selection. We provide a metric for evaluating the RC using the reproduction of the input system's Lyapunov exponents that demonstrates robustness in prediction.
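The auxiliary-method pre-training test the authors rely on can be run without fitting any readout: drive two replicas of the candidate reservoir with the same signal from different initial states and check whether their states converge. A minimal sketch (the Lorenz parameters are the standard ones; the reservoir size and input scaling are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Drive signal: x-component of the standard Lorenz system (Euler steps).
def lorenz_x(n, dt=0.01):
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty(n)
    for i in range(n):
        dx = 10.0 * (y - x)
        dy = x * (28.0 - z) - y
        dz = x * y - (8.0 / 3.0) * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out[i] = x
    return out

u = lorenz_x(3000)

def replica_distance(rho):
    """Auxiliary test: identical reservoirs, identical drive, different
    initial states; convergence indicates generalized synchronization."""
    N = 100
    W = rng.normal(size=(N, N))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.1, 0.1, size=N)
    r1 = rng.normal(size=N)
    r2 = rng.normal(size=N)
    for u_t in u:
        r1 = np.tanh(W @ r1 + W_in * u_t)
        r2 = np.tanh(W @ r2 + W_in * u_t)
    return np.linalg.norm(r1 - r2)

d = replica_distance(0.8)
print(f"final replica distance at spectral radius 0.8: {d:.2e}")
```

Scanning the spectral radius upward locates the threshold at which the test fails, which is the kind of computationally cheap hyperparameter guidance the paper formalizes.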
Affiliation(s)
- Jason A Platt, Department of Physics, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093, USA
- Adrian Wong, Department of Physics, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093, USA
- Randall Clark, Department of Physics, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093, USA
- Stephen G Penny, Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder, Boulder, Colorado 80305-3328, USA
- Henry D I Abarbanel, Department of Physics, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093, USA
6. Ray A, Chakraborty T, Ghosh D. Optimized ensemble deep learning framework for scalable forecasting of dynamics containing extreme events. Chaos 2021; 31:111105. PMID: 34881612. DOI: 10.1063/5.0074213.
Abstract
The remarkable flexibility and adaptability of both deep learning models and ensemble methods have led to the proliferation of their applications in understanding many physical phenomena. Traditionally, these two techniques have largely been treated as independent methodologies in practical applications. This study develops an optimized ensemble deep learning framework wherein the two machine learning techniques are jointly used to achieve synergistic improvements in model accuracy, stability, scalability, and reproducibility, prompting a new wave of applications in the forecasting of dynamics. Unpredictability is considered one of the key features of chaotic dynamics; forecasting such dynamics of nonlinear systems is therefore a relevant issue for the scientific community. It becomes more challenging when the focus is the prediction of extreme events. In this circumstance, the proposed optimized ensemble deep learning (OEDL) model, based on the best convex combination of feed-forward neural networks, reservoir computing, and long short-term memory, can play a key role in advancing predictions of dynamics containing extreme events. The combined framework generates better out-of-sample performance than the individual deep learners and a standard ensemble framework for both numerically simulated and real-world data sets. We exhibit the outstanding performance of the OEDL framework in forecasting extreme events generated from a Liénard-type system and in predicting COVID-19 cases in Brazil, dengue cases in San Juan, and sea surface temperature in the Niño 3.4 region.
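The convex-combination step can be illustrated with stand-in forecasts: given validation-set predictions from several base learners, search the simplex of nonnegative weights summing to one for the lowest validation error. All three "models" below are synthetic stand-ins, not the paper's learners:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical validation-set forecasts from three base learners
# (stand-ins for the feed-forward, reservoir, and LSTM members).
t = np.linspace(0, 10, 500)
truth = np.sin(t) + 0.3 * np.sin(3.1 * t)
preds = np.stack([
    truth + 0.20 * rng.normal(size=t.size),   # model 1: unbiased but noisy
    truth * 0.9,                              # model 2: slightly damped
    np.sin(t),                                # model 3: misses one component
])

# Grid-search the 2-simplex for the best convex combination of the three.
best_w, best_mse = None, np.inf
grid = np.linspace(0, 1, 51)
for w1 in grid:
    for w2 in grid:
        if w1 + w2 > 1:
            continue
        w = np.array([w1, w2, 1 - w1 - w2])
        mse = np.mean((w @ preds - truth) ** 2)
        if mse < best_mse:
            best_w, best_mse = w, mse

mse_single = [np.mean((p - truth) ** 2) for p in preds]
print(f"weights {np.round(best_w, 2)}: ensemble MSE {best_mse:.4f}, "
      f"best single MSE {min(mse_single):.4f}")
```

Because each pure weight vector lies on the grid, the selected combination can never be worse on the validation set than the best single learner; the gain comes from averaging out complementary errors.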
Affiliation(s)
- Arnob Ray, Physics and Applied Mathematics Unit, Indian Statistical Institute, Kolkata 700108, India
- Tanujit Chakraborty, Department of Science and Engineering, Sorbonne University Abu Dhabi, Abu Dhabi, UAE
- Dibakar Ghosh, Physics and Applied Mathematics Unit, Indian Statistical Institute, Kolkata 700108, India
7. Ghosh S, Senapati A, Mishra A, Chattopadhyay J, Dana SK, Hens C, Ghosh D. Reservoir computing on epidemic spreading: A case study on COVID-19 cases. Phys Rev E 2021; 104:014308. PMID: 34412296. DOI: 10.1103/physreve.104.014308.
Abstract
A reservoir-computing based echo state network (ESN) is used here to predict the spread of a disease. The current infection trends of a disease in some targeted locations are efficiently captured by the ESN when it is fed with the infection data of other locations. The performance of the ESN is first tested with synthetic data generated by numerical simulations of independent, uncoupled patches, each governed by the classical susceptible-infected-recovered model for a choice of distributed infection parameters. From a large pool of synthetic data, the ESN predicts the current trend of infection in 5% of the patches by exploiting the uncorrelated infection trends of the other 95%. The prediction remains consistent for most of the patches for approximately 4 to 5 weeks. The machine's performance is further tested with real data on the current COVID-19 pandemic collected for different countries. We show that our proposed scheme is able to predict the trend of the disease for up to 3 weeks for some targeted locations. An important point is that no detailed information on the epidemiological rate parameters is needed; the success of the machine rather depends on the history of the disease progression represented by the time-evolving data sets of a large number of locations. Finally, we apply a modified version of our proposed scheme for future forecasting.
Affiliation(s)
- Subrata Ghosh, Physics and Applied Mathematics Unit, Indian Statistical Institute, 203 B. T. Road, Kolkata 700108, India
- Abhishek Senapati, Agricultural and Ecological Research Unit, Indian Statistical Institute, 203 B. T. Road, Kolkata 700108, India; Center for Advanced Systems Understanding (CASUS), Goerlitz, Germany
- Arindam Mishra, Department of Mathematics, Jadavpur University, Kolkata 700032, India
- Joydev Chattopadhyay, Agricultural and Ecological Research Unit, Indian Statistical Institute, 203 B. T. Road, Kolkata 700108, India
- Syamal K Dana, Department of Mathematics, Jadavpur University, Kolkata 700032, India
- Chittaranjan Hens, Physics and Applied Mathematics Unit, Indian Statistical Institute, 203 B. T. Road, Kolkata 700108, India
- Dibakar Ghosh, Physics and Applied Mathematics Unit, Indian Statistical Institute, 203 B. T. Road, Kolkata 700108, India
8. Verzelli P, Alippi C, Livi L. Learn to synchronize, synchronize to learn. Chaos 2021; 31:083119. PMID: 34470256. DOI: 10.1063/5.0056425.
Abstract
In recent years, the artificial intelligence community has seen continuous interest in research investigating the dynamical aspects of both training procedures and machine learning models. Among recurrent neural networks, the Reservoir Computing (RC) paradigm, characterized by conceptual simplicity and a fast training scheme, is of particular interest. Yet, the guiding principles under which RC operates are only partially understood. In this work, we analyze the role played by Generalized Synchronization (GS) when training an RC to solve a generic task. In particular, we show how GS allows the reservoir to correctly encode the system generating the input signal into its dynamics. We also discuss necessary and sufficient conditions for learning to be feasible in this approach. Moreover, we explore the role that ergodicity plays in this process, showing how its presence allows the learning outcome to apply to multiple input trajectories. Finally, we show that satisfaction of GS can be measured by means of the mutual false nearest neighbors index, which makes the theoretical derivations effective for practitioners.
Affiliation(s)
- Pietro Verzelli, Faculty of Informatics, Università della Svizzera Italiana, Lugano 69000, Switzerland
- Cesare Alippi, Faculty of Informatics, Università della Svizzera Italiana, Lugano 69000, Switzerland
- Lorenzo Livi, Department of Computer Science and Mathematics, University of Manitoba, Winnipeg, Manitoba R3T 2N2, Canada
9. Grigoryeva L, Hart A, Ortega JP. Chaos on compact manifolds: Differentiable synchronizations beyond the Takens theorem. Phys Rev E 2021; 103:062204. PMID: 34271749. DOI: 10.1103/physreve.103.062204.
Abstract
This paper shows that a large class of fading memory state-space systems driven by discrete-time observations of dynamical systems defined on compact manifolds always yields continuously differentiable synchronizations. This general result provides a powerful tool for the representation, reconstruction, and forecasting of chaotic attractors. It also improves previous statements in the literature for differentiable generalized synchronizations, whose existence was so far guaranteed for a restricted family of systems and was detected using Hölder exponent-based criteria.
Affiliation(s)
- Lyudmila Grigoryeva, Department of Mathematics and Statistics, Universität Konstanz, Box 146, D-78457 Konstanz, Germany
- Allen Hart, Department of Mathematical Sciences, University of Bath, Bath BA2 7AY, United Kingdom
- Juan-Pablo Ortega, Division of Mathematical Sciences, Nanyang Technological University, 637371 Singapore
10. Carroll TL. Optimizing Reservoir Computers for Signal Classification. Front Physiol 2021; 12:685121. PMID: 34220549. PMCID: PMC8249854. DOI: 10.3389/fphys.2021.685121.
Abstract
Reservoir computers are a type of recurrent neural network for which the network connections are not changed. To train the reservoir computer, a set of output signals from the network are fit to a training signal by a linear fit. As a result, training of a reservoir computer is fast, and reservoir computers may be built from analog hardware, resulting in high speed and low power consumption. To get the best performance from a reservoir computer, the hyperparameters of the reservoir computer must be optimized. In signal classification problems, parameter optimization may be computationally difficult; it is necessary to compare many realizations of the test signals to get good statistics on the classification probability. In this work, it is shown in both a spiking reservoir computer and a reservoir computer using continuous variables that the optimum classification performance occurs for the hyperparameters that maximize the entropy of the reservoir computer. Optimizing for entropy only requires a single realization of each signal to be classified, making the process much faster to compute.
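One way to make the entropy criterion concrete is to measure the entropy of the reservoir's covariance eigenvalue spectrum, which is larger when the response spreads over more independent directions. This is a hedged sketch of that idea, not Carroll's exact estimator, and the drive signal and parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def reservoir_states(rho, u, N=100):
    W = rng.normal(size=(N, N))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-1, 1, size=N)
    r = np.zeros(N)
    states = []
    for u_t in u:
        r = np.tanh(W @ r + W_in * u_t)
        states.append(r.copy())
    return np.array(states[100:])            # drop the washout transient

def state_entropy(R):
    """Shannon entropy of the normalized covariance eigenvalue spectrum."""
    lam = np.clip(np.linalg.eigvalsh(np.cov(R.T)), 1e-15, None)
    p = lam / lam.sum()
    return float(-(p * np.log(p)).sum())

u = np.sin(0.1 * np.arange(2000))
H = {rho: state_entropy(reservoir_states(rho, u)) for rho in (0.1, 0.9)}
for rho, h in H.items():
    print(f"spectral radius {rho}: spectrum entropy {h:.2f}")
```

Sweeping hyperparameters to maximize such an entropy needs only a single realization of each signal, which is the computational advantage the paper emphasizes over direct classification-probability estimates.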
11. Carroll TL. Low dimensional manifolds in reservoir computers. Chaos 2021; 31:043113. PMID: 34251231. DOI: 10.1063/5.0047006.
Abstract
A reservoir computer is a complex dynamical system, often created by coupling nonlinear nodes in a network. The nodes are all driven by a common driving signal. Reservoir computers can contain hundreds to thousands of nodes, resulting in a high dimensional dynamical system, but the reservoir computer variables evolve on a lower dimensional manifold in this high dimensional space. This paper describes how this manifold dimension depends on the parameters of the reservoir computer, and how the manifold dimension is related to the performance of the reservoir computer at a signal estimation task. It is demonstrated that increasing the coupling between nodes while controlling the largest Lyapunov exponent of the reservoir computer can optimize the reservoir computer performance. It is also noted that the sparsity of the reservoir computer network does not have any influence on performance.
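A crude proxy for the manifold dimension discussed here is the number of principal components needed to capture most of the state variance. A sketch under illustrative assumptions (sinusoidal drive, random reservoir; the 99% variance cutoff is an arbitrary choice, not the paper's dimension estimator):

```python
import numpy as np

rng = np.random.default_rng(7)

def manifold_dim(rho, keep=0.99, N=200, T=2000):
    """Number of principal components capturing `keep` of the variance
    of the reservoir trajectory: a rough active-dimension proxy."""
    W = rng.normal(size=(N, N))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-1, 1, size=N)
    u = np.sin(0.1 * np.arange(T))           # common drive signal
    r = np.zeros(N)
    states = []
    for u_t in u:
        r = np.tanh(W @ r + W_in * u_t)
        states.append(r.copy())
    R = np.array(states[200:])               # drop the washout transient
    lam = np.sort(np.linalg.eigvalsh(np.cov(R.T)))[::-1]
    frac = np.cumsum(lam) / lam.sum()
    return int(np.searchsorted(frac, keep) + 1)

for rho in (0.2, 0.95):
    print(f"spectral radius {rho}: ~{manifold_dim(rho)} active dimensions of 200")
```

Comparing this count across coupling strengths, while monitoring the largest Lyapunov exponent, is the kind of analysis the paper carries out with its own dimension measures.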
Affiliation(s)
- T L Carroll, U.S. Naval Research Lab, Washington, DC 20375, USA
12. Guo Y, Zhang H, Wang L, Fan H, Xiao J, Wang X. Transfer learning of chaotic systems. Chaos 2021; 31:011104. PMID: 33754764. DOI: 10.1063/5.0033870.
Abstract
Can a neural network trained on the time series of system A be used to predict the evolution of system B? This problem, known as transfer learning in a broad sense, is of great importance in machine learning and data mining yet has not been addressed for chaotic systems. Here, we investigate transfer learning of chaotic systems from the perspective of synchronization-based state inference, in which a reservoir computer trained on chaotic system A is used to infer the unmeasured variables of chaotic system B, while A differs from B in either parameters or dynamics. It is found that if systems A and B differ in parameters, the reservoir computer can be well synchronized to system B. However, if systems A and B differ in dynamics, the reservoir computer generally fails to synchronize with system B. Knowledge transfer along a chain of coupled reservoir computers is also studied, and it is found that, although the reservoir computers are trained on different systems, the unmeasured variables of the driving system can be successfully inferred by the remote reservoir computer. Finally, in an experiment with a chaotic pendulum, we demonstrate that the knowledge learned from the modeling system can be transferred and used to predict the evolution of the experimental system.
Affiliation(s)
- Yali Guo, School of Physics and Information Technology, Shaanxi Normal University, Xi'an 710062, China
- Han Zhang, School of Physics and Information Technology, Shaanxi Normal University, Xi'an 710062, China
- Liang Wang, School of Physics and Information Technology, Shaanxi Normal University, Xi'an 710062, China
- Huawei Fan, School of Physics and Information Technology, Shaanxi Normal University, Xi'an 710062, China
- Jinghua Xiao, School of Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
- Xingang Wang, School of Physics and Information Technology, Shaanxi Normal University, Xi'an 710062, China
13. Carroll TL. Do reservoir computers work best at the edge of chaos? Chaos 2020; 30:121109. PMID: 33380041. DOI: 10.1063/5.0038163.
Abstract
It has been demonstrated that cellular automata had the highest computational capacity at the edge of chaos [N. H. Packard, in Dynamic Patterns in Complex Systems, edited by J. A. S. Kelso, A. J. Mandell, and M. F. Shlesinger (World Scientific, Singapore, 1988), pp. 293-301; C. G. Langton, Physica D 42(1), 12-37 (1990); J. P. Crutchfield and K. Young, in Complexity, Entropy, and the Physics of Information, edited by W. H. Zurek (Addison-Wesley, Redwood City, CA, 1990), pp. 223-269], the parameter regime at which their behavior transitioned from ordered to chaotic. The same concept has been applied to reservoir computers: a number of researchers have stated that the highest computational capacity for a reservoir computer is at the edge of chaos, although others have suggested that this rule is not universally true. Because many reservoir computers do not show chaotic behavior but merely become unstable, a more accurate term for this instability transition is the "edge of stability." Here, I find two examples where the computational capacity of a reservoir computer decreases as the edge of stability is approached: in one case because generalized synchronization breaks down, and in the other because the reservoir computer is a poor match to the problem being solved. The edge of stability is therefore not in general an optimal operating point for a reservoir computer, although it may be in some cases.
Affiliation(s)
- T L Carroll, U.S. Naval Research Lab, Washington, DC 20375, USA
14. Tang Y, Kurths J, Lin W, Ott E, Kocarev L. Introduction to Focus Issue: When machine learning meets complex systems: Networks, chaos, and nonlinear dynamics. Chaos 2020; 30:063151. PMID: 32611112. DOI: 10.1063/5.0016505.
Affiliation(s)
- Yang Tang, Key Laboratory of Advanced Control and Optimization for Chemical Processes, Ministry of Education, East China University of Science and Technology, Shanghai, China
- Jürgen Kurths, Potsdam Institute for Climate Impact Research, Potsdam 14473, Germany
- Wei Lin, Center for Computational Systems Biology of ISTBI and Research Institute of Intelligent Complex Systems, Fudan University, Shanghai 200433, China
- Edward Ott, Department of Physics, University of Maryland, College Park, Maryland 20742, USA
- Ljupco Kocarev, Macedonian Academy of Sciences and Arts, 1000 Skopje, Macedonia