Ferraz MSA, Kihara AH. Beyond randomness: Evaluating measures of information entropy in binary series. Phys Rev E 2022;105:044101. [PMID: 35590660 DOI: 10.1103/physreve.105.044101]
[Received: 08/30/2021] [Accepted: 03/09/2022]
Abstract
The enormous amount of currently available data demands efforts to extract meaningful information. For this purpose, different measures are applied, including Shannon entropy, permutation entropy, and Lempel-Ziv complexity. These methods have been used in many applications, such as pattern recognition and series classification, across several fields (e.g., physics, finance, and biomedicine). Data in these applications are often presented as binary series with temporal correlations. Herein, we compare measures of information entropy in binary series conveying short- and long-range temporal correlations characterized by the Hurst exponent H. Combining numerical and analytical approaches, we scrutinize different methods and show that they are not efficient at detecting temporal correlations. To surpass this limitation, we propose a measure called the binary permutation index (BPI). We demonstrate that BPI efficiently discriminates patterns embedded in the series, offering advantages over previous methods. Subsequently, we collect stock market time series and rain precipitation data, and perform in vivo electrophysiological recordings in the hippocampus of an experimental animal model of temporal lobe epilepsy, demonstrating the application of BPI to both publicly available and experimental data. The proposed index evaluates information entropy, allowing randomness to be discriminated and meaningful information to be extracted from binary time series.
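As a minimal illustration of two of the baseline measures named above (this is not the authors' BPI, which is defined only in the full paper), Shannon entropy and a Lempel-Ziv phrase count can be sketched for a binary series as follows; the LZ variant here is an LZ78-style incremental parsing, a common simplification of the LZ76 complexity:

```python
import math
from collections import Counter

def shannon_entropy(bits):
    """Shannon entropy in bits per symbol of a binary sequence."""
    n = len(bits)
    counts = Counter(bits)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def lz_phrase_count(bits):
    """Number of phrases in an LZ78-style incremental parsing.

    Each new phrase is the shortest prefix of the remaining sequence
    that has not appeared as a phrase before; a trailing repeated
    phrase is counted once.
    """
    s = "".join(str(b) for b in bits)
    phrases, current = set(), ""
    for ch in s:
        current += ch
        if current not in phrases:
            phrases.add(current)
            current = ""
    return len(phrases) + (1 if current else 0)

# A perfectly alternating series has maximal Shannon entropy (1 bit/symbol)
# yet low LZ complexity -- entropy alone misses the temporal structure,
# which is the limitation the paper's BPI is designed to address.
periodic = [0, 1] * 8
print(shannon_entropy(periodic))   # 1.0
print(lz_phrase_count(periodic))   # few phrases: highly compressible
```

Note that the periodic series illustrates why single-symbol Shannon entropy cannot detect temporal correlations: its value depends only on the symbol frequencies, not their order.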