Ma Z, Lu X, Xie J, Yang Z, Xue JH, Tan ZH, Xiao B, Guo J. On the Comparisons of Decorrelation Approaches for Non-Gaussian Neutral Vector Variables. IEEE Transactions on Neural Networks and Learning Systems 2023;34:1823-1837. [PMID: 32248126 DOI: 10.1109/TNNLS.2020.2978858]
Abstract
As a typical non-Gaussian vector variable, a neutral vector variable contains only nonnegative elements, and its l1-norm equals one. Moreover, its neutrality properties make it significantly different from commonly studied vector variables (e.g., Gaussian vector variables). Because of these properties, conventional linear transformation approaches, such as principal component analysis (PCA) and independent component analysis (ICA), are unsuitable for neutral vector variables: PCA cannot transform a neutral vector variable, whose elements are highly negatively correlated, into a set of mutually independent scalar variables, and ICA cannot preserve the bounded property after transformation. In recent work, we proposed an efficient nonlinear transformation approach, the parallel nonlinear transformation (PNT), for decorrelating neutral vector variables. In this article, we extensively compare PNT with PCA and ICA through both theoretical analysis and experimental evaluations. The results demonstrate the superiority of PNT for decorrelating neutral vector variables.
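The defining properties stated in the abstract (nonnegative elements, unit l1-norm, and the resulting negative correlation between elements) can be illustrated with Dirichlet-distributed samples, a standard model for neutral vectors. This is an illustrative sketch, not code from the paper:

```python
import numpy as np

# A neutral vector variable can be modeled with Dirichlet samples:
# each sample has nonnegative elements whose l1-norm equals one.
# (alpha and sample size are arbitrary choices for illustration.)
rng = np.random.default_rng(0)
samples = rng.dirichlet(alpha=[2.0, 2.0, 2.0], size=10_000)

# Nonnegativity and unit l1-norm hold for every sample.
assert np.all(samples >= 0)
assert np.allclose(samples.sum(axis=1), 1.0)

# The unit-sum constraint forces negative correlation: every
# off-diagonal entry of the empirical covariance is negative,
# which is why a linear transform such as PCA cannot yield
# mutually independent components here.
cov = np.cov(samples, rowvar=False)
off_diag = cov[~np.eye(3, dtype=bool)]
print(np.all(off_diag < 0))
```

For a Dirichlet distribution the population covariance between distinct elements is strictly negative, so with this many samples the empirical off-diagonal entries come out negative as well.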