1. Maggu J, Majumdar A. Kernelized transformed subspace clustering with geometric weights for non-linear manifolds. Neurocomputing 2022. DOI: 10.1016/j.neucom.2022.11.077.
2. Zhang Y, Huang Q, Zhang B, He S, Dan T, Peng H, Cai H. Deep Multiview Clustering via Iteratively Self-Supervised Universal and Specific Space Learning. IEEE Transactions on Cybernetics 2022; 52:11734-11746. PMID: 34191743. DOI: 10.1109/tcyb.2021.3086153.
Abstract
Multiview clustering seeks to partition objects by leveraging cross-view relations to obtain a comprehensive description of the same objects. Most existing methods assume that different views are linearly transformable or are merely sampled from a common latent space. Such rigid assumptions rarely hold in practice, leading to unsatisfactory performance. To tackle this issue, we propose learning both a common and a specific sampling space for each view to fully exploit their collaborative representations. The common space is the universal self-representation basis shared by all views, while the specific spaces are the corresponding view-specific bases. An iterative self-supervision scheme strengthens the learned affinity matrix. Clustering is modeled as a convex optimization problem: we first solve its linear formulation with a standard scheme, then employ a deep autoencoder structure to exploit its deep nonlinear formulation. Extensive experiments on six real-world datasets demonstrate that the proposed model consistently outperforms the benchmark methods.
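The self-representation idea underlying this line of work can be illustrated with a minimal single-view sketch (not the paper's universal/specific model): each sample is expressed as a combination of the others via a ridge-regularized self-expression, and the coefficients are symmetrized into an affinity matrix for spectral clustering. The function name and the plain Frobenius regularizer are assumptions for illustration.

```python
import numpy as np

def self_expressive_affinity(X, lam=0.1):
    """Minimal linear self-expression sketch (illustrative, not the paper's model).

    Solves min_C ||X - X C||_F^2 + lam ||C||_F^2 in closed form, zeroes the
    diagonal to avoid trivial self-loops, and symmetrizes |C| into an affinity.
    X : (d, n) data matrix whose columns are samples.
    """
    n = X.shape[1]
    G = X.T @ X
    C = np.linalg.solve(G + lam * np.eye(n), G)  # ridge closed-form solution
    np.fill_diagonal(C, 0.0)                     # a sample should not explain itself
    return 0.5 * (np.abs(C) + np.abs(C).T)       # symmetric non-negative affinity
```

The resulting affinity matrix would typically be fed to spectral clustering; deep variants replace the linear map `X C` with an autoencoder's latent representation.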
3. Xiao Q, Du S, Yu Y, Huang Y, Song J. Hyper-Laplacian regularized multi-view subspace clustering with jointing representation learning and weighted tensor nuclear norm constraint. Journal of Intelligent & Fuzzy Systems 2022. DOI: 10.3233/jifs-212316.
Abstract
In recent years, the tensor nuclear norm based on tensor singular value decomposition (t-SVD) has achieved remarkable progress in multi-view subspace clustering. However, most existing clustering methods still have two shortcomings: (a) treating all singular values equally is not meaningful in practical applications, and (b) they often ignore that real-world data samples usually lie in multiple nonlinear subspaces. To address these shortcomings, we propose a hyper-Laplacian regularized multi-view subspace clustering model that jointly performs representation learning under a weighted tensor nuclear norm constraint, named JWHMSC. Specifically, in the JWHMSC model, the subspace representation matrices of all views are first stacked into a low-rank constrained tensor to capture the global structure across views. Second, hyper-Laplacian graph regularization preserves the local geometric structure embedded in the high-dimensional ambient space. Third, to exploit prior information on the singular values, the weighted tensor nuclear norm (WTNN) based on t-SVD is introduced to treat singular values differently, allowing JWHMSC to capture the sample distribution more accurately. Finally, representation learning, the WTNN constraint, and hyper-Laplacian graph regularization are integrated into a single framework to obtain the overall optimal solution. Experimental results on eight benchmark datasets show that the proposed JWHMSC performs favorably against state-of-the-art multi-view clustering methods.
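The weighted tensor nuclear norm step can be sketched as a t-SVD-based proximal operator: transform the tensor along its third mode with an FFT, soft-threshold each frontal slice's singular values with value-dependent weights, and transform back. The function name and the common reweighting heuristic w_i = 1/(sigma_i + eps) are assumptions here, not the exact weighting of the paper.

```python
import numpy as np

def weighted_tnn_prox(T, tau, eps=1e-6):
    """Proximal step for a weighted tensor nuclear norm built on t-SVD (sketch).

    T   : (n1, n2, n3) real tensor, e.g. stacked view-wise representation matrices
    tau : base shrinkage strength
    Uses the common reweighting w_i = 1 / (sigma_i + eps), so large singular
    values (strong structure) are shrunk less than small ones (noise).
    """
    n1, n2, n3 = T.shape
    F = np.fft.fft(T, axis=2)          # t-SVD operates slice-wise in the Fourier domain
    out = np.zeros_like(F)
    for k in range(n3):                # SVD of each frontal slice
        U, s, Vh = np.linalg.svd(F[:, :, k], full_matrices=False)
        w = 1.0 / (s + eps)            # smaller singular values get larger weights
        s_shrunk = np.maximum(s - tau * w, 0.0)
        out[:, :, k] = (U * s_shrunk) @ Vh
    return np.real(np.fft.ifft(out, axis=2))
```

In a full algorithm this operator would appear inside an ADMM loop alternating with the representation-learning and hyper-Laplacian regularization updates.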
Affiliation(s)
- Qingjiang Xiao: Key Laboratory of China’s Ethnic Languages and Information Technology of Ministry of Education, Chinese National Information Technology Research Institute, Northwest Minzu University, Lanzhou, Gansu, China
- Shiqiang Du: Key Laboratory of China’s Ethnic Languages and Information Technology of Ministry of Education, Chinese National Information Technology Research Institute, Northwest Minzu University, Lanzhou, Gansu, China; College of Mathematics and Computer Science, Northwest Minzu University, Lanzhou, Gansu, China
- Yao Yu: College of Mathematics and Computer Science, Northwest Minzu University, Lanzhou, Gansu, China
- Yixuan Huang: College of Mathematics and Computer Science, Northwest Minzu University, Lanzhou, Gansu, China
- Jinmei Song: Key Laboratory of China’s Ethnic Languages and Information Technology of Ministry of Education, Chinese National Information Technology Research Institute, Northwest Minzu University, Lanzhou, Gansu, China
4. Wang T, Wu J, Zhang Z, Zhou W, Chen G, Liu S. Multi-scale graph attention subspace clustering network. Neurocomputing 2021. DOI: 10.1016/j.neucom.2021.06.058.