1
Wang Y, Cengiz K. Implementation of the Spark technique in a matrix distributed computing algorithm. Journal of Intelligent Systems 2022. [DOI: 10.1515/jisys-2022-0051] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/15/2022]
Abstract
This work analyzes Spark engine performance strategies for implementing the Spark technique in a matrix distributed computing algorithm, using sparse matrix multiplication as the operational test model. The dimensions of the two input sparse matrices were fixed at 30,000 × 30,000, and the density of the input matrices was varied. The experimental results show that when the density reaches about 0.3, dense matrix multiplication begins to outperform sparse-sparse matrix multiplication, which is basically consistent with the relationship observed in the single-machine sparse matrix test between the sparse matrix multiplication implementation and the computational performance of the local native library. When the density of the sparse matrix is fixed at 0.01, distributed dense-sparse matrix multiplication outperforms multiplication at the same sparsity using dense matrix storage, and the speedup grows from 1.88× to 5.71× as the matrix dimension increases. The overall performance of distributed operations is thereby improved.
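For context, the following is a minimal PySpark sketch, not the authors' implementation, of the kind of distributed sparse matrix multiplication experiment the abstract describes. The dimensions and density are scaled down from the paper's 30,000 × 30,000 setting, the 1,024 × 1,024 block size is an arbitrary choice, and only the standard pyspark.mllib.linalg.distributed API is used.

```python
import random

from pyspark.sql import SparkSession
from pyspark.mllib.linalg.distributed import CoordinateMatrix, MatrixEntry

spark = SparkSession.builder.appName("sparse-matmul-sketch").getOrCreate()
sc = spark.sparkContext

n, density = 3_000, 0.01  # scaled down from the paper's 30,000 x 30,000

def random_entries(n, density, seed):
    """Random sparse entries; occasional duplicate (i, j) pairs are
    acceptable for a timing sketch."""
    rng = random.Random(seed)
    nnz = int(n * n * density)
    return [MatrixEntry(rng.randrange(n), rng.randrange(n), rng.random())
            for _ in range(nnz)]

A = CoordinateMatrix(sc.parallelize(random_entries(n, density, seed=1)), n, n)
B = CoordinateMatrix(sc.parallelize(random_entries(n, density, seed=2)), n, n)

# Distributed multiply via BlockMatrix; the block size is a tunable choice.
product = A.toBlockMatrix(1024, 1024).multiply(B.toBlockMatrix(1024, 1024))
product.blocks.count()  # force evaluation so the multiply can be timed
```

Timing this sparse route against the same data materialized in dense storage, while sweeping the density parameter, reproduces the shape of the comparison the abstract reports.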
Affiliation(s)
- Ying Wang
- Department of Information Engineering, Tianjin Maritime College, Tianjin 300350, China
- Korhan Cengiz
- College of Information Technology, University of Fujairah, Fujairah, United Arab Emirates
- Department of Electrical-Electronics Engineering, Trakya University, 22030 Edirne, Turkey
2
A Clustering Ensemble Framework with Integration of Data Characteristics and Structure Information: A Graph Neural Networks Approach. Mathematics 2022. [DOI: 10.3390/math10111834] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Indexed: 02/04/2023]
Abstract
Clustering ensemble, a research hotspot in data mining, aggregates several base clustering results to generate a single output clustering with improved robustness and stability. However, the validity of the ensemble result is usually affected by unreliability in the generation and integration of the base clusterings. To address this issue, we develop a clustering ensemble framework, viewed from the perspective of graph neural networks, that generates an ensemble result by integrating data characteristics and structure information. In this framework, we extract structure information from the base clustering results of the data set by using a coupling affinity measure. After that, we combine the structure information with data characteristics by using a graph neural network (GNN) to learn their joint embeddings in latent space. Then, we employ a Gaussian mixture model (GMM) to predict the final cluster assignment in the latent space. Finally, we construct the GNN and GMM as a unified optimization model to integrate the objectives of graph embedding and consensus clustering. Our framework not only elegantly combines information from feature space and structure space, but also achieves representations suitable for the final cluster partitioning, and thus produces strong results. Experimental results on six synthetic benchmark data sets and six real-world data sets show that the proposed framework outperforms 12 reference algorithms developed on either a clustering ensemble architecture or a deep clustering strategy.
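As a rough illustration of the pipeline the abstract outlines, here is a minimal Python sketch: base clusterings are aggregated into a co-association affinity matrix (a plain stand-in for the paper's coupling affinity measure), spectral embedding stands in for the GNN's joint embedding, and a GMM assigns the consensus clusters in the latent space. The data set, cluster counts, and embedding dimension are illustrative choices, not the paper's settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.manifold import SpectralEmbedding
from sklearn.mixture import GaussianMixture

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# Ensemble generation: base clusterings with varied k.
base_labels = [KMeans(n_clusters=k, n_init=5, random_state=k).fit_predict(X)
               for k in (2, 3, 4, 5)]

# Structure information: co-association matrix, i.e. the fraction of base
# clusterings that place samples i and j in the same cluster.
n = X.shape[0]
coassoc = np.zeros((n, n))
for labels in base_labels:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(base_labels)

# Latent embedding of the affinity graph (stand-in for the GNN), then a
# Gaussian mixture model for the final consensus assignment.
Z = SpectralEmbedding(n_components=3, affinity="precomputed",
                      random_state=0).fit_transform(coassoc)
consensus = GaussianMixture(n_components=3, random_state=0).fit_predict(Z)
print(np.bincount(consensus))
```

Note that, unlike the paper's unified optimization model, this sketch embeds and clusters in two separate steps; jointly training the GNN embedding with the GMM objective is the framework's distinguishing contribution.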