1. Lu W, Ma C, Pedrycz W, Yang J. Design of Granular Model: A Method Driven by Hyper-Box Iteration Granulation. IEEE Transactions on Cybernetics 2023;53:2899-2913. PMID: 34767519. DOI: 10.1109/tcyb.2021.3124235.
Abstract: Recently, granular models have been highlighted in system modeling and applied to many fields, since their outcomes are information granules supporting human-centric comprehension and reasoning. In this study, a design method for granular models driven by hyper-box iteration granulation is proposed. The method consists mainly of partitioning the input space, forming input hyper-box information granules with confidence levels, and granulating the output data corresponding to the input hyper-box information granules. The input hyper-box information granules are formed by running a hyper-box iteration granulation algorithm, governed by information granularity, on the input space, and the output data corresponding to the input hyper-box information granules are granulated by an improved principle of justifiable granularity to produce triangular fuzzy information granules. Compared with existing granular models, the resulting one yields more accurate numeric outcomes and preferable granular outcomes simultaneously. Experiments on synthetic and publicly available datasets demonstrate the superiority of the granular model designed by the proposed method at both granular and numeric levels. The impact of the parameters involved in the proposed design method on the performance of the ensuing granular model is also explored.
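The hyper-box granules in entry 1 are, at their simplest, axis-aligned boxes spanned by per-feature lower and upper bounds. The sketch below is not the iterative algorithm of the cited paper; it only illustrates, under that minimal reading, how such a box can be formed from a point cluster and how its coverage of a dataset can be measured (all names and data are illustrative).

```python
def hyperbox(points):
    """Axis-aligned hyper-box: per-feature (lower, upper) bounds of a point set."""
    dims = range(len(points[0]))
    lows = [min(p[d] for p in points) for d in dims]
    highs = [max(p[d] for p in points) for d in dims]
    return lows, highs

def coverage(box, data):
    """Fraction of data points falling inside the hyper-box."""
    lows, highs = box
    inside = sum(
        all(lo <= x <= hi for x, lo, hi in zip(p, lows, highs)) for p in data
    )
    return inside / len(data)

# toy example: box built from a small cluster, coverage measured on all data
cluster = [(0.1, 0.2), (0.3, 0.1), (0.2, 0.4)]
data = cluster + [(0.9, 0.9)]
box = hyperbox(cluster)
print(box)                   # ([0.1, 0.1], [0.3, 0.4])
print(coverage(box, data))   # 0.75 (3 of 4 points fall inside)
```

Tightening the box raises its specificity but drops points, which is the tradeoff any granulation criterion, including the confidence levels in entry 1, has to manage.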
2. Liu P, Li Y, Zhang X, Pedrycz W. A Multiattribute Group Decision-Making Method With Probabilistic Linguistic Information Based on an Adaptive Consensus Reaching Model and Evidential Reasoning. IEEE Transactions on Cybernetics 2023;53:1905-1919. PMID: 35486566. DOI: 10.1109/tcyb.2022.3165030.
Abstract: This article proposes a new multiattribute group decision-making (MAGDM) method with probabilistic linguistic information that considers three aspects: the allocation of ignorance information, the realization of group consensus, and the aggregation of assessments. To allocate ignorance information, an optimization model based on minimizing the distances among experts is developed. To measure the consensus degree, a consensus index that considers the information granules of linguistic terms (LTs) is defined. On this basis, an optimization model is established to reach group consensus adaptively by optimizing the allocation of the information granules of LTs with the particle swarm optimization (PSO) algorithm. To reduce information loss during the aggregation phase, the process of generating comprehensive assessments of alternatives with the evidential reasoning (ER) algorithm is presented. A new method is thus developed based on the adaptive consensus reaching (ACR) model and the ER algorithm. Finally, the applicability of the proposed method is demonstrated by solving a selection problem for a financial technology company, and comparative analyses show its advantages.
3. Zhang G, Zhu X, Yin L, Pedrycz W, Li Z. Granular data representation under privacy protection: Tradeoff between data utility and privacy via information granularity. Appl Soft Comput 2022. DOI: 10.1016/j.asoc.2022.109808.
4. A novel multi-level framework for anomaly detection in time series data. Appl Intell 2022. DOI: 10.1007/s10489-022-04016-y.
5.
6. An optimization viewpoint on evaluation-based interval-valued multi-attribute three-way decision model. Inf Sci (N Y) 2022. DOI: 10.1016/j.ins.2022.04.055.
7. Zhu X, Pedrycz W, Li Z. A Two-Stage Approach for Constructing Type-2 Information Granules. IEEE Transactions on Cybernetics 2022;52:2214-2224. PMID: 32721903. DOI: 10.1109/tcyb.2020.2965967.
Abstract: In this article, we are concerned with the formation of type-2 information granules through a two-stage approach. We present a comprehensive algorithmic framework that gives rise to information granules of a higher type (type-2, to be specific) such that the key structure of the local granular data, their topologies, and their diversities become fully reflected and quantified. In contrast to traditional collaborative clustering, where local structures (information granules) are obtained by running algorithms on the local datasets and communicating findings across sites, we propose a way of characterizing granular data by forming a suite of higher-type information granules that reveal the overall structure of a collection of locally available datasets. Information granules built at the lower level on the basis of local data sources are weighted by the number of data points they represent, while the information granules formed at the higher level of the hierarchy are more abstract and general, facilitating a hierarchical description of data at different levels of detail. The construction of information granules is completed by resorting to fuzzy clustering algorithms (more specifically, the well-known Fuzzy C-Means). In the formation of information granules, we follow a fundamental principle of granular computing, viz., the principle of justifiable granularity. Experimental studies concerning selected publicly available machine-learning datasets are reported.
8. Yang J, Luo T, Zeng L, Jin X. The cost-sensitive approximation of neighborhood rough sets and granular layer selection. Journal of Intelligent & Fuzzy Systems 2022. DOI: 10.3233/jifs-212234.
Abstract: Neighborhood rough sets (NRS) are an extended model of classical rough sets. The NRS describe the target concept by upper and lower neighborhood approximation boundaries. However, a method for approximately describing an uncertain target concept with existing neighborhood information granules has not been given. To solve this problem, a cost-sensitive approximation model of the NRS is proposed in this paper, and its related properties are analyzed. To obtain the optimal approximation granular layer, a cost-sensitive progressive mechanism is proposed that takes user requirements into account. The case study shows that a reasonable granular layer and its approximation can be obtained under certain constraints, which suits cost-sensitive application scenarios. The experimental results show the advantage of the proposed approximation model; moreover, the decision cost of the NRS approximation model decreases monotonically as the granularity becomes finer.
Affiliations:
- Jie Yang: School of Physics and Electronic Science, Zunyi Normal University, Zunyi, China; National Pilot School of Software, Yunnan University, Kunming, China
- Tian Luo: School of Physics and Electronic Science, Zunyi Normal University, Zunyi, China
- Lijuan Zeng: School of Physics and Electronic Science, Zunyi Normal University, Zunyi, China
- Xin Jin: National Pilot School of Software, Yunnan University, Kunming, China
9. Xu K, Pedrycz W, Li Z. Augmentation of the reconstruction performance of Fuzzy C-Means with an optimized fuzzification factor vector. Knowl Based Syst 2021. DOI: 10.1016/j.knosys.2021.106951.
10. Lu W, Pedrycz W, Yang J, Liu X. Granular Fuzzy Modeling Guided Through the Synergy of Granulating Output Space and Clustering Input Subspaces. IEEE Transactions on Cybernetics 2021;51:2625-2638. PMID: 31021786. DOI: 10.1109/tcyb.2019.2909037.
Abstract: As an augmentation of classic fuzzy models, granular fuzzy models (GFMs) have been applied to many fields, being in rapport with experimental data, models, and users. However, most existing methods for constructing GFMs are based on the principle of optimal allocation of information granularity, which requires that a numeric model be provided in advance. In this paper, a straightforward modeling method is proposed to construct a GFM directly from experimental data. The method first granulates the output space to form interval information granules with distinct semantics and then uses them to partition the entire input space into a series of input subspaces. Subsequently, an initial GFM emerges by using "If-Then" rules to relate the interval information granules positioned in the output space with structures, expressed as prototypes, that are produced by clustering the individual input subspaces. The initial GFM is then refined by continuously migrating prototypes within the individual input subspaces. Experimental studies using a synthetic dataset and several real-world datasets are reported. They offer useful insight into the feasibility and effectiveness of the proposed modeling method and reveal the impact of parameters on the performance of the ensuing GFMs. An application example is also presented to exhibit the advantages of the resulting GFM.
11. Ouyang T, Pedrycz W, Reyes-Galaviz OF, Pizzi NJ. Granular Description of Data Structures: A Two-Phase Design. IEEE Transactions on Cybernetics 2021;51:1902-1912. PMID: 30605118. DOI: 10.1109/tcyb.2018.2887115.
Abstract: This study is concerned with describing large numeric data by building a limited collection of representative information granules, with the objective of capturing the structure of the original data. The proposed development scheme consists of two steps. First, a clustering algorithm characterized by high flexibility in coping with diverse data-structure geometries and by efficient computational overhead is invoked. Second, a clustering algorithm is applied to the clusters already formed during the first phase, yielding a collection of numeric prototypes, which are then generalized into granular prototypes. The quality of the granular prototypes is quantified, and their build-up is supported by mechanisms of granular computing such as the principle of justifiable granularity. In this paper, the clustering algorithms DBSCAN and Fuzzy C-Means are used in the successive phases of the proposed approach. Experimental studies concerning synthetic data and publicly available data are covered, and the performance of the developed approach is assessed along with a comparative analysis.
12. Zhu X, Pedrycz W, Li Z. A Development of Granular Input Space in System Modeling. IEEE Transactions on Cybernetics 2021;51:1639-1650. PMID: 30892261. DOI: 10.1109/tcyb.2019.2899633.
Abstract: In this paper, we elaborate on a new design approach to the development and analysis of granular input spaces and the ensuing granular modeling. Given a numeric model (no matter what specific design methodology has been used to construct it and what architecture has been adopted), we form a granular input space by allocating a certain level of information granularity across the input variables. The formation of a granular input space helps us gain better insight into the ranking of input variables with respect to their precision (variables with a lower level of information granularity need to be specified precisely when estimating the inputs). As a consequence, for granular inputs, the outputs of the granular model are also information granules (say, intervals, fuzzy sets, rough sets, etc.). It is shown that the process of forming the granular input space can be cast as an optimization of the allocation of information granularity across the input variables, so that the specificity of the corresponding granular outputs of the model becomes the highest while the coverage of data is maximized. The construction of the granular input space dwells upon two fundamental principles of granular computing: the principle of justifiable granularity and the optimal allocation of information granularity. The quality of the granular input space is quantified in terms of two conflicting criteria, namely the specificity of the results produced by the granular model and the coverage of experimental data delivered by the model. In the ensuing optimization problem, one maximizes the product of specificity and coverage; differential evolution is engaged in this optimization task. The experimental studies involve both a synthetic dataset and data coming from the machine-learning repository.
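The coverage-specificity product maximized in entry 12 recurs throughout this list as the core of the principle of justifiable granularity. A minimal one-dimensional sketch (my own illustration, not code from any cited paper): given data and a numeric representative, the upper bound of an interval granule is the candidate that maximizes coverage times specificity.

```python
def justifiable_upper_bound(data):
    """Upper bound b of an interval granule [med, b], chosen by maximizing
    coverage(b) * specificity(b) per the principle of justifiable granularity."""
    xs = sorted(data)
    med = xs[len(xs) // 2]       # numeric representative (median)
    span = max(xs) - med         # normalizes specificity into [0, 1]
    best_b, best_q = med, 0.0
    for b in (x for x in xs if x > med):
        cov = sum(med <= x <= b for x in xs) / len(xs)   # fraction of data covered
        spec = 1.0 - (b - med) / span                    # shorter interval = more specific
        if cov * spec > best_q:
            best_b, best_q = b, cov * spec
    return best_b

print(justifiable_upper_bound([1, 2, 3, 4, 10]))  # 4: covering 10 would cost all specificity
```

The same product criterion extends to the multi-variable setting of entry 12, where a budget of granularity is distributed across inputs (there by differential evolution rather than enumeration).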
13. Fang Y, Zhou D, Li K, Ju Z, Liu H. Attribute-Driven Granular Model for EMG-Based Pinch and Fingertip Force Grand Recognition. IEEE Transactions on Cybernetics 2021;51:789-800. PMID: 31425131. DOI: 10.1109/tcyb.2019.2931142.
Abstract: Fine multifunctional prosthetic hand manipulation requires precise control of the pinch type and the corresponding force, and it is a challenge to decode both aspects from myoelectric signals. This paper proposes an attribute-driven granular model (AGrM) under a machine-learning scheme to solve this problem. The model utilizes an additionally captured attribute as the latent variable for a supervised granulation procedure, and it was applied to EMG-based pinch-type classification and fingertip force grand prediction. In the experiments, 16 channels of surface electromyographic signals (i.e., the main attribute) and continuous fingertip force (i.e., the subattribute) were simultaneously collected while subjects performed eight types of hand pinches. The use of the AGrM improved the pinch-type recognition accuracy by 1.8%, to around 97.2%, when constructing eight granules for each grasping type, and achieved more than 90% force grand prediction accuracy at any granular level greater than six. Further, sensitivity analysis verified its robustness with respect to different channel combinations and interferences. In comparison with other clustering-based granulation methods, the AGrM achieved comparable pinch recognition accuracy but had the lowest computational cost and the highest force grand prediction accuracy.
14. Abstract: We lay the theoretical foundations of a novel model, termed picture hesitant fuzzy rough sets, based on picture hesitant fuzzy relations, and we combine this notion with the ideas of multi-granulation rough sets. As a consequence, a new multi-granulation rough set model on two universes, termed a multi-granulation picture hesitant fuzzy rough set, is developed. When the universes coincide or play a symmetric role, the concept assumes the standard format. In this context, we put forward two new classes of multi-granulation picture hesitant fuzzy rough sets, namely the optimistic and pessimistic multi-granulation picture hesitant fuzzy rough sets. Further, we investigate the relationships among these two concepts and picture hesitant fuzzy rough sets.
15.
16. Bemani-N. A, Akbarzadeh-T. MR. A hybrid adaptive granular approach to Takagi–Sugeno–Kang fuzzy rule discovery. Appl Soft Comput 2019. DOI: 10.1016/j.asoc.2019.105491.
17. Truong HQ, Ngo LT, Pham LT. Interval Type-2 Fuzzy Possibilistic C-Means Clustering Based on Granular Gravitational Forces and Particle Swarm Optimization. Journal of Advanced Computational Intelligence and Intelligent Informatics 2019. DOI: 10.20965/jaciii.2019.p0592.
Abstract: The interval type-2 fuzzy possibilistic C-means clustering (IT2FPCM) algorithm improves on the fuzzy possibilistic C-means clustering (FPCM) algorithm by addressing high degrees of noise and uncertainty. However, the IT2FPCM algorithm still faces drawbacks, including sensitivity to cluster centroid initialization, slow processing speed, and the possibility of being easily trapped in local optima. To overcome these drawbacks and better address noise and uncertainty, we propose an IT2FPCM method based on granular gravitational forces and particle swarm optimization (PSO). The method is based on the idea of gravitational forces grouping the data points into granules and then clustering on the granular space using a hybrid of the IT2FPCM and PSO algorithms. The proposed method also determines the initial centroids by merging granules until the number of granules equals the number of clusters. By reducing the number of elements in the granular space, the proposed algorithms also significantly improve performance when clustering large datasets. Experimental results on different datasets are reported and compared with other approaches to demonstrate the advantages of the proposed method.
18.
19. Shen Y, Pedrycz W, Wang X. Clustering Homogeneous Granular Data: Formation and Evaluation. IEEE Transactions on Cybernetics 2019;49:1391-1402. PMID: 29994448. DOI: 10.1109/tcyb.2018.2802453.
Abstract: In this paper, we develop a comprehensive conceptual and algorithmic framework to cope with the problem of clustering homogeneous information granules. While there have been several approaches to coping with granular (viz., non-numeric) data, the origin of the granular data considered there is somewhat unclear and, as a consequence, the clustering results lack a full-fledged interpretation. In this paper, we offer a holistic view of clustering information granules and an evaluation of the clustering results. We start with a process of forming information granules with the use of the principle of justifiable granularity (PJG). In this regard, we discuss a number of parameters used in the development of the information granules and quantify the quality of the granules produced in this manner. In the sequel, Fuzzy C-Means is applied to cluster the derived information granules, which are represented in a parametric manner and associated with weights resulting from the usage of the PJG. The quality of the clustering results is evaluated through the reconstruction criterion (quantifying the concepts of information granulation and degranulation). A suite of experiments using synthetic and publicly available datasets is reported to quantify the performance of the proposed approach and highlight its key features.
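The reconstruction (granulation-degranulation) criterion used in entry 19 can be stated compactly: encode a datum through its Fuzzy C-Means memberships in a set of prototypes, decode it as the membership-weighted mean of those prototypes, and score the squared error of the round trip. A hedged one-dimensional sketch, illustrative only, using the standard FCM membership formula with fuzzifier m = 2:

```python
def memberships(x, protos, m=2.0):
    """FCM membership degrees of scalar x in each prototype (granulation)."""
    if any(x == v for v in protos):            # x sits exactly on a prototype
        return [1.0 if x == v else 0.0 for v in protos]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((abs(x - vi) / abs(x - vj)) ** p for vj in protos)
            for vi in protos]

def reconstruct(x, protos, m=2.0):
    """Degranulation: prototype average weighted by u_i^m."""
    u = memberships(x, protos, m)
    w = [ui ** m for ui in u]
    return sum(wi * vi for wi, vi in zip(w, protos)) / sum(w)

protos = [0.0, 10.0]
print(reconstruct(5.0, protos))                                   # 5.0: the midpoint survives
err = sum((x - reconstruct(x, protos)) ** 2 for x in [1.0, 2.5, 7.5])
print(err > 0.0)                                                  # True: off-center points do not
```

The residual error is exactly what the reconstruction criterion aggregates over a dataset; lower error indicates that the prototypes (or granules built around them) represent the data faithfully.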
20. Pedrycz W. Local-Density-Based Optimal Granulation and Manifold Information Granule Description. IEEE Transactions on Cybernetics 2018;48:2795-2808. PMID: 28945607. DOI: 10.1109/tcyb.2017.2750481.
Abstract: Constructing information granules (IGs) has been of significant interest to the discipline of granular computing. The principle of justifiable granularity has been proposed to guide the design of IGs, opening an avenue for building IGs on the basis of well-defined and intuitively appealing principles. However, how to improve the efficiency and accuracy of the resulting constructs remains an open issue. In this paper, we present a local-density-based optimal granulation model (LoDOG) with evident advantages: 1) it can detect arbitrarily shaped IGs and 2) it finds the optimal granulation solutions with O(N) complexity once the leading-tree structure has been constructed. We describe IGs of arbitrary shape using a small collection of landmark points positioned on the skeleton of the underlying manifold, which supports approximate reconstruction of the original dataset. A dissimilarity metric is developed to evaluate the quality of the obtained reconstruction, and the interpretability of LoDOG IGs is discussed. Theoretical analysis and empirical evaluations are provided to demonstrate the effectiveness of LoDOG and the manifold description.
21. Zhu X, Pedrycz W, Li Z. Granular Data Description: Designing Ellipsoidal Information Granules. IEEE Transactions on Cybernetics 2017;47:4475-4484. PMID: 28113415. DOI: 10.1109/tcyb.2016.2612226.
Abstract: Granular computing (GrC) has emerged as a unified conceptual and processing framework, and information granules are the fundamental constructs that permeate its concepts and models. This paper is concerned with the design of a collection of meaningful, easily interpretable ellipsoidal information granules with the use of the principle of justifiable granularity, taking into consideration the reconstruction abilities of the designed information granules. The principle of justifiable granularity supports the design of information granules based on numeric or granular evidence and aims to achieve a compromise between the justifiability and the specificity of the information granules to be constructed. A two-stage development strategy for constructing justifiable information granules is considered. First, a collection of numeric prototypes is determined with the use of fuzzy clustering. Second, the lengths of the semi-axes of the ellipsoidal information granules to be formed around these prototypes are optimized. Two optimization criteria are introduced and studied. Experimental studies involving a synthetic dataset and datasets coming from the machine-learning repository are reported.
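The ellipsoidal granules of entry 21 are characterized by a numeric prototype (center) and optimized semi-axis lengths. As a minimal illustration, not the paper's optimization, just the membership test an axis-aligned ellipsoid implies, a point belongs to the granule when the normalized quadratic form is at most 1:

```python
def in_ellipsoid(x, center, semi_axes):
    """Axis-aligned ellipsoidal granule: inside iff sum ((x_d - c_d) / a_d)^2 <= 1."""
    return sum(((xi - ci) / ai) ** 2
               for xi, ci, ai in zip(x, center, semi_axes)) <= 1.0

center, axes = (0.0, 0.0), (2.0, 1.0)           # prototype and semi-axis lengths
print(in_ellipsoid((1.0, 0.5), center, axes))   # True:  (1/2)^2 + (0.5)^2 = 0.5
print(in_ellipsoid((2.0, 1.0), center, axes))   # False: 1 + 1 = 2 > 1
```

Enlarging the semi-axes raises coverage of the data but lowers specificity, which is exactly the tradeoff the cited optimization criteria balance.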
22. Hu X, Pedrycz W, Wu G, Wang X. Data reconstruction with information granules: An augmented method of fuzzy clustering. Appl Soft Comput 2017. DOI: 10.1016/j.asoc.2017.02.014.
23. Kok VJ, Chan CS. GrCS: Granular Computing-Based Crowd Segmentation. IEEE Transactions on Cybernetics 2017;47:1157-1168. PMID: 26992194. DOI: 10.1109/tcyb.2016.2538765.
Abstract: Crowd segmentation is important as the basis for a wide range of crowd analysis tasks such as density estimation and behavior understanding. However, due to inter-occlusions, perspective distortion, cluttered backgrounds, and random crowd distribution, localizing crowd segments is technically a very challenging task. This paper proposes a novel crowd segmentation framework based on granular computing (GrCS) that enables the problem of crowd segmentation to be conceptualized at different levels of granularity and maps problems into computationally tractable subproblems. It shows that by exploiting the correlation among pixel granules, we are able to aggregate structurally similar pixels into meaningful atomic structure granules. This is useful for outlining natural boundaries between crowd and background (i.e., non-crowd) regions. From the structure granules, we infer the crowd and background regions by granular information classification. GrCS is scene-independent and can be applied effectively to crowd scenes with a variety of physical layouts and levels of crowdedness. Extensive experiments have been conducted on hundreds of real and synthetic crowd scenes. The results demonstrate that by exploiting the correlation among granules, we can outline the natural boundaries of structurally similar crowd and background regions necessary for crowd segmentation.
24. Ahmed MM, Isa NAM. Knowledge base to fuzzy information granule: A review from the interpretability-accuracy perspective. Appl Soft Comput 2017. DOI: 10.1016/j.asoc.2016.12.055.
25.
26.
27. Zhao J, Han Z, Pedrycz W, Wang W. Granular Model of Long-Term Prediction for Energy System in Steel Industry. IEEE Transactions on Cybernetics 2016;46:388-400. PMID: 26168454. DOI: 10.1109/tcyb.2015.2445918.
Abstract: Sound energy scheduling and allocation is of paramount significance for the current steel industry, and quantitative prediction of energy media is regarded as the prerequisite for such challenging tasks. In this paper, a long-term prediction method for energy flows is proposed using a granular computing-based approach that considers industry-driven semantics and granulates the initial data based on the specificity of the manufacturing processes. When forming information granules on the basis of experimental data, we propose to deal with unequal-length temporal granules by exploiting dynamic time warping, which becomes instrumental to the realization of the prediction model. The model engages the fuzzy C-means clustering method. To quantify the performance of the proposed method, real-world industrial energy data coming from a steel plant in China are employed. The experimental results demonstrate that the proposed method is superior to some other data-driven methods and is capable of satisfying the requirements of practically viable prediction.
28. Xu W, Li W. Granular Computing Approach to Two-Way Learning Based on Formal Concept Analysis in Fuzzy Datasets. IEEE Transactions on Cybernetics 2016;46:366-379. PMID: 25347892. DOI: 10.1109/tcyb.2014.2361772.
Abstract: The main task of granular computing (GrC) is representing, constructing, and processing information granules. Information granules are formalized in many different approaches, each of which emphasizes the same fundamental facet in a different way. In this paper, we propose a novel GrC method for machine learning that uses a formal-concept description of information granules. Based on information granules, the model and mechanism of a two-way learning system are constructed in fuzzy datasets. We address how to train arbitrary fuzzy information granules to become necessary, sufficient, and necessary-and-sufficient fuzzy information granules. Moreover, an algorithm for the presented approach is established, and its complexity is analyzed carefully. Finally, to interpret and help understand the theories and the algorithm, a real-life case study is considered, and an experimental evaluation is performed on five datasets from the University of California, Irvine, which is valuable for applying these theories to practical issues.
29. Granular meta-clustering based on hierarchical, network, and temporal connections. Granular Computing 2016. DOI: 10.1007/s41066-015-0007-9.
30. Hu J, Li T, Wang H, Fujita H. Hierarchical cluster ensemble model based on knowledge granulation. Knowl Based Syst 2016. DOI: 10.1016/j.knosys.2015.10.006.
31.
32. Granular computing, computational intelligence, and the analysis of non-geometric input spaces. Granular Computing 2015. DOI: 10.1007/s41066-015-0003-0.
33. Clustering in augmented space of granular constraints: A study in knowledge-based clustering. Pattern Recognit Lett 2015. DOI: 10.1016/j.patrec.2015.08.019.
34. Abstract: With the rapid development of uncertain artificial intelligence and the arrival of the big-data era, conventional clustering analysis and granular computing fail to satisfy the requirements of intelligent information processing. There is an essential relationship between granular computing and clustering analysis, so some researchers have tried to combine the two: working from the idea of granularity, they extend research in clustering analysis and search for the best clustering results with the help of the basic theories and methods of granular computing. The granularity clustering methods proposed and studied in this way have attracted increasing attention. This paper first summarizes the background of granularity clustering and the intrinsic connection between granular computing and clustering analysis, then reviews the research status and the various methods of granularity clustering, and finally analyzes existing problems and proposes directions for further research.
35.
36.
37.
38. Salehi S, Selamat A, Reza Mashinchi M, Fujita H. The synergistic combination of particle swarm optimization and fuzzy sets to design granular classifier. Knowl Based Syst 2015. DOI: 10.1016/j.knosys.2014.12.017.
39
|
Song M, Wang Y. Human centricity and information granularity in the agenda of theories and applications of soft computing. Appl Soft Comput 2015. [DOI: 10.1016/j.asoc.2014.04.040] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
|
40
|
Wang S, Watada J, Pedrycz W. Granular robust mean-CVaR feedstock flow planning for waste-to-energy systems under integrated uncertainty. IEEE TRANSACTIONS ON CYBERNETICS 2014; 44:1846-1857. [PMID: 25222726 DOI: 10.1109/tcyb.2013.2296500] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Abstract
In the context of robust optimization with information granules for distributional parameters, this paper investigates a two-stage waste-to-energy feedstock flow planning problem with uncertain capacity expansion costs. The objective is to minimize the worst-case overall loss under a mean-risk criterion, where the risk is measured by a conditional value-at-risk operator. As a salient feature, an integrated uncertainty is considered that comprises not only the uncertainty in the distribution shapes of the uncertain variables but also the manifold uncertainties of the mean parameters. To tackle robust optimization under such integrated uncertainty, we first discuss a distributionally robust two-stage feedstock flow planning model with precise mean parameters that handles the uncertainty in distribution shape; the model can be equivalently transformed into a linear program (LP). Furthermore, the precise-mean-based robust model is extended to the case of multifaceted uncertainty in the mean parameters, which are allowed to assume intervals, historical-data-based probabilistic estimates, and/or human-knowledge-centric fuzzy-set estimates under different circumstances. These multifaceted uncertain mean parameters are uniformly represented by information granules, and a granular robust optimization model is then developed that maximizes the robustness of the solution within a shortfall tolerance and realizes a tradeoff between solution conservativeness and robustness. It is shown that the granular robust model is equivalent to solving a series of LPs and can be efficiently handled by a nested binary search algorithm. Finally, the computational study illustrates the model performance and solution analysis, and underlines the much higher scalability of the developed robust model compared with the stochastic programming approach.
Collapse
|
41
|
Systematic studies on three-way decisions with interval-valued decision-theoretic rough sets. Inf Sci (N Y) 2014. [DOI: 10.1016/j.ins.2014.02.054] [Citation(s) in RCA: 124] [Impact Index Per Article: 12.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
|
42
|
Zhang H, Pedrycz W, Miao D, Wei Z. From principal curves to granular principal curves. IEEE TRANSACTIONS ON CYBERNETICS 2014; 44:748-760. [PMID: 23996588 DOI: 10.1109/tcyb.2013.2270294] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
Principal curves, arising as an essential construct in dimensionality reduction and data analysis, have recently attracted much attention from both theoretical and practical perspectives. In many real-world situations, however, the efficiency of existing principal curves algorithms is often arguable, in particular when dealing with massive data, owing to the associated high computational complexity. A certain drawback of these constructs stems from the fact that in several applications principal curves cannot fully capture some essential problem-oriented facets of the data, such as width, aspect ratio, and width change. Information granulation is a powerful tool supporting the processing and interpretation of massive data. In this paper, invoking the underlying ideas of information granulation, we propose a granular principal curves approach, regarded as an extension of principal curves algorithms, to improve efficiency and achieve a sound accuracy-efficiency tradeoff. First, large amounts of numerical data are granulated into C interval information granules developed with the use of fuzzy C-means clustering and the two criteria of information granulation, which significantly reduces the amount of data to be processed at the later phase of the overall design. Granular principal curves are then constructed by determining the upper and lower bounds of the interval data. Finally, we develop an objective function using the criteria of information confidence and specificity to evaluate the granular output formed by the principal curves. We also optimize the granular principal curves by adjusting the level of information granularity (the number of clusters), which is realized with the aid of particle swarm optimization. A number of numeric studies completed for synthetic and real-world datasets provide a useful, quantifiable insight into the effectiveness of the proposed algorithm.
Collapse
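The first granulation stage described in the abstract above can be sketched as follows; the bare-bones fuzzy C-means routine and the way interval bounds are read off the peak memberships are simplified assumptions for illustration, not the authors' exact two-criterion construction.

```python
import numpy as np

def fcm_interval_granules(x, c=3, m=2.0, iters=100, seed=0):
    """Granulate 1-D data into up to c interval information granules:
    run fuzzy C-means, then take each granule's bounds from the points
    whose highest membership falls in that cluster. A simplified sketch."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    centers = rng.choice(x, size=c, replace=False)      # initial prototypes
    for _ in range(iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))              # inverse-distance memberships
        u /= u.sum(axis=1, keepdims=True)               # normalize over clusters
        centers = (u ** m).T @ x / (u ** m).sum(axis=0) # weighted-mean update
    labels = u.argmax(axis=1)
    granules = [(x[labels == k].min(), x[labels == k].max())
                for k in range(c) if np.any(labels == k)]
    return sorted(granules)
```

Each returned pair is the lower and upper bound of one interval granule; the paper then builds the granular principal curves from exactly such bounds.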
|
43
|
|
44
|
Pedrycz W, Al-Hmouz R, Morfeq A, Balamash A. The design of free structure granular mappings: the use of the principle of justifiable granularity. IEEE TRANSACTIONS ON CYBERNETICS 2013; 43:2105-2113. [PMID: 23757519 DOI: 10.1109/tcyb.2013.2240384] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
The study introduces a concept of mappings realized in the presence of information granules and offers a design framework supporting the formation of such mappings. Information granules are conceptually meaningful entities formed on the basis of a large number of experimental input–output numeric data available for the construction of the model. We develop a conceptually and algorithmically sound way of forming information granules. Considering the directional nature of the mapping to be formed, this directionality needs to be taken into account when developing information granules. The property of directionality implies that while the information granules in the input space can be constructed with a great deal of flexibility, the information granules formed in the output space have to inherently relate to those built in the input space. The input space is granulated by running a clustering algorithm; for illustrative purposes, the focus here is on fuzzy clustering realized with the aid of the fuzzy C-means algorithm. The information granules in the output space are constructed with the aid of the principle of justifiable granularity (one of the underlying fundamental conceptual pursuits of granular computing). The construct exhibits two important features. First, the information granules are formed in the presence of information granules already constructed in the input space (a realization reflective of the direction of the mapping from the input to the output space). Second, the principle of justifiable granularity does not confine the realization of information granules to a single formalism such as fuzzy sets but helps form granules expressed in any required formalism of information granulation. The quality of the granular mapping (viz. the mapping realized for the information granules formed in the input and output spaces) is expressed in terms of the coverage criterion (articulating how well the experimental data are “covered” by the information granules produced by the granular mapping for any input experimental data). Some parametric studies are reported, quantifying the performance of the granular mapping (expressed in terms of the coverage and specificity criteria) versus the values of certain parameters utilized in the construction of output information granules through the principle of justifiable granularity. The plots of coverage–specificity dependency help determine a knee point and reach a sound compromise between these two conflicting requirements imposed on the quality of the granular mapping. Furthermore, the quality of the mapping is quantified with regard to the number of information granules (implying a certain granularity of the mapping). A series of experiments is reported as well.
Collapse
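The principle of justifiable granularity invoked in the abstract above balances coverage (how much data a granule embraces) against specificity (how narrow it is). A minimal one-dimensional sketch, assuming an interval granule built around the median and a simple linear specificity measure (both illustrative choices, not the paper's exact formulation), might look like:

```python
import numpy as np

def justifiable_granule(data, alpha=1.0):
    """Build an interval granule [a, b] around the median by maximizing
    the product of coverage (fraction of data inside the half-interval)
    and specificity (narrowness relative to the data range). The weight
    `alpha` on specificity is an assumed parameter."""
    data = np.asarray(data, dtype=float)
    med = np.median(data)
    rng = data.max() - data.min()

    def best_bound(candidates, side):
        best, best_v = med, 0.0
        for c in candidates:
            if side == "upper":
                cov = np.mean((data >= med) & (data <= c))
                spec = max(0.0, 1.0 - (c - med) / rng)
            else:
                cov = np.mean((data >= c) & (data <= med))
                spec = max(0.0, 1.0 - (med - c) / rng)
            v = cov * spec ** alpha      # justifiability of this bound
            if v > best_v:
                best_v, best = v, c
        return best

    b = best_bound(data[data > med], "upper")
    a = best_bound(data[data < med], "lower")
    return a, b
```

Raising `alpha` penalizes wide granules more heavily and yields tighter intervals, which is the same coverage-specificity tradeoff the paper resolves at the knee point of the dependency plot.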
|
45
|
Pedrycz W, Homenda W. Building the fundamentals of granular computing: A principle of justifiable granularity. Appl Soft Comput 2013. [DOI: 10.1016/j.asoc.2013.06.017] [Citation(s) in RCA: 194] [Impact Index Per Article: 17.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
46
|
|
47
|
Grimaldo F, Paolucci M. A Simulation Of Disagreement For Control Of Rational Cheating In Peer Review. ECMS 2012 PROCEEDINGS EDITED BY: K. G. TROITZSCH, M. MOEHRING, U. LOTZMANN 2012. [DOI: 10.7148/2012-0676-0682] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 09/02/2023]
|