51
Yun SS, Choi MT, Kim M, Song JB. Intention Reading from a Fuzzy-Based Human Engagement Model and Behavioural Features. Int J Adv Robot Syst 2012. [DOI: 10.5772/50648] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2]
Abstract
This paper presents a novel approach for a quantitative appraisal model to identify human intent so as to interact with a robot and determine an engagement level. To efficiently select an attention target for communication in multi-person interactions, we propose a fuzzy-based classification algorithm which is developed by an incremental learning procedure and which facilitates a multi-dimensional pattern analysis for ambiguous human behaviours. From acquired participants' non-verbal behaviour patterns, we extract the dominant feature data, analyse the generality of the model and verify the effectiveness for proper and prompt gaze behaviour. The proposed model works successfully in multiple people interactions.
Affiliation(s)
- Sang-Seok Yun: Department of Mechanical Engineering, Korea University, Korea; Center for Intelligent Robotics at Korea Institute of Science and Technology (KIST), Korea
- Mun-Taek Choi: Center for Intelligent Robotics at Korea Institute of Science and Technology (KIST), Korea
- Munsang Kim: Center for Intelligent Robotics at Korea Institute of Science and Technology (KIST), Korea
- Jae-Bok Song: Department of Mechanical Engineering, Korea University, Korea
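The engagement model in entry 51 maps non-verbal behavioural cues to a fuzzy engagement level before selecting an attention target. The paper's actual feature set, membership functions, and rule base are not reproduced here; the Python sketch below only illustrates the general pattern with two hypothetical cues (interpersonal distance and gaze duration) and hand-picked, purely illustrative parameters.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to zero at c."""
    return float(np.clip(min((x - a) / (b - a + 1e-12),
                             (c - x) / (c - b + 1e-12)), 0.0, 1.0))

def engagement_score(distance_m, gaze_s):
    """Toy fuzzy aggregation of two hypothetical behavioural features.

    The membership parameters and rule weights are illustrative only,
    not those of the engagement model in entry 51.
    """
    near = tri(distance_m, 0.0, 0.5, 2.0)    # "person is near the robot"
    looking = tri(gaze_s, 0.0, 3.0, 6.0)     # "sustained gaze toward the robot"
    # Simple weighted aggregation: both cues raise the engagement level.
    return 0.5 * near + 0.5 * looking

print(engagement_score(0.8, 2.5))  # ~0.82 with these toy parameters
```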
52
Leite D, Ballini R, Costa P, Gomide F. Evolving fuzzy granular modeling from nonstationary fuzzy data streams. Evolving Systems 2012. [DOI: 10.1007/s12530-012-9050-9] [Citation(s) in RCA: 96] [Impact Index Per Article: 7.4]
53
An improved approach of self-organising fuzzy neural network based on similarity measures. Evolving Systems 2012. [DOI: 10.1007/s12530-012-9045-6] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.8]
54
Seera M, Lim CP, Ishak D, Singh H. Fault detection and diagnosis of induction motors using motor current signature analysis and a hybrid FMM-CART model. IEEE Transactions on Neural Networks and Learning Systems 2012; 23:97-108. [PMID: 24808459] [DOI: 10.1109/tnnls.2011.2178443] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.6]
Abstract
In this paper, a novel approach to detect and classify comprehensive fault conditions of induction motors using a hybrid fuzzy min-max (FMM) neural network and classification and regression tree (CART) is proposed. The hybrid model, known as FMM-CART, exploits the advantages of both FMM and CART for undertaking data classification and rule extraction problems. A series of real experiments is conducted, whereby the motor current signature analysis method is applied to form a database comprising stator current signatures under different motor conditions. The signal harmonics from the power spectral density are extracted as discriminative input features for fault detection and classification with FMM-CART. A comprehensive list of induction motor fault conditions, viz., broken rotor bars, unbalanced voltages, stator winding faults, and eccentricity problems, has been successfully classified using FMM-CART with good accuracy rates. The results are comparable, if not better, than those reported in the literature. Useful explanatory rules in the form of a decision tree are also elicited from FMM-CART to analyze and understand different fault conditions of induction motors.
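The FMM-CART abstract in entry 54 outlines a pipeline: acquire stator current signatures, estimate their power spectral density, use selected harmonic magnitudes as features, then classify and extract decision rules. The rough Python sketch below follows that outline but substitutes scikit-learn's DecisionTreeClassifier for the CART half, omits the fuzzy min-max stage entirely, and uses placeholder harmonic frequencies and sampling rate rather than the values used in the paper.

```python
import numpy as np
from scipy.signal import welch
from sklearn.tree import DecisionTreeClassifier

FS = 10_000                       # sampling rate of the current sensor (assumed), Hz
HARMONICS = [50, 150, 250, 350]   # candidate supply harmonics (Hz), illustrative only

def harmonic_features(current, fs=FS, harmonics=HARMONICS):
    """Extract PSD magnitudes at selected harmonics from one stator current signature."""
    f, pxx = welch(current, fs=fs, nperseg=2048)
    return np.array([pxx[np.argmin(np.abs(f - h))] for h in harmonics])

def train_fault_tree(X_raw, y):
    """X_raw: list of 1-D current recordings; y: fault labels (e.g. healthy, broken bar, ...)."""
    X = np.vstack([harmonic_features(sig) for sig in X_raw])
    tree = DecisionTreeClassifier(max_depth=4)  # stands in for the CART part of FMM-CART
    tree.fit(X, y)
    return tree
```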
56
Tung SW, Quek C, Guan C. SaFIN: a self-adaptive fuzzy inference network. IEEE Transactions on Neural Networks 2011; 22:1928-1940. [PMID: 22020678] [DOI: 10.1109/tnn.2011.2167720] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.9]
Abstract
There are generally two approaches to the design of a neural fuzzy system: 1) design by human experts, and 2) design through a self-organization of the numerical training data. While the former approach is highly subjective, the latter is commonly plagued by one or more of the following major problems: 1) an inconsistent rulebase; 2) the need for prior knowledge such as the number of clusters to be computed; 3) heuristically designed knowledge acquisition methodologies; and 4) the stability-plasticity tradeoff of the system. This paper presents a novel self-organizing neural fuzzy system, named Self-Adaptive Fuzzy Inference Network (SaFIN), to address the aforementioned deficiencies. The proposed SaFIN model employs a new clustering technique referred to as categorical learning-induced partitioning (CLIP), which draws inspiration from the behavioral category learning process demonstrated by humans. By employing the one-pass CLIP, SaFIN is able to incorporate new clusters in each input-output dimension when the existing clusters are not able to give a satisfactory representation of the incoming training data. This not only avoids the need for prior knowledge regarding the number of clusters needed for each input-output dimension, but also allows SaFIN the flexibility to incorporate new knowledge with old knowledge in the system. In addition, the self-automated rule formation mechanism proposed within SaFIN ensures that it obtains a consistent resultant rulebase. Subsequently, the proposed SaFIN model is employed in a series of benchmark simulations to demonstrate its efficiency as a self-organizing neural fuzzy system, and excellent performances have been achieved.
Affiliation(s)
- Sau Wai Tung: Centre for Computational Intelligence, School of Computer Engineering, Nanyang Technological University, 639798 Singapore.
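SaFIN's CLIP technique, as described in the abstract of entry 56, adds a new cluster in a given input-output dimension whenever no existing cluster represents an incoming value well enough, in a single pass over the data. The exact CLIP formulation is in the paper; the sketch below, which assumes a Gaussian membership and an arbitrary acceptance threshold, only illustrates that "create a cluster when none fits, otherwise refine the winner" behaviour for one dimension.

```python
import numpy as np

def clip_like_1d(values, threshold=0.6, width=0.1, lr=0.05):
    """One-pass, per-dimension cluster creation in the spirit of CLIP (illustrative only)."""
    centres = []
    for x in values:
        if not centres:
            centres.append(x)
            continue
        memberships = [np.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in centres]
        best = int(np.argmax(memberships))
        if memberships[best] < threshold:
            centres.append(x)                          # no cluster fits -> new fuzzy label
        else:
            centres[best] += lr * (x - centres[best])  # refine the winning cluster
    return centres

print(clip_like_1d([0.1, 0.12, 0.5, 0.52, 0.9]))  # roughly three centres
```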
57
Zhang H, Liu J, Ma D, Wang Z. Data-Core-Based Fuzzy Min–Max Neural Network for Pattern Classification. IEEE Transactions on Neural Networks 2011; 22:2339-52. [DOI: 10.1109/tnn.2011.2175748] [Citation(s) in RCA: 103] [Impact Index Per Article: 7.4]
58
He H, Chen S, Li K, Xu X. Incremental Learning From Stream Data. IEEE Transactions on Neural Networks 2011; 22:1901-14. [DOI: 10.1109/tnn.2011.2171713] [Citation(s) in RCA: 92] [Impact Index Per Article: 6.6]
59
Mirghasemi S, Sadoghi Yazdi H, Lotfizad M. A target-based color space for sea target detection. Appl Intell 2011. [DOI: 10.1007/s10489-011-0307-y] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.8]
61
Fuzzy min–max neural networks for categorical data: application to missing data imputation. Neural Comput Appl 2011. [DOI: 10.1007/s00521-011-0574-x] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.0]
63
A semi-supervised dynamic version of Fuzzy K-Nearest Neighbours to monitor evolving systems. Evolving Systems 2010. [DOI: 10.1007/s12530-010-9001-2] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.7]
64
Liu X, Ren Y. Novel artificial intelligent techniques via AFS theory: Feature selection, concept categorization and characteristic description. Appl Soft Comput 2010. [DOI: 10.1016/j.asoc.2009.09.009] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.3]
65
Granular Approach for Evolving System Modeling. Computational Intelligence for Knowledge-Based Systems Design 2010. [DOI: 10.1007/978-3-642-14049-5_35] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9]
66
Nandedkar AV, Biswas PK. A granular reflex fuzzy min-max neural network for classification. IEEE Transactions on Neural Networks 2009; 20:1117-34. [PMID: 19482576] [DOI: 10.1109/tnn.2009.2016419] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.1]
Abstract
Granular data classification and clustering is an upcoming and important issue in the field of pattern recognition. Conventionally, computing is thought to be manipulation of numbers or symbols. However, human recognition capabilities are based on ability to process nonnumeric clumps of information (information granules) in addition to individual numeric values. This paper proposes a granular neural network (GNN) called granular reflex fuzzy min-max neural network (GrRFMN) which can learn and classify granular data. GrRFMN uses hyperbox fuzzy set to represent granular data. Its architecture consists of a reflex mechanism inspired from human brain to handle class overlaps. The network can be trained online using granular or point data. The neuron activation functions in GrRFMN are designed to tackle data of different granularity (size). This paper also addresses an issue to granulate the training data and learn from it. It is observed that such a preprocessing of data can improve performance of a classifier. Experimental results on real data sets show that the proposed GrRFMN can classify granules of different granularity more correctly. Results are compared with general fuzzy min-max neural network (GFMN) proposed by Gabrys and Bargiela and with some classical methods.
Affiliation(s)
- Abhijeet V Nandedkar: Department of Electronics and Tele-Communication Engineering, Shri Guru Gobind Singhji Institute of Engineering and Technology, Maharashtra 431606, India.
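GrRFMN, as the abstract of entry 66 explains, represents classes with hyperbox fuzzy sets and accepts granular (interval) inputs rather than only point data. The sketch below shows one plausible way to evaluate a fuzzy min-max style hyperbox membership for an interval-valued input by testing the granule's lower and upper bounds; the neuron activation functions actually used in GrRFMN differ, so treat this purely as an illustration of the idea.

```python
import numpy as np

def point_membership(x, v, w, gamma=4.0):
    """Fuzzy min-max style membership of point x in hyperbox [v, w]:
    full membership inside the box, decaying outside at rate gamma
    (the exact formulas of Simpson and of GrRFMN differ slightly)."""
    below = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, v - x)))
    above = np.maximum(0, 1 - np.maximum(0, gamma * np.minimum(1, x - w)))
    return float(np.mean(np.minimum(below, above)))

def granule_membership(x_lo, x_hi, v, w, gamma=4.0):
    """Membership of an interval granule [x_lo, x_hi]: taken here as the weaker of
    the two bound memberships (an assumption, not GrRFMN's exact activation)."""
    return min(point_membership(x_lo, v, w, gamma),
               point_membership(x_hi, v, w, gamma))

v, w = np.array([0.2, 0.2]), np.array([0.4, 0.4])   # one 2-D hyperbox
print(granule_membership(np.array([0.25, 0.25]), np.array([0.35, 0.5]), v, w))  # 0.8
```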
67
Quteishat A, Lim CP, Tweedale J, Jain LC. A neural network-based multi-agent classifier system. Neurocomputing 2009. [DOI: 10.1016/j.neucom.2008.08.012] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.3]
68
Bacciu D, Starita A. Competitive repetition suppression (CoRe) clustering: a biologically inspired learning model with application to robust clustering. IEEE Transactions on Neural Networks 2008; 19:1922-41. [PMID: 19000963] [DOI: 10.1109/tnn.2008.2004407] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.2]
Abstract
Determining a compact neural coding for a set of input stimuli is an issue that encompasses several biological memory mechanisms as well as various artificial neural network models. In particular, establishing the optimal network structure is still an open problem when dealing with unsupervised learning models. In this paper, we introduce a novel learning algorithm, named competitive repetition-suppression (CoRe) learning, inspired by a cortical memory mechanism called repetition suppression (RS). We show how such a mechanism is used, at various levels of the cerebral cortex, to generate compact neural representations of the visual stimuli. From the general CoRe learning model, we derive a clustering algorithm, named CoRe clustering, that can automatically estimate the unknown cluster number from the data without using a priori information concerning the input distribution. We illustrate how CoRe clustering, besides its biological plausibility, possesses strong theoretical properties in terms of robustness to noise and outliers, and we provide an error function describing CoRe learning dynamics. Such a description is used to analyze CoRe relationships with the state-of-the-art clustering models and to highlight CoRe similitude with rival penalized competitive learning (RPCL), showing how CoRe extends such a model by strengthening the rival penalization estimation by means of loss functions from robust statistics.
Affiliation(s)
- Davide Bacciu: IMT Lucca Institute for Advanced Studies, 55100 Lucca, Italy.
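The CoRe abstract in entry 68 positions the algorithm as an extension of rival penalized competitive learning (RPCL), in which the winning prototype is attracted toward each sample while the runner-up is pushed away, so surplus prototypes drift out of the data and the effective cluster number emerges automatically. The snippet below sketches plain RPCL under assumed learning rates and prototype count; it is not CoRe's loss-function-based penalization.

```python
import numpy as np

def rpcl(data, n_protos=6, lr_win=0.05, lr_rival=0.005, epochs=20, seed=0):
    """Rival penalized competitive learning: attract the winner, repel the rival.

    data: (n_samples, n_features) array; n_protos should exceed the expected
    cluster count so that surplus prototypes can be penalized away.
    """
    rng = np.random.default_rng(seed)
    protos = data[rng.choice(len(data), n_protos, replace=False)].astype(float)
    for _ in range(epochs):
        for x in data:
            d = np.linalg.norm(protos - x, axis=1)
            win, rival = np.argsort(d)[:2]
            protos[win] += lr_win * (x - protos[win])        # pull winner toward sample
            protos[rival] -= lr_rival * (x - protos[rival])  # push rival away (penalization)
    return protos
```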
70
Quteishat A, Lim CP. A modified fuzzy min–max neural network with rule extraction and its application to fault detection and classification. Appl Soft Comput 2008. [DOI: 10.1016/j.asoc.2007.07.013] [Citation(s) in RCA: 88] [Impact Index Per Article: 5.2]
71
Tan SC, Rao M, Lim CP. Fuzzy ARTMAP dynamic decay adjustment: An improved fuzzy ARTMAP model with a conflict resolving facility. Appl Soft Comput 2008. [DOI: 10.1016/j.asoc.2007.03.006] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.8]
72
Nandedkar A, Biswas P. Reflex Fuzzy Min Max Neural Network for Semi-supervised Learning. Journal of Intelligent Systems 2008. [DOI: 10.1515/jisys.2008.17.1-3.5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
75
Kaburlasos VG, Athanasiadis IN, Mitkas PA. Fuzzy lattice reasoning (FLR) classifier and its application for ambient ozone estimation. Int J Approx Reason 2007. [DOI: 10.1016/j.ijar.2006.08.001] [Citation(s) in RCA: 43] [Impact Index Per Article: 2.4]
76
Wang S, Chung KF, Deng Z, Hu D. Robust fuzzy clustering neural network based on ɛ-insensitive loss function. Appl Soft Comput 2007. [DOI: 10.1016/j.asoc.2006.04.008] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4]
79
Nandedkar AV, Biswas PK. A Fuzzy Min-Max Neural Network Classifier With Compensatory Neuron Architecture. IEEE Transactions on Neural Networks 2007; 18:42-54. [PMID: 17278460] [DOI: 10.1109/tnn.2006.882811] [Citation(s) in RCA: 76] [Impact Index Per Article: 4.2]
Abstract
This paper proposes a fuzzy min-max neural network classifier with compensatory neurons (FMCN). FMCN uses hyperbox fuzzy sets to represent the pattern classes. It is a supervised classification technique with a new compensatory neuron architecture. The concept of a compensatory neuron is inspired by the reflex system of the human brain, which takes over control in hazardous conditions. Compensatory neurons (CNs) imitate this behavior by becoming active whenever a test sample falls in the overlapped regions amongst different classes. These neurons are capable of handling hyperbox overlap and containment more efficiently. Simpson used a contraction process, based on the principle of minimal disturbance, to solve the problem of hyperbox overlaps; FMCN eliminates this process since it is found to be erroneous. FMCN is capable of learning the data online in a single pass through, with reduced classification and gradation errors. One of the good features of FMCN is that its performance is less dependent on the initialization of the expansion coefficient, i.e., the maximum hyperbox size. The paper demonstrates the performance of FMCN by comparing it with the fuzzy min-max neural network (FMNN) classifier and the general fuzzy min-max neural network (GFMN) classifier, using several examples.
Affiliation(s)
- Abhijeet V Nandedkar: Department of Electronics and Electrical Communication Engineering, Indian Institute of Technology, Kharagpur 721302, India.
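FMCN, per the abstract of entry 79, keeps the hyperbox representation but, instead of contracting overlapping hyperboxes, lets compensatory neurons take over when a test pattern falls in an overlap or containment region. The sketch below shows two supporting pieces under assumed parameters: a common expansion test against a maximum box size θ, and a simple check for whether two hyperboxes overlap (the region in which FMCN's compensatory neurons would become active). The compensatory neuron computations themselves are not reproduced here.

```python
import numpy as np

THETA = 0.3   # maximum hyperbox size per dimension (expansion coefficient), assumed value

def can_expand(x, v, w, theta=THETA):
    """Expansion test: the box enlarged to contain x must stay within theta per dimension."""
    return bool(np.all(np.maximum(w, x) - np.minimum(v, x) <= theta))

def expand(x, v, w):
    """Grow hyperbox [v, w] just enough to contain the training pattern x."""
    return np.minimum(v, x), np.maximum(w, x)

def boxes_overlap(v1, w1, v2, w2):
    """True if hyperboxes [v1, w1] and [v2, w2] intersect in every dimension.
    In FMCN this is the situation handled by compensatory neurons rather than contraction."""
    return bool(np.all(np.maximum(v1, v2) <= np.minimum(w1, w2)))

# Toy usage with two class hyperboxes in 2-D.
v1, w1 = np.array([0.1, 0.1]), np.array([0.4, 0.4])
v2, w2 = np.array([0.3, 0.3]), np.array([0.6, 0.6])
x = np.array([0.45, 0.2])
if can_expand(x, v1, w1):        # refused here: the enlarged box would exceed THETA in dim 0
    v1, w1 = expand(x, v1, w1)
print(boxes_overlap(v1, w1, v2, w2))   # True: the shared region is where CNs would fire
```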
81
Kim HJ, Lee J, Yang HS. A Weighted FMM Neural Network and Its Application to Face Detection. Neural Information Processing 2006. [DOI: 10.1007/11893257_20] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.3]
82
Xu R, Wunsch D. Survey of clustering algorithms. IEEE Transactions on Neural Networks 2005; 16:645-78.
Abstract
Data analysis plays an indispensable role for understanding various phenomena. Cluster analysis, primitive exploration with little or no prior knowledge, consists of research developed across a wide variety of communities. The diversity, on one hand, equips us with many tools. On the other hand, the profusion of options causes confusion. We survey clustering algorithms for data sets appearing in statistics, computer science, and machine learning, and illustrate their applications in some benchmark data sets, the traveling salesman problem, and bioinformatics, a new field attracting intensive efforts. Several tightly related topics, proximity measure, and cluster validation, are also discussed.
Affiliation(s)
- Rui Xu: Department of Electrical and Computer Engineering, University of Missouri-Rolla, Rolla, MO 65409, USA.
84
Gabrys B, Petrakieva L. Combining labelled and unlabelled data in the design of pattern classification systems. Int J Approx Reason 2004. [DOI: 10.1016/j.ijar.2003.08.005] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.8]
85
Bargiela A, Pedrycz W. Recursive information granulation: aggregation and interpretation issues. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 2003; 33:96-112. [DOI: 10.1109/tsmcb.2003.808190] [Citation(s) in RCA: 52] [Impact Index Per Article: 2.4]
86
Bargiela A, Pedrycz W. Recursive Information Granulation. Granular Computing 2003. [DOI: 10.1007/978-1-4615-1033-8_7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
87
From Numbers to Information Granules. Granular Computing 2003. [DOI: 10.1007/978-1-4615-1033-8_6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
88
Granular Prototyping in Fuzzy Clustering. Granular Computing 2003. [DOI: 10.1007/978-1-4615-1033-8_8] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
90
Neuro-fuzzy approach to processing inputs with missing values in pattern recognition problems. Int J Approx Reason 2002. [DOI: 10.1016/s0888-613x(02)00070-1] [Citation(s) in RCA: 47] [Impact Index Per Article: 2.0]
91
Pedrycz W, Bargiela A. Granular clustering: a granular signature of data. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 2002; 32:212-24. [DOI: 10.1109/3477.990878] [Citation(s) in RCA: 125] [Impact Index Per Article: 5.4]
92
Rizzi A, Panella M, Frattale Mascioli F. Adaptive resolution min-max classifiers. IEEE Transactions on Neural Networks 2002; 13:402-14. [DOI: 10.1109/72.991426] [Citation(s) in RCA: 84] [Impact Index Per Article: 3.7]