1. Huang W, Sun M, Zhu L, Oh SK, Pedrycz W. Deep Fuzzy Min-Max Neural Network: Analysis and Design. IEEE Transactions on Neural Networks and Learning Systems 2024; 35:8229-8240. [PMID: 37015551] [DOI: 10.1109/tnnls.2022.3226040]
Abstract
The fuzzy min-max neural network (FMNN) is a three-layer model based on hyperboxes that are constructed sequentially. Such a sequential mechanism inevitably leads to the input-order and overlap-region problems. In this study, we propose a deep FMNN (DFMNN) based on initialization and optimization operations to overcome these limitations. The initialization operation, which solves the input-order problem, designs hyperboxes simultaneously, and side parameters are introduced to control the size of the hyperboxes. The optimization operation, which eliminates the overlap-region problem, is realized by means of deep layers, where the number of layers is determined as soon as the overlap among hyperboxes has been eliminated. In the optimization process, each layer consists of three sections: the partition section, the combination section, and the union section. The partition section divides the hyperboxes into a nonoverlapping hyperbox set and an overlapping hyperbox set. The combination section eliminates the overlap within the overlapping hyperbox set. The union section obtains the optimized hyperbox set for the current layer. DFMNN is evaluated on a series of benchmark datasets, and a comparative analysis shows that it outperforms several models previously reported in the literature.
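To make the hyperbox machinery in this abstract concrete: a minimal sketch of the classic axis-aligned min-max hyperbox with a Simpson-style membership function and the pairwise overlap test that a partition step of this kind relies on. This is a generic illustration, not the paper's DFMNN implementation; the class name and the `gamma` sensitivity parameter are assumptions.

```python
import numpy as np

class Hyperbox:
    """Axis-aligned hyperbox defined by min (V) and max (W) corner points."""

    def __init__(self, v, w, label):
        self.v = np.asarray(v, dtype=float)  # min corner
        self.w = np.asarray(w, dtype=float)  # max corner
        self.label = label

    def membership(self, x, gamma=4.0):
        # Simpson-style membership: 1.0 inside the box, decaying with the
        # distance outside it along each dimension (gamma = sensitivity).
        x = np.asarray(x, dtype=float)
        beyond_max = np.clip(gamma * (x - self.w), 0.0, 1.0)
        beyond_min = np.clip(gamma * (self.v - x), 0.0, 1.0)
        return float(np.mean(1.0 - beyond_max - beyond_min))

    def overlaps(self, other):
        # Two boxes overlap iff their intervals intersect in every dimension.
        return bool(np.all(self.v <= other.w) and np.all(other.v <= self.w))
```

A partition section in the abstract's sense would apply `overlaps` pairwise to split the current boxes into non-overlapping and overlapping sets.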
2. Han H, Sun C, Wu X, Yang H, Qiao J. Self-Organizing Interval Type-2 Fuzzy Neural Network Using Information Aggregation Method. IEEE Transactions on Neural Networks and Learning Systems 2023; 34:6428-6442. [PMID: 34982701] [DOI: 10.1109/tnnls.2021.3136678]
Abstract
Interval type-2 fuzzy neural networks (IT2FNNs) usually stack many fuzzy rules to identify nonlinear systems with high-dimensional inputs, which may result in an explosion of fuzzy rules. To cope with this problem, this article develops a self-organizing IT2FNN based on an information aggregation method (IA-SOIT2FNN) that avoids the rule explosion. First, a relation-aware strategy is proposed to construct rotatable type-2 fuzzy rules (RT2FRs). This strategy uses a single RT2FR, instead of multiple standard fuzzy rules, to interpret interactive features of high-dimensional inputs. Second, a comprehensive information evaluation mechanism, associated with the interval and rotation information of each RT2FR, is developed to direct the structural adjustment of IA-SOIT2FNN; by growing and pruning RT2FRs, it keeps the structure compact. Third, a multicriteria-based optimization algorithm is designed to optimize the parameters of IA-SOIT2FNN. The algorithm simultaneously updates the rotatable and conventional parameters of each RT2FR while maintaining accuracy. Finally, experiments show that the proposed IA-SOIT2FNN is competitive with state-of-the-art approaches in terms of identification performance.
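For readers unfamiliar with the interval type-2 machinery this abstract refers to: a generic interval type-2 Gaussian membership function with an uncertain width, returning the lower and upper membership bounds that an IT2FNN rule operates on. This is a textbook construct, not the paper's rotatable RT2FR; the function and parameter names are illustrative.

```python
import math

def it2_gaussian_membership(x, center, sigma_inner, sigma_outer):
    """Interval type-2 Gaussian MF whose width is uncertain within
    [sigma_inner, sigma_outer]: returns (lower, upper) membership bounds."""
    mu_inner = math.exp(-0.5 * ((x - center) / sigma_inner) ** 2)
    mu_outer = math.exp(-0.5 * ((x - center) / sigma_outer) ** 2)
    # Away from the center the narrower Gaussian gives the smaller
    # membership; at the center both equal 1 and the interval collapses.
    return min(mu_inner, mu_outer), max(mu_inner, mu_outer)
```

An IT2FNN rule base aggregates such bounds across antecedents and then reduces the resulting interval to a crisp output (e.g., by Karnik-Mendel type reduction).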
3. Khuat TT, Gabrys B. Random Hyperboxes. IEEE Transactions on Neural Networks and Learning Systems 2023; 34:1008-1022. [PMID: 34424848] [DOI: 10.1109/tnnls.2021.3104896]
Abstract
This article proposes a simple yet powerful ensemble classifier, called Random Hyperboxes, constructed from individual hyperbox-based classifiers trained on random subsets of the sample and feature spaces of the training set. We also derive a generalization error bound for the proposed classifier based on the strength of the individual hyperbox-based classifiers and the correlation among them. The effectiveness of the classifier is analyzed on a carefully selected illustrative example and compared empirically with other popular single and ensemble classifiers on 20 datasets using statistical testing methods. The experimental results confirm that the proposed method outperforms other fuzzy min-max neural networks (FMNNs) and popular learning algorithms, and is competitive with other ensemble methods. Finally, we identify open issues related to generalization error bounds on real datasets and outline potential research directions.
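The sampling scheme described above, in which each base learner sees a random subset of both the samples and the features, can be sketched generically. In this sketch a 1-NN model stands in for the hyperbox-based base classifier, and all names and parameters are illustrative assumptions, not the authors' API.

```python
import numpy as np

def fit_ensemble(X, y, n_learners=15, seed=1):
    """Train base learners on random subsets of BOTH the sample space
    (bootstrap rows) and the feature space (random columns), in the
    spirit of Random Hyperboxes; 1-NN stands in for each base model."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    learners = []
    for _ in range(n_learners):
        rows = rng.choice(n, size=max(1, n // 2), replace=True)   # sample subset
        cols = rng.choice(d, size=max(1, d // 2), replace=False)  # feature subset
        learners.append((X[np.ix_(rows, cols)], y[rows], cols))
    return learners

def predict(learners, x):
    """Majority vote over the base learners' 1-NN predictions,
    each computed in that learner's own feature subspace."""
    votes = []
    for Xs, ys, cols in learners:
        nearest = np.argmin(np.linalg.norm(Xs - x[cols], axis=1))
        votes.append(ys[nearest])
    values, counts = np.unique(votes, return_counts=True)
    return int(values[np.argmax(counts)])
```

The vote aggregation is what lets the ensemble's error depend on base-learner strength and correlation, the two quantities the article's bound is expressed in.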
4. Kenger ÖN, Özceylan E. Fuzzy min–max neural networks: a bibliometric and social network analysis. Neural Comput Appl 2023. [DOI: 10.1007/s00521-023-08267-9]
5. A. SK, Kumar A, Bajaj V, Singh G. A compact fuzzy min max network with novel trimming strategy for pattern classification. Knowl Based Syst 2022. [DOI: 10.1016/j.knosys.2022.108620]
6. Liu S, Wang Z, Shen B, Wei G. Partial-neurons-based state estimation for delayed neural networks with state-dependent noises under redundant channels. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2020.08.047]
7. Khuat TT, Gabrys B. Accelerated learning algorithms of general fuzzy min-max neural network using a novel hyperbox selection rule. Inf Sci (N Y) 2021. [DOI: 10.1016/j.ins.2020.08.046]
8. Nonlinear systems modelling based on self-organizing fuzzy neural network with hierarchical pruning scheme. Appl Soft Comput 2020. [DOI: 10.1016/j.asoc.2020.106516]
9. Khuat TT, Ruta D, Gabrys B. Hyperbox-based machine learning algorithms: a comprehensive survey. Soft Comput 2020. [DOI: 10.1007/s00500-020-05226-7]
10. Chen CLP, Feng S. Generative and Discriminative Fuzzy Restricted Boltzmann Machine Learning for Text and Image Classification. IEEE Transactions on Cybernetics 2020; 50:2237-2248. [PMID: 30295638] [DOI: 10.1109/tcyb.2018.2869902]
Abstract
The restricted Boltzmann machine (RBM) is an excellent generative learning model for feature extraction. By extending its parameters from real numbers to fuzzy ones, we previously developed the fuzzy RBM (FRBM), which has been demonstrated to possess better generative capability than the RBM. In this paper, we first propose a generative model named the Gaussian FRBM (GFRBM) to deal with real-valued inputs. Then, motivated by the fact that the discriminative variant of the RBM provides a self-contained framework for classification with performance competitive with some traditional classifiers, we establish the discriminative FRBM (DFRBM) and the discriminative GFRBM (DGFRBM), which combine generative and discriminative capability by adding extra neurons next to the input units. Specifically, they can be trained as excellent stand-alone classifiers while simultaneously retaining outstanding generative capability. Experimental results on text and image classification (both clean and noisy) indicate that DFRBM and DGFRBM outperform discriminative RBM models in terms of reconstruction and classification accuracy, and that they behave more stably on noisy data. Moreover, the proposed learning models show promising advantages over other standard classifiers.
11. A comparative study of general fuzzy min-max neural networks for pattern classification problems. Neurocomputing 2020. [DOI: 10.1016/j.neucom.2019.12.090]
12. Liu J, Ma Y, Qu F, Zang D. Semi-supervised Fuzzy Min–Max Neural Network for Data Classification. Neural Process Lett 2019. [DOI: 10.1007/s11063-019-10142-5]
13. A modified neuro-fuzzy classifier and its parallel implementation on modern GPUs for real time intrusion detection. Appl Soft Comput 2019. [DOI: 10.1016/j.asoc.2019.105595]
14. The Combination of Fuzzy Min–Max Neural Network and Semi-supervised Learning in Solving Liver Disease Diagnosis Support Problem. Arabian Journal for Science and Engineering 2019. [DOI: 10.1007/s13369-018-3351-7]
15. Han H, Zhang L, Wu X, Qiao J. An Efficient Second-Order Algorithm for Self-Organizing Fuzzy Neural Networks. IEEE Transactions on Cybernetics 2019; 49:14-26. [PMID: 29990034] [DOI: 10.1109/tcyb.2017.2762521]
Abstract
Intelligent computing technologies are useful and important for online data modeling, where system dynamics may be nonstationary and uncertain. In this paper, an efficient learning mechanism is developed for building self-organizing fuzzy neural networks (SOFNNs): a second-order algorithm (SOA) with an adaptive learning rate is employed so that the network size and the parameters can be determined simultaneously during learning. First, all parameters of the SOFNN are adjusted using the SOA strategy to achieve fast convergence through a powerful search scheme. Second, the structure of the SOFNN is self-organized using a relative importance index of each rule. The fuzzy rules used in the SOFNN with SOA (SOA-SOFNN) are generated or pruned automatically to reduce computational complexity and potentially improve generalization. Finally, a theoretical analysis of the learning convergence of the proposed SOA-SOFNN is given to show its computational efficiency. To demonstrate the merits of the approach for data modeling, several benchmark datasets and a real-world application involving nonlinear systems modeling are examined, with comparisons against existing methods. The results indicate that the proposed SOA-SOFNN performs favorably in terms of both learning speed and prediction accuracy for online data modeling.
16. Tang HA, Duan S, Hu X, Wang L. Passivity and synchronization of coupled reaction–diffusion neural networks with multiple time-varying delays via impulsive control. Neurocomputing 2018. [DOI: 10.1016/j.neucom.2018.08.005]
17. Pourpanah F, Lim CP, Hao Q. A reinforced fuzzy ARTMAP model for data classification. Int J Mach Learn Cybern 2018. [DOI: 10.1007/s13042-018-0843-4]
18. Li L, Qiao Z, Liu Y, Chen Y. A convergent smoothing algorithm for training max–min fuzzy neural networks. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2017.04.046]
19. Shinde S, Kulkarni U. Extended fuzzy hyperline-segment neural network with classification rule extraction. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2017.03.036]
20. Liu J, Ma Y, Zhang H, Su H, Xiao G. A modified fuzzy min–max neural network for data clustering and its application on pipeline internal inspection data. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2017.01.036]
21. An enhanced fuzzy min–max neural network with ant colony optimization based-rule-extractor for decision making. Neurocomputing 2017. [DOI: 10.1016/j.neucom.2017.02.017]
22. Improving the Fuzzy Min-Max neural network with a K-nearest hyperbox expansion rule for pattern classification. Appl Soft Comput 2017. [DOI: 10.1016/j.asoc.2016.12.001]
23. A new hyperbox selection rule and a pruning strategy for the enhanced fuzzy min–max neural network. Neural Netw 2017; 86:69-79. [DOI: 10.1016/j.neunet.2016.10.012]
24.
25. Mirzamomen Z, Kangavari MR. Evolving Fuzzy Min–Max Neural Network Based Decision Trees for Data Stream Classification. Neural Process Lett 2016. [DOI: 10.1007/s11063-016-9528-8]