1. Hou Y, Wang Q, Zhou K, Zhang L, Tan T. Integrated machine learning methods with oversampling technique for regional suitability prediction of waste-to-energy incineration projects. Waste Management (New York, N.Y.) 2024;174:251-262. [PMID: 38070444] [DOI: 10.1016/j.wasman.2023.12.006] [Received: 09/16/2023] [Revised: 11/12/2023] [Accepted: 12/04/2023]
Abstract
China's tiered strategy to enhance county-level waste incineration for energy aligns with the Sustainable Development Goals (SDGs), emphasizing the need for comprehensive assessments of waste-to-energy (WtE) plant suitability. Traditional assessment methodologies face challenges, particularly in suggesting innovative site alternatives, adapting to new data sets, and their dependence on strict assumptions. This study introduces enhancements in three pivotal dimensions. Methodologically, it leverages data-driven machine learning (ML) approaches to capture the complex relationships essential for site selection, reducing dependency on strict assumptions. In terms of predictive performance, the integration of oversampling with stacked ensemble models enhances the diversity and generalizability of the ML models. The area under the curve (AUC) scores of four ML models trained on the oversampled dataset improved significantly compared to the original dataset. The stacking model excelled, achieving a score of 92%; it also led in overall precision and recall, reaching 85.2% and 85.08%, respectively. Nevertheless, a noticeable discrepancy existed between precision and recall for the positive class: the stacking model topped precision at 83.1%, followed by eXtreme Gradient Boosting (XGBoost) at 82.61%, while in recall XGBoost recorded the lowest value at 85.07% and the other three classifiers all marked 88.06%. From an industry-applicability standpoint, the stacking model proposes innovative location alternatives and demonstrates adaptability in Hunan province, offering a reusable tool for WtE siting. In conclusion, this study not only enhances the methodological aspects of WtE site selection but also provides practical and adaptable solutions, contributing positively to sustainable waste management practices.
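The oversampling-plus-stacking pipeline the abstract describes can be sketched as follows. This is an illustrative assumption, not the authors' configuration: the dataset is synthetic, plain duplication stands in for the paper's SMOTE-style oversampler, and the base learners are arbitrary choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Imbalanced toy data standing in for the WtE suitability dataset
X, y = make_classification(n_samples=600, weights=[0.85], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Naive random oversampling: duplicate minority rows until classes balance
# (the paper uses a SMOTE-style technique; duplication keeps this sketch simple)
rng = np.random.default_rng(0)
minority = np.flatnonzero(y_tr == 1)
extra = rng.choice(minority, size=(y_tr == 0).sum() - minority.size)
X_bal = np.vstack([X_tr, X_tr[extra]])
y_bal = np.concatenate([y_tr, y_tr[extra]])

# Stacked ensemble: base learners feed a logistic-regression meta-learner
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    final_estimator=LogisticRegression(max_iter=1000))
stack.fit(X_bal, y_bal)
auc = roc_auc_score(y_te, stack.predict_proba(X_te)[:, 1])
print(f"stacked-model AUC: {auc:.3f}")
```

The key point mirrored from the study is that the ensemble is trained on the balanced data but scored on the untouched holdout split.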
Affiliation(s)
- Yali Hou: College of Information Engineering, Nanjing Xiaozhuang University, Nanjing 211171, China
- Qunwei Wang: College of Economics and Management, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
- Kai Zhou: College of Information Engineering, Nanjing Xiaozhuang University, Nanjing 211171, China
- Ling Zhang: College of Economics and Management, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China; Research Centre for Soft Energy Science, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
- Tao Tan: College of Public Administration, Nanjing Agricultural University, Nanjing 210095, China
2. Rabie OBJ, Selvarajan S, Hasanin T, Alshareef AM, Yogesh CK, Uddin M. A novel IoT intrusion detection framework using Decisive Red Fox optimization and descriptive back propagated radial basis function models. Sci Rep 2024;14:386. [PMID: 38172185] [PMCID: PMC10764843] [DOI: 10.1038/s41598-024-51154-z] [Received: 06/03/2023] [Accepted: 01/01/2024]
Abstract
The Internet of Things (IoT) is extensively used in modern-day life, such as in smart homes and intelligent transportation. However, present security measures cannot fully protect the IoT due to its vulnerability to malicious assaults. As a security tool, intrusion detection can protect IoT devices from the most harmful attacks; nevertheless, the detection accuracy and time efficiency of conventional intrusion detection methods remain insufficient. The main contribution of this paper is to develop a simple yet intelligent security framework for protecting the IoT from cyber-attacks. For this purpose, a combination of Decisive Red Fox (DRF) optimization and Descriptive Back Propagated Radial Basis Function (DBRF) classification is developed in the proposed work. The novelty of this work is that a recently developed DRF optimization methodology, incorporated with a machine learning algorithm, is utilized to maximize the security level of IoT systems. First, data preprocessing and normalization operations are performed to generate a balanced IoT dataset and improve the detection accuracy of classification. Then, the DRF optimization algorithm is applied to optimally tune the features required for accurate intrusion detection and classification; it also increases the training speed and reduces the error rate of the classifier. Moreover, the DBRF classification model is deployed to categorize the normal and attacking data flows using the optimized features. The proposed DRF-DBRF security model's performance is validated and tested on five different, popular IoT benchmarking datasets. Finally, the results are compared with previous anomaly detection approaches using various evaluation parameters.
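The feature-tuning step can be pictured as a wrapper search over binary feature masks scored by classifier accuracy. The actual Decisive Red Fox update rules are not reproduced here; plain random search over masks stands in for them, and the dataset and classifier are assumptions for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy stand-in for an IoT traffic dataset: 20 features, 5 informative
X, y = make_classification(n_samples=400, n_features=20, n_informative=5,
                           random_state=1)
rng = np.random.default_rng(1)

def fitness(mask):
    """Score a feature subset by 3-fold CV accuracy of a simple classifier."""
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Random search over candidate masks, standing in for the DRF population
best_mask, best_fit = None, -1.0
for _ in range(30):
    mask = rng.random(X.shape[1]) < 0.5   # random binary feature mask
    f = fitness(mask)
    if f > best_fit:
        best_mask, best_fit = mask, f

print(f"selected {best_mask.sum()} features, CV accuracy {best_fit:.3f}")
```

Any population-based metaheuristic (DRF included) slots into the same wrapper: only the rule that proposes the next masks changes.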
Affiliation(s)
- Osama Bassam J Rabie: Department of Information Systems, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Kingdom of Saudi Arabia; Cybersecurity Center, King Abdulaziz University, Jeddah, Kingdom of Saudi Arabia
- Shitharth Selvarajan: School of Built Environment, Engineering and Computing, Leeds Beckett University, Leeds, LS1 3HE, UK; Department of Computer Science, Kebri Dehar University, Kebri Dehar, Ethiopia
- Tawfiq Hasanin: Department of Information Systems, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Kingdom of Saudi Arabia
- Abdulrhman M Alshareef: Department of Information Systems, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Kingdom of Saudi Arabia
- C K Yogesh: School of Computer Science and Engineering, VIT Chennai Campus, Chennai, India
- Mueen Uddin: College of Computing and IT, University of Doha for Science and Technology, 24449, Doha, Qatar
3. Caroline Misbha J, Ajith Bosco Raj T, Jiji G. Novel deep learning approach for DDoS attack using elephant heard optimization algorithm along with a fuzzy classifier for rules learning. Journal of Intelligent & Fuzzy Systems 2023. [DOI: 10.3233/jifs-224149] [Indexed: 03/12/2023]
Abstract
The research aims to provide network security so that networks can be protected from several attacks, especially DoS (Denial-of-Service) and DDoS (Distributed Denial-of-Service) attacks, which could at some point render a server inoperable. Security is one of the main obstacles, and many network risks and attacks exist today; one of the most common and disruptive is the DDoS attack. In this study, an upgraded deep learning Elephant Herd Optimization with a random forest classifier is employed for early DDoS attack detection. The proposed IDN-EHO method reduces the number of features of the DDoS dataset while classifying learning data at scale. In the feature extraction stage, a deep neural network (DNN) approach is used, and the classified data packages are compared to return the DDoS attack traffic characteristics with a significant percentage. In the classification stage, the proposed deep learning Elephant Herd Optimization with a random forest classifier is used to classify the learning data, handling a huge amount of data while minimizing the number of features of the DDoS dataset. During the detection step, with the extracted features as input, the attack detection model is trained using the improved deep learning Elephant Herd Optimization. According to the experiments, the proposed framework has the potential to be a promising method for identifying unidentified DDoS attacks: the findings show that 99% recall, precision, and accuracy can be attained using the suggested strategy.
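The random-forest detection stage can be sketched on synthetic flow features, reporting the recall, precision, and accuracy the abstract quotes. The feature names, value ranges, and data below are invented for illustration and are not the paper's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 500
# Toy flow features: packets/s, bytes/s, distinct source count.
# DDoS floods are modeled as high-rate, many-source traffic.
benign = rng.normal([100, 5e4, 10], [20, 1e4, 3], size=(n, 3))
attack = rng.normal([900, 4e5, 200], [100, 5e4, 40], size=(n, 3))
X = np.vstack([benign, attack])
y = np.r_[np.zeros(n), np.ones(n)]     # 0 = benign, 1 = DDoS

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=2)
clf = RandomForestClassifier(random_state=2).fit(X_tr, y_tr)
pred = clf.predict(X_te)

acc = accuracy_score(y_te, pred)
prec = precision_score(y_te, pred)
rec = recall_score(y_te, pred)
print(f"accuracy {acc:.3f}, precision {prec:.3f}, recall {rec:.3f}")
```

On well-separated synthetic flows all three metrics are near perfect; the paper's contribution is reaching similar numbers on real, noisy traffic after EHO-driven feature reduction.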
Affiliation(s)
- J. Caroline Misbha: Department of Computer Science and Engineering, Arunachala College of Engineering for Women, Nagercoil, Tamil Nadu, India
- T. Ajith Bosco Raj: Department of Electronics and Communication Engineering, PSN College of Engineering and Technology, Melathediyoor, Tirunelveli, Tamil Nadu, India
- G. Jiji: Department of Electronics and Communication Engineering, Lord Jegannath College of Engineering and Technology, Nagercoil, Tamil Nadu, India
4. Tariq A, Jiango Y, Li Q, Gao J, Lu L, Soufan W, Almutairi KF, Habib-ur-Rahman M. Modelling, mapping and monitoring of forest cover changes, using support vector machine, kernel logistic regression and naive bayes tree models with optical remote sensing data. Heliyon 2023;9:e13212. [PMID: 36785833] [PMCID: PMC9918775] [DOI: 10.1016/j.heliyon.2023.e13212] [Received: 04/09/2022] [Revised: 01/16/2023] [Accepted: 01/19/2023]
Abstract
The present study is designed to monitor spatio-temporal changes in forest cover from 1990 to 2017 using Remote Sensing (RS) and Geographic Information System (GIS) techniques. Landsat data from 1990 (Thematic Mapper [TM]), 2000 and 2010 (Enhanced Thematic Mapper [ETM+]), and 2013 to 2017 (Operational Land Imager/Thermal Infrared Sensor [OLI/TIRS]) were classified into the classes snow, water, barren land, built-up area, forest, and vegetation. The method was built using multitemporal Landsat images and the machine learning techniques Support Vector Machine (SVM), Naive Bayes Tree (NBT), and Kernel Logistic Regression (KLR). According to the results, forest area decreased from 19,360 km2 (26.0%) to 18,784 km2 (25.2%) between 1990 and 2010, while it increased from 18,640 km2 (25.0%) to 26,765 km2 (35.9%) between 2013 and 2017 due to the "One Billion Tree" project. According to our findings, SVM performed better than KLR and NBT on all three accuracy metrics (recall, precision, and accuracy), and the F1 score was >0.89. The study demonstrated that concurrent reforestation of barren land improved forest sustainability and brought RS and GIS into everyday forestry management practices in Khyber Pakhtunkhwa (KPK), Pakistan. The study results are beneficial, especially at the decision-making level for the local or provincial government of KPK, and for understanding the global scenario for regional planning.
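The per-pixel classification step reduces to fitting a multi-class SVM on band values and scoring it with the F1 metric the study reports. The band statistics and class set below are invented for illustration; real Landsat pixels would carry six or more bands and far noisier spectra.

```python
import numpy as np
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(3)
# Hypothetical mean reflectances for three bands per land-cover class
classes = {"water":  [0.05, 0.04, 0.02],
           "forest": [0.04, 0.50, 0.25],
           "barren": [0.30, 0.35, 0.40]}
X = np.vstack([rng.normal(mu, 0.03, size=(200, 3))
               for mu in classes.values()])
y = np.repeat(list(classes), 200)       # class label per synthetic pixel

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=3)
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
f1 = f1_score(y_te, svm.predict(X_te), average="macro")
print(f"macro F1: {f1:.3f}")
```

Swapping `SVC` for a naive Bayes or logistic-regression estimator reproduces the study's model comparison under the same split and metric.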
Affiliation(s)
- Aqil Tariq: State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan, 430079, Hubei, China
- Yan Jiango (corresponding author): State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan, 430079, Hubei, China
- Qingting Li: Airborne Remote Sensing Center, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
- Jianwei Gao: Institute of Spacecraft Application System Engineering, China Academy of Space Technology, Beijing, 100094, China
- Linlin Lu (corresponding author): Key Laboratory of Digital Earth Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
- Walid Soufan: Plant Production Department, College of Food and Agriculture Sciences, King Saud University, P.O. Box 2460, Riyadh 11451, Saudi Arabia
- Khalid F. Almutairi: Plant Production Department, College of Food and Agriculture Sciences, King Saud University, P.O. Box 2460, Riyadh 11451, Saudi Arabia
- Muhammad Habib-ur-Rahman: Institute of Crop Science and Resource Conservation (INRES), Crop Science, University of Bonn, 53115, Bonn, Germany
5. Negera WG, Schwenker F, Debelee TG, Melaku HM, Ayano YM. Review of Botnet Attack Detection in SDN-Enabled IoT Using Machine Learning. Sensors (Basel, Switzerland) 2022;22:9837. [PMID: 36560204] [PMCID: PMC9787631] [DOI: 10.3390/s22249837] [Received: 10/26/2022] [Revised: 12/07/2022] [Accepted: 12/10/2022]
Abstract
The orchestration of software-defined networks (SDN) and the Internet of Things (IoT) has revolutionized the computing fields, extending the broad spectrum of connectivity to sensors and electronic appliances beyond standard computing devices. However, these networks are still vulnerable to botnet attacks such as distributed denial of service, network probing, backdoors, information stealing, and phishing attacks. These attacks can disrupt and sometimes cause irreversible damage to several sectors of the economy. As a result, several machine learning-based solutions have been proposed to improve the real-time detection of botnet attacks in SDN-enabled IoT networks. The aim of this review is to investigate research studies that applied machine learning techniques for deterring botnet attacks in SDN-enabled IoT networks. First, the major botnet attacks in SDN-IoT networks are thoroughly discussed. Second, commonly used machine learning techniques for detecting and mitigating botnet attacks in SDN-IoT networks are discussed. Finally, the performance of these machine learning techniques in detecting and mitigating botnet attacks is presented in terms of commonly used model performance metrics. Both classical machine learning (ML) and deep learning (DL) techniques have comparable performance in botnet attack detection. However, the classical ML techniques require extensive feature engineering to achieve optimal features for efficient botnet attack detection. Besides, they fall short of detecting unforeseen botnet attacks. Furthermore, timely detection, real-time monitoring, and adaptability to new types of attacks remain challenging for classical ML techniques, mainly because they rely on signatures of already known malware both in training and after deployment.
Affiliation(s)
- Worku Gachena Negera: Addis Ababa Institute of Technology, Addis Ababa University, Addis Ababa 445, Ethiopia
- Taye Girma Debelee: Ethiopian Artificial Intelligence Institute, Addis Ababa 40782, Ethiopia; College of Electrical and Computer Engineering, Addis Ababa Science and Technology University, Addis Ababa 16417, Ethiopia
6. Bahaa A, Sayed A, Elfangary L, Fahmy H. A novel hybrid optimization enabled robust CNN algorithm for an IoT network intrusion detection approach. PLoS One 2022;17:e0278493. [PMID: 36454861] [PMCID: PMC9714761] [DOI: 10.1371/journal.pone.0278493] [Received: 10/14/2022] [Accepted: 11/16/2022]
Abstract
Due to the huge number of connected Internet of Things (IoT) devices within a network, denial-of-service and flooding attacks on networks are on the rise, disrupting IoT devices and denying them service. In this study, we proposed a novel hybrid meta-heuristic adaptive particle swarm optimization-whale optimizer algorithm (APSO-WOA) for optimizing the hyperparameters of a convolutional neural network (APSO-WOA-CNN). The APSO-WOA optimization algorithm's fitness value is defined as the validation set's cross-entropy loss during CNN model training. In this study, we compare our optimization algorithm with other optimization algorithms, such as the APSO algorithm, for CNN hyperparameter optimization. In model training, the APSO-WOA-CNN algorithm achieved the best performance compared to the FNN algorithm, which used manual parameter settings. We evaluated the APSO-WOA-CNN algorithm against APSO-CNN, SVM, and FNN. The simulation results suggest that APSO-WOA-CNN is effective and can reliably detect multi-type IoT network attacks. The results show that the APSO-WOA-CNN algorithm improves accuracy by 1.25%, average precision by 1%, the kappa coefficient by 11%, Hamming loss by 1.2%, and the Jaccard similarity coefficient by 2%, as compared to the APSO-CNN algorithm, while the APSO-CNN algorithm achieves the best performance among the remaining algorithms.
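The hyperparameter search can be illustrated with a bare-bones particle swarm optimizer. Training a CNN per particle is out of scope for a sketch, so a cheap surrogate objective stands in for the validation cross-entropy, and the two "hyperparameters" (a learning rate and a filter count) plus all swarm constants are assumptions for illustration.

```python
import numpy as np

def loss(p):
    """Surrogate for validation cross-entropy; minimum at lr=0.3, filters=64."""
    lr, filters = p
    return (lr - 0.3) ** 2 + ((filters - 64) / 64) ** 2

rng = np.random.default_rng(4)
n_particles, iters = 20, 60
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social

# Particles live in (learning rate, filter count) space
pos = rng.uniform([0, 8], [1, 128], size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.apply_along_axis(loss, 1, pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    # Classic PSO velocity update: pull toward personal and global bests
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.apply_along_axis(loss, 1, pos)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(f"best hyperparameters found: {gbest.round(3)}")
```

The paper's hybrid replaces this plain velocity update with an adaptive-PSO/whale-optimizer combination, but the fitness-evaluation loop is the same shape.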
Affiliation(s)
- Ahmed Bahaa: Faculty of Computers and Artificial Intelligence, Department of Information Systems, Helwan University, Helwan, Egypt; Faculty of Computers and Artificial Intelligence, Department of Information Systems, Beni-Suef University, Beni Suef, Egypt
- Abdalla Sayed: Faculty of Computers and Artificial Intelligence, Department of Information Systems, Helwan University, Helwan, Egypt
- Laila Elfangary: Faculty of Computers and Artificial Intelligence, Department of Information Systems, Helwan University, Helwan, Egypt
- Hanan Fahmy: Faculty of Computers and Artificial Intelligence, Department of Information Systems, Helwan University, Helwan, Egypt
7. Latif Z, Umer Q, Lee C, Sharif K, Li F, Biswas S. A Machine Learning-Based Anomaly Prediction Service for Software-Defined Networks. Sensors (Basel, Switzerland) 2022;22:8434. [PMID: 36366129] [PMCID: PMC9658740] [DOI: 10.3390/s22218434] [Received: 09/30/2022] [Revised: 10/28/2022] [Accepted: 10/29/2022]
Abstract
Software-defined networking (SDN) has gained tremendous growth and can be exploited in different network scenarios, from data centers to wide-area 5G networks. It shifts control logic from the devices to a centralized entity (a programmable controller) for efficient traffic monitoring and flow management. A software-based controller enforces rules and policies on the requests sent by forwarding elements; however, it cannot detect anomalous patterns in the network traffic. As a result, the controller may install flow rules against the anomalies, reducing overall network performance. These anomalies may indicate threats to the network and decrease its performance and security. Machine learning (ML) approaches can identify such traffic-flow patterns and predict a system's impending threats. In this work, we propose an ML-based service to predict traffic anomalies for software-defined networks. We first create a large network traffic dataset by modeling a programmable data center with a signature-based intrusion-detection system. Feature vectors are pre-processed and constructed for each flow request made by a forwarding element. Then, the feature vector of each request is fed to a machine learning classifier, which is trained to predict anomalies. Finally, we use the holdout cross-validation technique to evaluate the proposed approach. The evaluation results indicate that the proposed approach is highly accurate: relative to the baseline approaches (random prediction and zero rule), it improves average accuracy, precision, recall, and F-measure by (54.14%, 65.30%, 81.63%, and 73.70%) and (4.61%, 11.13%, 9.45%, and 10.29%), respectively.
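The evaluation protocol (holdout split, trained model versus a zero-rule baseline that always predicts the majority class) can be sketched as below. The dataset and the choice of gradient boosting as the classifier are placeholders, not the paper's setup.

```python
from sklearn.datasets import make_classification
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Imbalanced toy flow data: 70% benign (0), 30% anomalous (1)
X, y = make_classification(n_samples=800, weights=[0.7], random_state=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=5)

model = GradientBoostingClassifier(random_state=5).fit(X_tr, y_tr)
# Zero-rule baseline: always predict the most frequent training class
zero_rule = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)

f1_model = f1_score(y_te, model.predict(X_te))
f1_zero = f1_score(y_te, zero_rule.predict(X_te))
print(f"model F1 {f1_model:.3f} vs zero-rule F1 {f1_zero:.3f}")
```

Because the zero rule never predicts the anomaly class, its F1 on that class is zero, which is why it is a meaningful floor for imbalanced anomaly detection.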
Affiliation(s)
- Zohaib Latif: Department of Computer Science, Hanyang University, Seoul 04763, Korea
- Qasim Umer: Department of Computer Science, COMSATS University Islamabad, Vehari Campus, Vehari 61100, Pakistan
- Choonhwa Lee: Department of Computer Science, Hanyang University, Seoul 04763, Korea
- Kashif Sharif: School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China
- Fan Li: School of Computer Science and Technology, Beijing Institute of Technology, Beijing 100081, China
- Sujit Biswas: Computer Science and Digital Technologies Department, University of East London, London E16 2RD, UK
8. Asaithambi S, Ravi L, Kotb H, Milyani AH, Azhari AA, Nallusamy S, Varadarajan V, Vairavasundaram S. An Energy-Efficient and Blockchain-Integrated Software Defined Network for the Industrial Internet of Things. Sensors (Basel, Switzerland) 2022;22:7917. [PMID: 36298266] [PMCID: PMC9607010] [DOI: 10.3390/s22207917] [Received: 08/29/2022] [Revised: 10/11/2022] [Accepted: 10/14/2022]
Abstract
The number of unsecured and portable Internet of Things (IoT) devices in the smart industry is growing exponentially. A variety of centralized and distributed platforms have been implemented to defend against security attacks; however, these platforms are insecure because of their low storage capacities, high power utilization, single points of failure, underutilized resources, and high end-to-end delay. Blockchain and Software-Defined Networking (SDN) are growing technologies for creating a secure system and ensuring safe network connectivity. Blockchain technology offers a strong and trustworthy foundation for dealing with threats and problems, including safety, privacy, adaptability, scalability, and security. However, the integration of blockchain with SDN, which provides efficient resource allocation and reduced latency and can overcome the issues of industrial IoT networks, is still in the implementation phase. To address these challenges, we propose an energy-efficient, blockchain-integrated software-defined networking architecture for the Industrial IoT (IIoT). We present a framework for implementing decentralized blockchain integrated with SDN for IIoT applications to achieve efficient energy utilization and cluster-head selection. Additionally, the blockchain-enabled distributed ledger ensures data consistency throughout the SDN controller network and keeps a record of the nodes enforced in the controller. The simulation results show that the proposed model provides the best energy consumption, end-to-end latency, and overall throughput compared to existing works.
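One ingredient of the framework, energy-aware cluster-head selection, can be illustrated with a toy rule: within each cluster, elect the node with the highest residual energy as head. The node table and the selection criterion are assumptions for illustration; the paper's exact election criterion may weigh additional factors.

```python
# Hypothetical node table: (node_id, cluster_id, residual_energy_joules)
nodes = [("n1", 0, 4.2), ("n2", 0, 5.1), ("n3", 1, 3.3),
         ("n4", 1, 3.9), ("n5", 1, 2.8)]

def elect_heads(nodes):
    """Pick the highest-residual-energy node in each cluster as its head."""
    best = {}
    for nid, cluster, energy in nodes:
        if cluster not in best or energy > best[cluster][1]:
            best[cluster] = (nid, energy)
    return {cluster: nid for cluster, (nid, _) in best.items()}

heads = elect_heads(nodes)
print(heads)  # {0: 'n2', 1: 'n4'}
```

Rotating this election as nodes drain spreads the relay burden, which is the mechanism behind the energy savings the abstract reports.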
Affiliation(s)
- Sasikumar Asaithambi: Department of Electronics and Communication Engineering, Vel Tech Rangarajan Dr. Sagunthala R&D Institute of Science and Technology, Avadi, Chennai 600062, Tamil Nadu, India
- Logesh Ravi: SENSE, Vellore Institute of Technology, Chennai 600127, Tamil Nadu, India; Data Engineering Research Group (DERG–SENSE), Vellore Institute of Technology, Chennai 600127, Tamil Nadu, India
- Hossam Kotb: Department of Electrical Power and Machines, Faculty of Engineering, Alexandria University, Alexandria 21544, Egypt
- Ahmad H. Milyani: Department of Electrical and Computer Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia; Center of Research Excellence in Renewable Energy and Power Systems, King Abdulaziz University, Jeddah 21589, Saudi Arabia
- Senthilkumar Nallusamy: Department of Electronics and Communication Engineering, M. Kumarasamy College of Engineering, Karur 639113, Tamil Nadu, India
- Vijayakumar Varadarajan: School of Computer Science and Engineering, University of New South Wales, Sydney 2052, Australia; Ajeenkya DY Patil University, Pune 412105, Maharashtra, India; Swiss School of Business Management, SSBM, 1213 Geneva, Switzerland
9. Early Detection of Abnormal Attacks in Software-Defined Networking Using Machine Learning Approaches. Symmetry (Basel) 2022. [DOI: 10.3390/sym14061178] [Indexed: 02/04/2023]
Abstract
Recent developments have made software-defined networking (SDN) a popular technology for solving the inherent problems of conventional distributed networks. The key benefit of SDN is the decoupling of the control plane from the data plane, which makes the network more flexible and easier to manage. SDN is a new-generation network architecture; however, its configuration settings are centralized, making it vulnerable to hackers. Our study investigated the feasibility of applying artificial intelligence technology to detect abnormal attacks in an SDN environment based on the current network architecture; here, the concept of symmetry covers both the sustainability of SDN applications and the robust performance of machine learning (ML) models under various malicious attacks. In this study, we focus on the early detection of abnormal attacks in an SDN environment. On detection of malicious traffic in the SDN topology, an AI module in the topology detects and acts against the attack source through machine learning algorithms, making the network architecture more flexible. For multiple abnormal attacks, we propose a hierarchical multi-class (HMC) architecture to effectively address the imbalanced-dataset problem and improve the performance of minority classes. The experimental results show that the decision tree, random forest, bagging, AdaBoost, and deep learning models exhibit the best performance for distributed denial-of-service (DDoS) attacks. In addition, for the imbalanced-dataset problem of multiclass classification, our proposed HMC architecture performs better than previous single classifiers. We also simulated the SDN topology and verified the scenario. In summary, we concatenated the AI module to enhance the security and effectiveness of SDN networks in a practical manner.
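A hierarchical multi-class scheme of the kind proposed can be sketched in two stages: a binary detector first separates normal from attack traffic, then a second classifier labels the attack type only among flows flagged as attacks. The synthetic flows, class names, and choice of random forests at both stages are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)

def flows(center, label, n=150):
    """Generate n synthetic 4-feature flows around a class center."""
    return rng.normal(center, 0.5, size=(n, 4)), np.full(n, label)

Xn, yn = flows([0, 0, 0, 0], "normal", n=450)   # majority class
Xa, ya = flows([3, 0, 0, 0], "ddos")
Xb, yb = flows([0, 3, 0, 0], "probe")
Xc, yc = flows([0, 0, 3, 0], "backdoor")
X = np.vstack([Xn, Xa, Xb, Xc])
y = np.concatenate([yn, ya, yb, yc])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=6)

# Stage 1: normal vs. attack; Stage 2: attack type, trained on attacks only
stage1 = RandomForestClassifier(random_state=6).fit(X_tr, y_tr == "normal")
atk = y_tr != "normal"
stage2 = RandomForestClassifier(random_state=6).fit(X_tr[atk], y_tr[atk])

pred = np.where(stage1.predict(X_te), "normal", stage2.predict(X_te))
acc = (pred == y_te).mean()
print(f"hierarchical accuracy: {acc:.3f}")
```

Because stage 2 never sees the dominant normal class, the minority attack types compete on a balanced footing, which is the imbalance remedy the abstract describes.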
10. The Robustness of Detecting Known and Unknown DDoS Saturation Attacks in SDN via the Integration of Supervised and Semi-Supervised Classifiers. Future Internet 2022. [DOI: 10.3390/fi14060164] [Indexed: 01/25/2023]
Abstract
The design of existing machine-learning-based DoS detection systems in software-defined networking (SDN) suffers from two major problems. First, the proper time window for conducting network traffic analysis is unknown and has proven challenging to determine. Second, such systems are unable to detect unknown types of DoS saturation attacks, where an unknown saturation attack is one that is not represented in the training data. In this paper, we evaluate three supervised classifiers for detecting a family of DDoS flooding attacks (UDP, TCP-SYN, IP-Spoofing, TCP-SARFU, and ICMP) and their combinations using different time windows. This work extends the runner-up best paper 'Detecting Saturation Attacks in SDN via Machine Learning' published in the 2019 4th International Conference on Computing, Communications and Security (ICCCS). The results in this paper show that the trained supervised models fail to detect unknown saturation attacks, and their overall detection performance decreases as the time window of the network traffic increases. Moreover, we investigate the performance of four semi-supervised classifiers in detecting unknown flooding attacks. The results indicate that the semi-supervised classifiers outperform the supervised classifiers in the detection of unknown flooding attacks. Furthermore, to further increase the possibility of detecting both known and unknown flooding attacks, we propose an enhanced hybrid approach that combines supervised and semi-supervised classifiers. The results demonstrate that the hybrid approach outperforms individual supervised or semi-supervised classifiers in detecting known and unknown flooding DoS attacks in SDN.
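The hybrid idea can be illustrated in miniature: a supervised classifier covers known attacks, while a one-class detector fit only on benign traffic flags anything out-of-distribution, so an attack type absent from training can still be caught. All data, thresholds, and model choices below are synthetic assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(7)
benign = rng.normal(0, 1, size=(500, 3))
known_attack = rng.normal([4, 0, 0], 1, size=(200, 3))
unknown_attack = rng.normal([0, 0, 6], 1, size=(50, 3))  # unseen in training

# Supervised stage: benign (0) vs. the known attack (1)
X_train = np.vstack([benign, known_attack])
y_train = np.r_[np.zeros(500), np.ones(200)]
supervised = RandomForestClassifier(random_state=7).fit(X_train, y_train)

# Semi-supervised stage: one-class model fit on benign traffic only
semi = IsolationForest(random_state=7).fit(benign)

def hybrid_flag(X):
    """Flag as attack if either the supervised model or the anomaly
    detector (-1 = anomaly) raises an alarm."""
    return (supervised.predict(X) == 1) | (semi.predict(X) == -1)

known_rate = hybrid_flag(known_attack).mean()
unknown_rate = hybrid_flag(unknown_attack).mean()
print(f"known-attack detection {known_rate:.2f}, "
      f"unknown-attack detection {unknown_rate:.2f}")
```

The supervised model alone would miss the unknown attack, since it differs from benign traffic along a feature the training attacks never exercised; the one-class stage is what closes that gap.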