One of the most widespread diseases among women today is breast cancer, and early, accurate diagnosis is key to treatment and rehabilitation. Mammography alone carries some uncertainty in its detection rate, so machine learning techniques can be adopted to give physicians tools for effective, early detection and diagnosis; introducing Machine Learning (ML) into such tools can help increase the survival rate of patients with breast cancer. This research work evaluated six different ML techniques for breast cancer detection, namely Logistic Regression, Linear Discriminant Analysis, Decision Tree (DT), K-Nearest Neighbours (KNN), Naïve Bayes (NB), and Support Vector Machine (SVM), and recommended the model with the highest accuracy. The experiment was carried out in a Python environment; all of the techniques were validated on the Wisconsin Breast Cancer dataset and evaluated with accuracy, precision, and recall.
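As an illustration of the comparison described above, the sketch below evaluates the same six classifiers on the Wisconsin Breast Cancer dataset with scikit-learn; the 80/20 split, feature scaling, and default hyperparameters are assumptions rather than the paper's exact settings.

```python
# Minimal sketch (not the paper's exact pipeline): comparing the six
# classifiers on the Wisconsin Breast Cancer dataset with scikit-learn.
# The 80/20 split and default hyperparameters are assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score, recall_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

models = {
    "Logistic Regression": LogisticRegression(max_iter=5000),
    "LDA": LinearDiscriminantAnalysis(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "KNN": KNeighborsClassifier(),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(),
}

for name, model in models.items():
    # Scaling helps the distance- and margin-based models (KNN, SVM, LR).
    clf = make_pipeline(StandardScaler(), model)
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print(f"{name:20s} accuracy={accuracy_score(y_test, y_pred):.4f} "
          f"precision={precision_score(y_test, y_pred):.4f} "
          f"recall={recall_score(y_test, y_pred):.4f}")
```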
Journal of Information and Organizational Sciences
Machine learning has been useful for prediction across various sectors of the economy. This research work proposed an ensemble SA-CCT machine learning algorithm that gives farmers and agricultural extension officers in South-West Nigeria early and accurate prediction of blackpod disease. Because data mining takes the types of patterns in a given dataset into consideration, the study examined the patterns in a climatic dataset retrieved from the Nigerian Meteorological Agency (NIMET). The proposed model uses climatic parameters (rainfall and temperature) to predict outbreaks of blackpod disease. The ensemble SA-CCT model was formulated by hybridizing a linear algorithm, Seasonal Auto-Regressive Integrated Moving Average (SARIMA), with a nonlinear algorithm, the Compact Classification Tree (CCT); the implementation was done in Python. After evaluation, the proposed SA-CCT model gave a precision of 0.9429, a recall of 0.9167, a mean square error of 0.2357, and an accuracy of 0.9444.
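The abstract does not spell out how the linear and nonlinear components are combined, so the sketch below assumes one common hybrid scheme: SARIMA forecasts each climatic series, and a decision tree (standing in for the Compact Classification Tree, which is not a standard library component) classifies the forecast months as outbreak or no outbreak. The file name, column names, and monthly frequency are hypothetical.

```python
# Illustrative sketch only: SARIMA models the seasonal, linear behaviour of
# each climatic series, and a decision tree classifies outbreak/no-outbreak
# from the forecasts. All data details below are assumptions.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.tree import DecisionTreeClassifier

def sarima_forecast(series, steps, order=(1, 0, 1), seasonal=(1, 1, 1, 12)):
    """Fit a SARIMA model to one climatic series and forecast `steps` months."""
    model = SARIMAX(series, order=order, seasonal_order=seasonal,
                    enforce_stationarity=False, enforce_invertibility=False)
    fitted = model.fit(disp=False)
    return fitted.forecast(steps=steps)

# df: monthly climate records with an 'outbreak' label (1 = blackpod observed).
df = pd.read_csv("climate_blackpod.csv", parse_dates=["month"], index_col="month")

# Nonlinear stage: a classification tree learns outbreak vs. no-outbreak
# from the two climatic parameters used in the paper.
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(df[["rainfall", "temperature"]], df["outbreak"])

# Linear stage: forecast next year's rainfall and temperature with SARIMA,
# then pass the forecasts through the tree to flag likely outbreak months.
future = pd.DataFrame({
    "rainfall": sarima_forecast(df["rainfall"], steps=12),
    "temperature": sarima_forecast(df["temperature"], steps=12),
})
future["outbreak_predicted"] = tree.predict(future[["rainfall", "temperature"]])
print(future)
```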
Advances in Multidisciplinary & Scientific Research Journal Publication, 2021
The rate at which cyber threats are gaining access to companies, industries and other sectors of the economy is becoming alarming, and this poses a serious challenge to network administrators, governments and other business owners. A formidable intrusion detection system is needed to outplay the activities of cyberattackers. Since an ensemble system is believed to perform better than a single classifier, five different Machine Learning (ML) ensemble algorithms are suggested at the perception phase of the Situation Awareness (SA) model for threat detection: Artificial Neural Network based Decision Tree (ANN based DT), Bayesian Network based Artificial Neural Network (BN based ANN), J48 based Naïve Bayes (J48 based NB), Decision Tree based Bayesian Network (DT based BN) and Random Forest based Support Vector Machine (RF based SVM). The efficiency and effectiveness of all the aforementioned algorithms were evaluated based on precision, recall and accuracy.
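A minimal sketch of how such two-level ensembles can be built is shown below, using scikit-learn's StackingClassifier with stand-in estimators (an MLP for the ANN); the dataset file, its 'label' column, and the choice of two of the five pairings are assumptions, not the paper's implementation.

```python
# Minimal sketch, not the paper's implementation: one way to express the
# "X based Y" ensembles above as base learner -> meta learner stacks.
import pandas as pd
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

data = pd.read_csv("network_traffic_features.csv")       # hypothetical dataset
X, y = data.drop(columns=["label"]), data["label"]        # 1 = attack, 0 = normal
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=1, stratify=y)

# Two of the five pairings, each as a base learner feeding a meta learner.
ensembles = {
    "ANN based DT": StackingClassifier(
        estimators=[("ann", MLPClassifier(max_iter=500))],
        final_estimator=DecisionTreeClassifier()),
    "RF based SVM": StackingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=100))],
        final_estimator=SVC()),
}

for name, model in ensembles.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(name,
          "precision=%.4f" % precision_score(y_test, pred),
          "recall=%.4f" % recall_score(y_test, pred),
          "accuracy=%.4f" % accuracy_score(y_test, pred))
```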
Quite a number of scheduling algorithms have been implemented in the past, including First Come First Served (FCFS), Shortest Job First (SJF), Priority and Round Robin (RR). RR is often preferred because its quantum time shares the CPU impartially among processes. Despite this, choosing the quantum time remains a significant challenge: when the quantum is too large, RR degenerates into FCFS, and when it is too short, the number of context switches between processes increases. This paper therefore provides a descriptive review of algorithms implemented over the past 10 years that vary the quantum time in order to optimize CPU utilization. The review is intended to open further research areas, serve as a reference source and articulate the algorithms used in previous years, and as such to serve as a guide for future work. This research work further suggests...
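To make the quantum-time trade-off concrete, the small simulation below (illustrative only, not taken from any of the reviewed papers) runs Round Robin over made-up burst times and reports how the average waiting time and the number of context switches change with the quantum.

```python
# Illustrative simulation of the quantum-time trade-off: a very large quantum
# makes Round Robin behave like FCFS, while a very small quantum inflates the
# number of context switches. Burst times are made-up example values.
from collections import deque

def round_robin(burst_times, quantum):
    """Simulate RR for processes that all arrive at time 0.
    Returns (average waiting time, number of context switches)."""
    remaining = list(burst_times)
    finish = [0] * len(burst_times)
    queue = deque(range(len(burst_times)))
    time, switches = 0, 0
    while queue:
        i = queue.popleft()
        run = min(quantum, remaining[i])
        time += run
        remaining[i] -= run
        if remaining[i] > 0:
            queue.append(i)          # pre-empted: goes to the back of the queue
        else:
            finish[i] = time         # process completed
        if queue:
            switches += 1            # dispatcher switches to another process
    waiting = [finish[i] - burst_times[i] for i in range(len(burst_times))]
    return sum(waiting) / len(waiting), switches

bursts = [24, 3, 3, 17, 9]
for q in (1, 4, 10, 100):
    avg_wait, switches = round_robin(bursts, q)
    print(f"quantum={q:3d}  avg_waiting={avg_wait:6.2f}  context_switches={switches}")
```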
International Journal of Advanced Computer Science and Applications
With the increase in cyber threats, computer network security has become a pressing concern for many companies. To guard against these threats, a formidable Intrusion Detection System (IDS) is needed. Various Machine Learning (ML) algorithms, such as Artificial Neural Network (ANN), Decision Tree (DT), Support Vector Machine (SVM) and Naïve Bayes, have been used for threat detection. In light of novel threats, a combination of tools is needed to enhance intrusion detection in computer networks, because intruders are gaining ground in the cyber world and the cost to organizations is difficult to quantify. The aim of this work is to provide an enhanced model for detecting threats on a computer network. A combination of DT and ANN is proposed to predict threats accurately, giving a network administrator some measure of assurance based on the model's predictions. Two supervised machine learning algorithms were hybridized in this research, and the NSL-KDD dataset was used for the simulation process in the WEKA environment. The proposed model gave 0.984 precision, 0.982 sensitivity and 0.987 accuracy.
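The abstract does not describe the exact DT-ANN combination, and the model itself was built in WEKA, so the following is only a rough scikit-learn sketch of one plausible hybridization: the decision tree's attack-probability score is appended to the feature vector fed to the ANN. The NSL-KDD file name and its preprocessing are assumptions.

```python
# Rough sketch of one way to hybridize DT and ANN; not the paper's WEKA model.
# NSL-KDD must be obtained separately; 'KDDTrain+.csv' is assumed to hold
# numerically encoded features plus a string 'label' column.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

df = pd.read_csv("KDDTrain+.csv")                        # assumed pre-encoded NSL-KDD
X = df.drop(columns=["label"]).to_numpy(dtype=float)
y = (df["label"] != "normal").astype(int).to_numpy()     # 1 = attack, 0 = normal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=7, stratify=y)

# Stage 1: the decision tree produces an attack-probability score.
dt = DecisionTreeClassifier(max_depth=10, random_state=7).fit(X_tr, y_tr)
tr_score = dt.predict_proba(X_tr)[:, [1]]
te_score = dt.predict_proba(X_te)[:, [1]]

# Stage 2: the ANN (MLP) learns from the original features plus the DT score.
scaler = StandardScaler().fit(np.hstack([X_tr, tr_score]))
ann = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=7)
ann.fit(scaler.transform(np.hstack([X_tr, tr_score])), y_tr)

pred = ann.predict(scaler.transform(np.hstack([X_te, te_score])))
print("precision=%.3f recall=%.3f accuracy=%.3f" % (
    precision_score(y_te, pred), recall_score(y_te, pred),
    accuracy_score(y_te, pred)))
```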
International Journal of Advanced Computer Science and Applications, 2019
Background: This paper proposed a Simplified Improved Dynamic Round Robin (SIDRR) algorithm that further improves on existing refinements of the Round Robin CPU scheduling algorithm. Most of these refinements rely on the arithmetic mean to select the Time Quantum (TQ), and the arithmetic mean does not adequately represent the data. Aim: The aim of this study is to develop a simplified, dynamic, improved Round Robin CPU scheduling algorithm. Method: The study implemented five existing Round Robin scheduling algorithms in C++: New Improved Round Robin (NIRR), Dynamic Average Burst Round Robin (DABRR), Improved Round Robin with Varying Time Quantum (IRRVQ), Revamped Mean Round Robin (RMRR) and Efficient Dynamic Round Robin (EDRR). A new algorithm was also developed that determines the time quantum dynamically using a numeric outlier detection technique and the geometric mean. The proposed algorithm was compared with the five implemented algorithms on average turnaround time, average waiting time and number of context switches. Results: The proposed algorithm outperformed the other five algorithms in terms of average waiting time, average turnaround time and number of context switches. The study therefore recommends the adoption of SIDRR for CPU scheduling and for emerging areas such as cloud computing resource allocation.
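The abstract names numeric outlier detection and the geometric mean without giving the exact procedure, so the sketch below assumes the common 1.5*IQR outlier rule; it is written in Python for consistency with the other sketches on this page, whereas the paper's implementation was in C++.

```python
# Sketch of the general idea behind a SIDRR-style dynamic quantum: drop
# numeric outliers (assumed 1.5*IQR rule), then take the geometric mean of
# the remaining burst times. A single large burst no longer inflates the
# quantum the way it does with the arithmetic mean.
import statistics

def dynamic_quantum(burst_times):
    """Remove outliers with the 1.5*IQR rule, then return the geometric
    mean of the remaining burst times as the time quantum."""
    q = statistics.quantiles(burst_times, n=4, method="inclusive")
    q1, q3 = q[0], q[2]
    iqr = q3 - q1
    kept = [b for b in burst_times if q1 - 1.5 * iqr <= b <= q3 + 1.5 * iqr]
    return max(1, round(statistics.geometric_mean(kept)))

bursts = [5, 8, 12, 6, 90]        # 90 is an outlier that skews the average
print("arithmetic mean quantum:", round(statistics.mean(bursts)))   # 24
print("SIDRR-style quantum   :", dynamic_quantum(bursts))           # 7
```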