Hybridized-Harris-hawk-whale-optimization-algorithm

Uses the Harris Hawk and Whale nature-inspired algorithms to train the weights of a neural network. We propose an approach that adjusts the connection weights of a neural network using a hybrid of Harris Hawk Optimization (HHO) and the Whale Optimization Algorithm (WOA). The hybrid algorithm was applied successfully to train neural networks: no single algorithm dominated the others, but the proposed hybrid is a competitive alternative to other population-based metaheuristics. We used the HHO-WOA hybrid to train network weights for fraud-detection and cancer datasets; our anomaly-detection method is supervised and classification-based. The hybrid's performance is acceptable, with promising results that recommend it for other optimization applications such as scheduling.
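
As a rough illustration of what "training the weights with a metaheuristic" means here, the sketch below is our own minimal example, not the repository's code: the function names (`unpack`, `fitness`) and the single-hidden-layer shape are assumptions. A candidate solution is the flattened vector of all weights and biases, and its fitness is the classification error the optimizer tries to minimize.

```python
import numpy as np

def unpack(vector, n_in, n_hid, n_out):
    """Split a flat candidate vector into the MLP's weight matrices and bias vectors."""
    i = 0
    W1 = vector[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = vector[i:i + n_hid];                              i += n_hid
    W2 = vector[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = vector[i:i + n_out]
    return W1, b1, W2, b2

def fitness(vector, X, y, n_in, n_hid, n_out):
    """Classification error of the MLP encoded by `vector` (lower is better)."""
    W1, b1, W2, b2 = unpack(vector, n_in, n_hid, n_out)
    hidden = np.tanh(X @ W1 + b1)                  # hidden-layer activations
    scores = hidden @ W2 + b2                      # output-layer scores
    return np.mean(scores.argmax(axis=1) != y)     # fraction of samples misclassified
```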

Problem Statement: The aim of this project is to improve the effectiveness of neural-network training beyond standard backpropagation. Although many approaches have been proposed to optimize neural-network weights, there is always room for improvement. The classical approach of tuning weights with backpropagation has several drawbacks, so better techniques are needed to optimize the weights, the number of nodes per hidden layer, the number of hidden layers, and the network structure, all of which directly affect model accuracy. We tested several nature-inspired algorithms hybridized with neural networks, studying Particle Swarm Optimization, the Bat Algorithm, the Firefly Algorithm, Ant Colony Optimization, the Artificial Bee Colony algorithm, the Salp Swarm Algorithm, the Whale Optimization Algorithm, Harris Hawk Optimization, and others. We propose two new algorithms: 1) a hybrid of Harris Hawk Optimization with neural networks, and 2) a hybrid of Harris Hawk Optimization and Whale Optimization to train the weights of neural networks.

Overall Description of the Project: The learning process of artificial neural networks is considered one of the most difficult challenges facing researchers. The central difficulty in training neural networks is their nonlinear nature and the unknown controlling parameters such as weights and biases. The dominant shortcomings of traditional training algorithms are slow convergence and getting trapped in local optima. This report proposes a hybridization of the Harris Hawk Optimization algorithm with neural networks. Harris Hawk Optimization is a metaheuristic evolutionary algorithm, and we use it to optimize the weights of neural networks. We then also combine the exploration phase of Harris Hawk Optimization with the exploitation phase of the Whale Optimization Algorithm. In addition, we present a comparative study of different evolutionary algorithms hybridized with neural networks and compare them with the proposed algorithm. The quality of the proposed algorithm is assessed by exercising it on several fraud and cancer datasets and comparing the statistical results with those of rival optimization algorithms; the results indicate that the proposed algorithm performs better than the other evolutionary methods tested.

A multilayer feedforward network has many attributes that make it well suited to nonlinear optimization. Supervised training methods for multilayer perceptron networks fall into two major classes: gradient-based and stochastic. Backpropagation, the standard gradient-based technique, can train the weights of a neural network and obtain good results, but it has several drawbacks: it can get stuck in local optima and therefore give unreliable output, it depends heavily on the input data and can be quite sensitive to noise, and its convergence rate is slow. We therefore use metaheuristic algorithms, which try to maintain a balance between global and local search. They are popular among optimization techniques because they are easy to implement, are better at escaping local optima, and do not require gradient information.
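
The sketch below illustrates one plausible form of that hybrid loop; it is our own hedged example, not the repository's code. Each search agent is a flat weight vector (scored by a `fitness` function like the one sketched above), the Harris-hawk perching equations drive exploration while the prey's escaping energy satisfies |E| >= 1, and the whale encircling / spiral bubble-net equations drive exploitation once |E| < 1. The switching rule, parameter values, and function name `hho_woa_train` are all assumptions.

```python
import numpy as np

def hho_woa_train(fitness, dim, pop_size=30, iters=200, lb=-1.0, ub=1.0):
    """Minimize `fitness` over weight vectors of length `dim` (assumed hybrid scheme)."""
    rng = np.random.default_rng(0)
    X = rng.uniform(lb, ub, size=(pop_size, dim))      # agent positions = candidate weight vectors
    scores = np.array([fitness(x) for x in X])
    best_idx = scores.argmin()
    best, best_score = X[best_idx].copy(), scores[best_idx]

    for t in range(iters):
        a = 2.0 * (1.0 - t / iters)                    # WOA coefficient, decreases from 2 to 0
        for i in range(pop_size):
            E = 2.0 * rng.uniform(-1, 1) * (1.0 - t / iters)   # HHO escaping energy of the prey
            if abs(E) >= 1.0:
                # Exploration (Harris hawk): perch relative to a random hawk or the flock mean
                if rng.random() >= 0.5:
                    x_rand = X[rng.integers(pop_size)]
                    X[i] = x_rand - rng.random() * np.abs(x_rand - 2 * rng.random() * X[i])
                else:
                    X[i] = (best - X.mean(axis=0)) - rng.random() * (lb + rng.random() * (ub - lb))
            else:
                # Exploitation (whale): encircle the best solution or spiral toward it
                if rng.random() < 0.5:
                    A = 2 * a * rng.random() - a
                    X[i] = best - A * np.abs(2 * rng.random() * best - X[i])
                else:
                    l = rng.uniform(-1, 1)
                    X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            scores[i] = fitness(X[i])
            if scores[i] < best_score:
                best, best_score = X[i].copy(), scores[i]
    return best, best_score
```

The returned vector would then be unpacked into the network's weight matrices (as in the fitness sketch above) to obtain the trained classifier.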
