Here you can find the paper, together with the publicly available pages, files, and source-code options for this work.
Harris hawks optimization (HHO) is a recently developed swarm-based algorithm that mimics the cooperative hunting behavior of Harris hawks. Although the original HHO carries particular weight among popular methods for the local exploitation of feasible solutions, it may still fail to strike a good balance between locally accurate exploitation and globally exploratory search. This imbalance can manifest as slow convergence, inaccurate or inadequate search coverage, and premature convergence to local optima. To strengthen the performance of HHO, two strategies are introduced into the optimizer: Gaussian mutation and a dimension-decision strategy observed in the cuckoo search method. The cuckoo search mechanism is very useful for improving the convergence speed of the search agents and for sufficiently exploring solutions in the search area, while the Gaussian mutation strategy performs well at increasing accuracy and jumping out of local optima. To verify the performance of the proposed method, the enhanced paradigm is compared with other meta-heuristic algorithms on the 30 IEEE CEC2017 benchmark functions and three typical engineering problems. The experimental results illustrate that the newly developed GCHHO achieves superior performance in competition with the original HHO as well as other well-established optimizers. Supportive data and information for this research will be publicly provided at http://aliasgharheidari.com.
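The Gaussian mutation step described above can be sketched as follows. This is a minimal illustration in Python, not the authors' GCHHO code: the mutation scale `sigma`, the greedy-selection note, and the bound-clipping choice are assumptions for the sketch.

```python
import numpy as np

def gaussian_mutation(position, lb, ub, sigma=0.05, rng=None):
    """Perturb a candidate solution with zero-mean Gaussian noise.

    In a GCHHO-style scheme, the mutated copy would replace the original
    only if it improves fitness (greedy selection), which helps the
    search jump out of a local optimum.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Scale the noise by the width of the search range in each dimension.
    noise = rng.normal(0.0, sigma * (ub - lb), size=position.shape)
    mutated = position + noise
    # Keep the mutated solution inside the feasible bounds.
    return np.clip(mutated, lb, ub)

# Example: mutate a 5-dimensional solution in [-10, 10]^5.
x = np.zeros(5)
y = gaussian_mutation(x, lb=-10.0, ub=10.0)
```

A larger `sigma` widens the jumps (more exploration), while a small `sigma` keeps the mutation a fine local refinement.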
The population-based HHO is currently one of the most successful and popular optimization methods. HHO focuses on performance and provides a variety of search patterns based on random switching statements. It is a gradient-free optimization algorithm with several energetic, time-varying stages of exploration and exploitation. Unlike many earlier methods published in lower-impact venues, HHO was published in Future Generation Computer Systems (FGCS), a journal with an impact factor of 6.125 in 2019, and from the first day of publication it has received growing attention among researchers owing to its flexible structure, high performance, and first-rate results. The core logic of the HHO technique is built on a successful hunting pattern of Harris' hawks in nature called the "surprise pounce". Owing to the technique's efficacy, many variants of HHO have since appeared in leading Elsevier and IEEE Transactions journals.
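The time-varying switching between exploration and exploitation mentioned above is driven in HHO by the prey's escaping energy E = 2·E0·(1 − t/T), with E0 drawn uniformly from [−1, 1] at each iteration. A minimal sketch of that switch (the full besiege update rules of the hawks are omitted here):

```python
import numpy as np

def escaping_energy(t, T, rng=None):
    """Time-varying escaping energy E of the prey in HHO.

    E0 is a random initial energy in [-1, 1]; the factor (1 - t/T)
    shrinks |E| toward 0 over the run, so later iterations favor
    exploitation: |E| >= 1 triggers exploration, |E| < 1 exploitation.
    """
    rng = np.random.default_rng() if rng is None else rng
    E0 = 2.0 * rng.random() - 1.0        # initial energy in [-1, 1]
    return 2.0 * E0 * (1.0 - t / T)

# Example: observe the phase chosen at a few points of a 100-iteration run.
rng = np.random.default_rng(1)
for t in (0, 25, 50, 99):
    E = escaping_energy(t, T=100, rng=rng)
    phase = "exploration" if abs(E) >= 1.0 else "exploitation"
```

Because |E| can never exceed 2·(1 − t/T), the second half of the run (t ≥ T/2) is guaranteed to use exploitation-style moves, which matches the convergence behavior described above.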
Go to Webpage of Harris Hawks Optimization (HHO) for full info
Download MATLAB source codes of Harris Hawks Optimization (HHO)
Download JAVA source codes of Harris Hawks Optimization (HHO)
Download Python source codes of Harris Hawks Optimization (HHO)
Download LATEX source codes of Harris Hawks Optimization (HHO)
Download Visio source files of Harris Hawks Optimization (HHO)
The HHO algorithm is a high-performance, easy-to-code, and straightforward-to-understand optimizer, although it has some time-varying components. The original method was published in a top prestigious computer science journal, and in 2020 it became one of the most widely used optimization methods. Source codes for this method are available in almost all major programming languages, and both a LaTeX template and a Word file are provided for users' convenience. The method is backed by a 24-hour online service for responding to users' questions about the code.
How to cite?
Song, Shiming, Pengjun Wang, Ali Asghar Heidari, Mingjing Wang, Xuehua Zhao, Huiling Chen, Wenming He, and Suling Xu. "Dimension decided Harris hawks optimization with Gaussian mutation: Balance analysis and diversity patterns." Knowledge-Based Systems (2020): 106425. https://doi.org/10.1016/j.knosys.2020.106425
" The question of whether a computer can think is no more interesting than the question of whether a submarine can swim."
Edsger W. Dijkstra