Optimization of Benchmark Functions Using Local Search and Stochastic Hill Climbing
Five benchmark functions were optimized using the Local Search and Stochastic Hill Climbing algorithms. In Local Search, a neighborhood was generated within a given radius and the best neighbor was selected at each step.
Local Search (LS):
1- Generate an initial solution.
2- In each iteration,
- Generate a set of neighboring solutions based on the current solution using a predefined radius size.
- Select the best solution from the generated neighbors.
- Compare the selected neighbor with the current best solution.
- If the selected neighbor is superior to the current solution, update the current solution and proceed to the next iteration; otherwise, terminate the algorithm.
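The steps above can be sketched in Python as follows. This is an illustrative sketch, not the repository's code: the function name, parameter defaults, and the clipping of neighbors to the domain are assumptions.

```python
import random

# Illustrative Local Search sketch: best-of-neighborhood selection,
# terminate as soon as no neighbor improves the current solution.
def local_search(objective, bounds, n_neighbors=1000, radius=0.1, max_iter=1000):
    current = [random.uniform(lo, hi) for lo, hi in bounds]
    current_val = objective(current)
    for _ in range(max_iter):
        # Generate neighbors within the radius, clipped to the search domain.
        neighbors = [
            [min(max(x + random.uniform(-radius, radius), lo), hi)
             for x, (lo, hi) in zip(current, bounds)]
            for _ in range(n_neighbors)
        ]
        best = min(neighbors, key=objective)
        best_val = objective(best)
        if best_val < current_val:      # improvement: accept and continue
            current, current_val = best, best_val
        else:                           # no improvement: terminate
            break
    return current, current_val

# Example on the sphere (1st De Jong) function over (-5.12, 5.12)
random.seed(3)
sphere = lambda v: sum(x * x for x in v)
sol, val = local_search(sphere, [(-5.12, 5.12)] * 2)
```

Clipping neighbors back into the domain is one way to keep candidates feasible; resampling out-of-bounds points would work equally well.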
Stochastic Hill Climbing (SHC):
1- Generate an initial solution.
2- In each iteration,
- Generate a set of neighboring solutions based on the current solution using a predefined radius size.
- Randomly select one of the neighboring solutions.
- Compare the selected neighbor with the current best solution.
- If the selected neighbor is superior to the current solution, update the current solution. In either case, continue to the next iteration until the maximum number of iterations is reached.
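A matching sketch of Stochastic Hill Climbing, again illustrative rather than the repository's code, with the same assumed parameters:

```python
import random

# Illustrative Stochastic Hill Climbing sketch: pick one random neighbor
# per iteration, accept only improvements, always run for max_iter iterations.
def stochastic_hill_climbing(objective, bounds, n_neighbors=1000,
                             radius=0.1, max_iter=1000):
    current = [random.uniform(lo, hi) for lo, hi in bounds]
    current_val = objective(current)
    for _ in range(max_iter):
        neighbors = [
            [min(max(x + random.uniform(-radius, radius), lo), hi)
             for x, (lo, hi) in zip(current, bounds)]
            for _ in range(n_neighbors)
        ]
        candidate = random.choice(neighbors)    # random pick instead of the best
        candidate_val = objective(candidate)
        if candidate_val < current_val:         # keep only improving moves
            current, current_val = candidate, candidate_val
    return current, current_val

# Example on the sphere (1st De Jong) function over (-5.12, 5.12)
random.seed(7)
sphere = lambda v: sum(x * x for x in v)
sol, val = stochastic_hill_climbing(sphere, [(-5.12, 5.12)] * 2)
```

Generating a full neighborhood only to pick one member at random is equivalent to sampling a single random neighbor; the sketch keeps the neighborhood step solely to mirror the procedure as described.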
Neighborhood count: 1000
Radius scale size: 0.1
Domain: (-5.12, 5.12)
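For reference, the five benchmarks named in the results have standard closed forms. The definitions below are the common textbook versions; the repository's exact variants (e.g. a noisy 4th De Jong) may differ.

```python
import math

def de_jong_1(v):    # sphere function: global minimum 0 at the origin
    return sum(x * x for x in v)

def rosenbrock(v):   # 2nd De Jong (Rosenbrock's saddle): global minimum 0 at (1, ..., 1)
    return sum(100 * (v[i + 1] - v[i] ** 2) ** 2 + (1 - v[i]) ** 2
               for i in range(len(v) - 1))

def de_jong_4(v):    # 4th De Jong (noise-free form): global minimum 0 at the origin
    return sum((i + 1) * x ** 4 for i, x in enumerate(v))

def rastrigin(v):    # highly multimodal: global minimum 0 at the origin
    return 10 * len(v) + sum(x * x - 10 * math.cos(2 * math.pi * x) for x in v)

def griewangk(v):    # multimodal: global minimum 0 at the origin
    s = sum(x * x for x in v) / 4000
    p = math.prod(math.cos(x / math.sqrt(i)) for i, x in enumerate(v, start=1))
    return 1 + s - p
```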
Table 1 – Functions and Global Minimum Values
Local Search Results
– 1st De Jong Function
– 2nd De Jong Rosenbrock’s Saddle Function
– Rastrigin’s Function
– Griewangk’s Function
Stochastic Hill Climbing Results
– 2nd De Jong Rosenbrock’s Saddle Function
– 4th De Jong Function
– Rastrigin’s Function
– Griewangk’s Function
Comparison Tables
Local Search (LS) demonstrated high computational efficiency, whereas Stochastic Hill Climbing (SHC) generally yielded better results. Both methods are variants of local search; however, their neighborhood selection strategies differ. SHC selects a neighboring solution randomly, while LS always chooses the best available option. Additionally, LS terminates if no improvement is found, whereas SHC continues until the maximum number of iterations is reached.
Local search (LS) also employs a greedy approach in this context, as it aims to improve the current solution at each step by selecting the best neighboring solution. This approach often leads the algorithm to become stuck in a local optimum, as it consistently chooses the most favorable option available without considering alternative paths.
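A minimal one-dimensional illustration of that trap, assuming the same radius-based neighborhood: a greedy descent on Rastrigin's function started at x = 2 settles into the nearby local minimum rather than the global minimum at 0.

```python
import math
import random

def rastrigin_1d(x):
    return 10 + x * x - 10 * math.cos(2 * math.pi * x)

random.seed(0)
x, fx = 2.0, rastrigin_1d(2.0)   # start inside the basin of the local minimum near x = 2
while True:
    # Greedy step: take the best of 100 neighbors within radius 0.1.
    neighbors = [x + random.uniform(-0.1, 0.1) for _ in range(100)]
    best = min(neighbors, key=rastrigin_1d)
    if rastrigin_1d(best) < fx:
        x, fx = best, rastrigin_1d(best)
    else:
        break   # no improving neighbor: greedy search stops here
# x is stuck near the local minimum at x ~ 1.99 (f ~ 3.98); the global minimum is f(0) = 0
```

With a neighborhood radius of 0.1, no neighbor can cross the barrier around x ≈ 1.5, so the greedy search cannot escape; SHC with enough iterations faces the same barrier, which is why both methods remain local search variants.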
Code: https://github.com/gokhanergen-tech/python-local-search-stochastic-hill-climbing
References
- https://algorithmafternoon.com/stochastic/stochastic_hill_climbing/
- https://machinelearningmastery.com/stochastic-hill-climbing-in-python-from-scratch/
- https://www.geeksforgeeks.org/introduction-hill-climbing-artificial-intelligence/
Original article: Local Search vs Stochastic Hill Climbing: Python Implementation, Testing, and Comparative Analysis