Local Search vs Stochastic Hill Climbing: Python Implementation, Testing, and Comparative Analysis

Optimization of Benchmark Functions Using Local Search and Stochastic Hill Climbing

This post covers the optimization of five benchmark functions using the Local Search and Stochastic Hill Climbing algorithms.

In Local Search, a neighborhood is generated around the current solution using a radius size, and the best neighbor is selected.

Local Search (LS):

1- Generate an initial solution.

2- In each iteration,

  • Generate a set of neighboring solutions based on the current solution using a predefined radius size.
  • Select the best solution from the generated neighbors.
  • Compare the selected neighbor with the current solution.
  • If the selected neighbor is superior, update the current solution and proceed to the next iteration; otherwise, terminate the algorithm (see the sketch after this list).
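A minimal Python sketch of this procedure is shown below. It is an illustration rather than the repository implementation: the two-dimensional default, the uniform box perturbation used to sample neighbors, the treatment of the radius as an absolute step bound, and the assumption of minimization are all choices made here for the example.

import numpy as np

def local_search(objective, bounds, dim=2, n_neighbors=1000,
                 radius=0.1, max_iter=1000, rng=None):
    """Greedy local search: move to the best neighbor, stop when nothing improves."""
    rng = np.random.default_rng() if rng is None else rng
    low, high = bounds
    current = rng.uniform(low, high, dim)              # 1. initial solution
    current_val = objective(current)
    for _ in range(max_iter):
        # 2a. sample neighbors uniformly inside a box of the given radius
        neighbors = np.clip(current + rng.uniform(-radius, radius, (n_neighbors, dim)),
                            low, high)
        values = np.array([objective(n) for n in neighbors])
        best = np.argmin(values)                       # 2b. best neighbor
        if values[best] < current_val:                 # 2c/2d. accept improvements only
            current, current_val = neighbors[best], values[best]
        else:
            return current, current_val                # no improvement: terminate
    return current, current_val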

Stochastic Hill Climbing (SHC):

1- Generate an initial solution.

2- In each iteration,

  • Generate a set of neighboring solutions based on the current solution using a predefined radius size.
  • Randomly select one of the neighboring solutions.
  • Compare the selected neighbor with the current solution.
  • If the selected neighbor is superior, update the current solution; in either case, continue to the next iteration until the maximum number of iterations is reached (see the sketch after this list).
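Under the same assumptions, a sketch of this variant could look as follows; the only differences from the Local Search sketch are the random neighbor selection and the fact that the loop always runs for the full iteration budget.

import numpy as np

def stochastic_hill_climbing(objective, bounds, dim=2, n_neighbors=1000,
                             radius=0.1, max_iter=1000, rng=None):
    """Stochastic hill climbing: evaluate one randomly chosen neighbor per iteration."""
    rng = np.random.default_rng() if rng is None else rng
    low, high = bounds
    current = rng.uniform(low, high, dim)                  # 1. initial solution
    current_val = objective(current)
    for _ in range(max_iter):
        # 2a. candidate neighborhood around the current point
        neighbors = np.clip(current + rng.uniform(-radius, radius, (n_neighbors, dim)),
                            low, high)
        candidate = neighbors[rng.integers(n_neighbors)]   # 2b. pick one neighbor at random
        candidate_val = objective(candidate)
        if candidate_val < current_val:                    # 2c/2d. keep it only if it improves
            current, current_val = candidate, candidate_val
    return current, current_val                            # runs until max_iter is reached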

Neighborhood count: 1000

Radius scale size: 0.1

Domain: (-5.12, 5.12)

Table 1 – Functions and Global Minimum Values

De Jong 1st Function

2nd De Jong Rosenbrock’s Saddle Function

4th De Jong Function

Rastrigin’s Function

Griewangk’s Function
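For reference, the sketch below lists the standard textbook definitions of these five benchmarks; the author's exact formulations (for instance, whether De Jong's 4th function includes its usual Gaussian-noise term) may differ.

import numpy as np

def de_jong_1(x):                      # sphere: sum(x_i^2), global minimum 0 at x = 0
    return np.sum(x ** 2)

def rosenbrock(x):                     # 2nd De Jong (Rosenbrock's saddle), minimum 0 at x = 1
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def de_jong_4(x):                      # 4th De Jong: sum(i * x_i^4), noise-free variant
    i = np.arange(1, x.size + 1)
    return np.sum(i * x ** 4)

def rastrigin(x):                      # highly multimodal, minimum 0 at x = 0
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

def griewank(x):                       # Griewangk, minimum 0 at x = 0
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))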

Local Search Results

– De Jong 1st Function

– 2nd De Jong Rosenbrock’s Saddle Function

– 4th De Jong Function

– Rastrigin’s Function

– Griewangk’s Function

Stochastic Hill Climbing Results

– De Jong 1st Function

– 2nd De Jong Rosenbrock’s Saddle Function

– 4th De Jong Function

– Rastrigin’s Function

– Griewangk’s Function

Comparison Tables

Local Search (LS) demonstrated high computational efficiency, whereas Stochastic Hill Climbing (SHC) generally yielded better results. Both methods are variants of local search; however, their neighborhood selection strategies differ. SHC selects a neighboring solution randomly, while LS always chooses the best available option. Additionally, LS terminates if no improvement is found, whereas SHC continues until the maximum number of iterations is reached.

Local Search (LS) also employs a greedy approach in this context: it tries to improve the current solution at each step by selecting the best neighboring solution. This often causes the algorithm to become stuck in a local optimum, because it consistently takes the most favorable option available without exploring alternative paths.
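As an illustration, a hypothetical head-to-head run of the two sketches above on Rastrigin's function, with the parameter values used in the experiments, might look like this; the seed and the two-dimensional setting are arbitrary and do not reproduce the author's tables.

import numpy as np

# Hypothetical driver: relies on the local_search, stochastic_hill_climbing and
# rastrigin sketches defined earlier in this post.
rng = np.random.default_rng(42)        # arbitrary seed
bounds = (-5.12, 5.12)

ls_sol, ls_val = local_search(rastrigin, bounds, dim=2,
                              n_neighbors=1000, radius=0.1, rng=rng)
shc_sol, shc_val = stochastic_hill_climbing(rastrigin, bounds, dim=2,
                                            n_neighbors=1000, radius=0.1,
                                            max_iter=1000, rng=rng)

print(f"LS : f = {ls_val:.6f} at {ls_sol}")
print(f"SHC: f = {shc_val:.6f} at {shc_sol}")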

Code repository: https://github.com/gokhanergen-tech/python-local-search-stochastic-hill-climbing
