Abstract:
To address the shortcomings of the seagull optimization algorithm (SOA), namely its slow convergence and tendency to become trapped in local optima, we propose three optimization strategies. First, we improve the nonlinear convergence factor and the spiral coefficient to better coordinate global and local search and to accelerate convergence. Second, by expanding the attack behavior and attack angle, we enhance local search performance through parallel search. Third, we introduce dynamic reverse learning to escape local optima and improve the global search process. We analyze the convergence of the improved SOA (ISOA) based on a Markov process. Additionally, we evaluate the optimization performance of ISOA on 16 benchmark functions and apply it to proportional-integral-derivative (PID) parameter tuning. The results indicate that the proposed strategies remarkably improve the convergence speed and solution precision of SOA and that ISOA is effective for parameter optimization.
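To illustrate the third strategy mentioned above, the following is a minimal sketch of generic dynamic reverse (opposition-based) learning for a population-based optimizer. The function names, the random weight `k`, and the greedy selection step are illustrative assumptions, not the paper's exact ISOA formulation.

```python
import random

def opposite_solution(x, lb, ub, k=None):
    """Map each coordinate x_i in [lb_i, ub_i] to a randomly weighted
    opposite point k*(lb_i + ub_i) - x_i.

    Assumption: 'dynamic' is modeled here by drawing a fresh random
    weight k on each call; the paper may define it differently.
    """
    if k is None:
        k = random.random()
    return [k * (lo + hi) - xi for xi, lo, hi in zip(x, lb, ub)]

def apply_reverse_learning(pop, fitness, lb, ub):
    """For each solution, keep whichever of (solution, its opposite)
    has the better (lower) fitness, clipping opposites into bounds."""
    out = []
    for x in pop:
        xo = opposite_solution(x, lb, ub)
        xo = [min(max(v, lo), hi) for v, lo, hi in zip(xo, lb, ub)]
        out.append(min((x, xo), key=fitness))
    return out
```

Because the opposite point is accepted only when it improves fitness, this step can never degrade the population, which is why opposition-based learning is a common add-on for escaping local optima.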