Abstract:
Most existing optimization algorithms based on the opposition-based learning strategy do not fully consider fitness values when applying opposition to the initial population, and the problem of slow convergence remains. To address these issues, a fast-convergent natural computation method based on a cosine similarity opposition strategy is proposed. By calculating the cosine similarity between each particle and the particle at the regional center, the population is divided into similar and nonsimilar subgroups. The nonsimilar subgroups undergo weighted opposition according to their degree of similarity, which accelerates convergence. A Cauchy disturbance is then introduced to improve population diversity. The strategy is applied to three different natural computing methods, and 12 classical test functions are used to analyze convergence and verify performance. Finally, nonparametric tests are performed on the experimental data. Experiments show that the method performs well on most test functions and has good universality and effectiveness.
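As a rough illustration of the strategy summarized above, the Python sketch below partitions a population by cosine similarity to the regional-center particle, applies similarity-weighted opposition to the nonsimilar subgroup, and adds a Cauchy disturbance. The center definition, similarity threshold, weighting rule, and Cauchy scale used here are assumptions for illustration only; the paper specifies the actual formulas.

```python
# Minimal sketch of a cosine-similarity opposition step (threshold, weighting
# rule, and Cauchy scale are illustrative assumptions, not the paper's values).
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two position vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def opposition_step(pop, lower, upper, sim_threshold=0.5, cauchy_scale=0.1, rng=None):
    """Split the population by cosine similarity to the regional-center particle,
    apply similarity-weighted opposition to the nonsimilar subgroup, and add a
    Cauchy disturbance to maintain diversity."""
    rng = np.random.default_rng() if rng is None else rng
    center = pop.mean(axis=0)               # regional-center particle (assumed: population mean)
    new_pop = pop.copy()
    for i, x in enumerate(pop):
        s = cosine_similarity(x, center)
        if s < sim_threshold:                # nonsimilar subgroup
            opposite = lower + upper - x     # classical opposite point
            w = 1.0 - max(s, 0.0)            # weight by degree of dissimilarity (assumption)
            new_pop[i] = x + w * (opposite - x)
        # Cauchy disturbance to improve population diversity
        new_pop[i] += cauchy_scale * rng.standard_cauchy(size=x.shape)
        new_pop[i] = np.clip(new_pop[i], lower, upper)
    return new_pop

# Example: one opposition step on a small random population in [-5, 5]^3
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(10, 3))
    print(opposition_step(pop, lower=-5.0, upper=5.0, rng=rng))
```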