Deep Learning Optimization Algorithm Based on High Order Exponential Smoothing Dynamic Boundary Constraint
Abstract
To address the non-convergence problem of adaptive gradient algorithms, we propose an improved Adam algorithm. By introducing additional hyperparameters and applying exponential smoothing multiple times, it compensates for the shortcomings of a single exponential smoothing pass. In addition, we modify the second-order momentum calculation to prevent harmful fluctuations in the second-order momentum estimate, thereby smoothing out unexpected spikes in the learning rate. We conduct comparative experiments on the CIFAR-10 and CIFAR-100 datasets using the ResNet and DenseNet models. The experiments show that the proposed algorithm generalizes across different model structures and datasets. Compared with the Adam algorithm, its accuracy is improved by 1.3% on average, and it provides an effective solution to the convergence problem.
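The mechanism the abstract describes can be illustrated with a minimal sketch: the first-order momentum is exponentially smoothed twice (rather than once, as in Adam), and the second-order momentum is kept non-decreasing (an AMSGrad-style clamp) so the effective learning rate cannot spike. The function name, hyperparameters, and recurrences below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def adam_hos_step(theta, grad, state, lr=1e-3,
                  beta1=0.9, beta1b=0.9, beta2=0.999, eps=1e-8):
    """One step of a hypothetical Adam variant with double exponential
    smoothing of the gradient and a clamped second-order momentum.
    Sketch only: the paper's exact update rule may differ."""
    t = state["t"] = state.get("t", 0) + 1
    # First smoothing pass over the raw gradient (standard Adam momentum).
    state["m1"] = beta1 * state.get("m1", 0.0) + (1 - beta1) * grad
    # Second smoothing pass over m1: the "higher-order" smoothing,
    # compensating for the lag/noise of a single smoothing pass.
    state["m2"] = beta1b * state.get("m2", 0.0) + (1 - beta1b) * state["m1"]
    # Second-order momentum, clamped to be non-decreasing (AMSGrad-style)
    # so a sudden drop cannot inflate the effective learning rate.
    v = beta2 * state.get("v_raw", 0.0) + (1 - beta2) * grad ** 2
    state["v_raw"] = v
    state["v_hat"] = np.maximum(state.get("v_hat", 0.0), v)
    # Approximate Adam-style bias correction (assumption for this sketch).
    m_hat = state["m2"] / (1 - beta1 ** t)
    v_hat = state["v_hat"] / (1 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)
```

Run on a toy quadratic, the iterate moves steadily toward the minimum, with the clamped second-order momentum keeping each step bounded by roughly the base learning rate.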