Abstract:
To improve the performance of the traditional stochastic parallel gradient descent (SPGD) algorithm for wavefront distortion correction, a novel SPGD optimization algorithm based on the AdaBelief optimizer is proposed. The algorithm integrates the first- and second-order momentum of the AdaBelief optimizer from deep learning into SPGD, which accelerates convergence and allows the gain coefficient to be adjusted adaptively. In addition, the actual gain coefficient is dynamically clipped to avoid the oscillation caused by extreme gain values. Simulation results show that, with a 37-element deformable mirror (DM), the proposed algorithm effectively corrects wavefront distortion under different turbulence intensities, raising the Strehl ratio (SR) after correction to 0.83, 0.47, and 0.31, respectively. Moreover, the SR reaches the convergence threshold in only 149, 229, and 230 iterations under the three turbulence intensities. Compared with the traditional SPGD algorithm and other optimized variants, the proposed algorithm converges faster and offers advantages in stability and ease of parameter tuning.
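The update rule summarized above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `adabelief_spgd_step`, the hyperparameter values, the clipping range, and the quadratic test metric are all assumptions made for the sketch. It combines a standard two-sided SPGD perturbation with AdaBelief-style first- and second-order momentum and a clipped adaptive gain, maximizing a performance metric `J(u)` (e.g. an SR proxy) over DM control voltages `u`.

```python
import numpy as np

def adabelief_spgd_step(u, metric, state, lr=0.5, beta1=0.9, beta2=0.999,
                        eps=1e-8, sigma=0.05, clip=(0.1, 2.0)):
    """One illustrative SPGD iteration with AdaBelief-style momentum.

    u      : DM control vector (hypothetical representation)
    metric : callable J(u) to maximize, e.g. a Strehl-ratio proxy
    state  : dict holding running moments 'm', 's' and step count 't'
    All hyperparameter defaults here are assumptions, not the paper's values.
    """
    # Standard SPGD: random bipolar perturbation and two-sided metric difference
    du = sigma * np.random.choice([-1.0, 1.0], size=u.shape)
    dJ = metric(u + du) - metric(u - du)
    g = dJ * du                       # stochastic estimate of the ascent gradient

    state['t'] += 1
    t = state['t']
    # AdaBelief moments: first-order momentum and "belief" second moment
    state['m'] = beta1 * state['m'] + (1 - beta1) * g
    state['s'] = beta2 * state['s'] + (1 - beta2) * (g - state['m']) ** 2 + eps
    m_hat = state['m'] / (1 - beta1 ** t)    # bias-corrected momentum
    s_hat = state['s'] / (1 - beta2 ** t)    # bias-corrected second moment

    # Adaptive gain coefficient, dynamically clipped to suppress oscillation
    gain = np.clip(lr / (np.sqrt(s_hat) + eps), *clip)
    return u + gain * m_hat                  # ascend the metric
```

On a simple surrogate such as `J(u) = -||u - u*||^2`, iterating this step drives `u` toward `u*`; the clipping keeps the early-iteration gain (when the second moment is still near zero) from producing arbitrarily large steps.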