Abstract:
SPGD (stochastic parallel gradient descent) is a control algorithm widely used in wavefront sensorless adaptive optics (AO) systems. In the traditional SPGD algorithm, the gain is usually set to a fixed value. As the number of deformable mirror (DM) elements increases, a fixed gain can lead to slow convergence and a higher probability of the algorithm falling into a local extremum. The Adam optimizer is a variant of stochastic gradient descent widely used in deep learning, whose main advantage is an adaptive learning rate. In this work, the adaptive gain of the Adam optimizer is combined with the SPGD algorithm to realize adaptive gain control of the AO system. A simulation model of a wavefront sensorless AO system was established, using DMs with 32, 61, 97, and 127 elements as the wavefront correctors and wavefront aberrations of different turbulence intensities as the correction objects. The results show that the optimized algorithm converges faster than the basic SPGD algorithm and has a lower probability of falling into a local extremum, and these advantages become more pronounced as the number of DM elements and the turbulence intensity increase. These results provide a theoretical basis for the practical application of the Adam-optimized SPGD algorithm.
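The combination described above can be illustrated with a minimal sketch: a two-sided SPGD perturbation estimates a stochastic gradient of the image-quality metric J, and an Adam-style moment update replaces the fixed gain with a per-actuator adaptive gain. The function name `spgd_adam`, the metric callback `measure_metric`, and all parameter values below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def spgd_adam(measure_metric, n_actuators, iterations=500,
              delta=0.1, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """Sketch of SPGD with an Adam-style adaptive gain (assumed interface).

    measure_metric(u) is assumed to return the image-quality metric J
    for a given DM actuator voltage vector u.
    """
    u = np.zeros(n_actuators)   # DM control voltages
    m = np.zeros(n_actuators)   # first-moment estimate (Adam)
    v = np.zeros(n_actuators)   # second-moment estimate (Adam)

    for t in range(1, iterations + 1):
        # Random bipolar perturbation applied to all actuators in parallel
        du = delta * np.random.choice([-1.0, 1.0], size=n_actuators)

        # Two-sided metric difference gives the SPGD stochastic gradient estimate
        dJ = measure_metric(u + du) - measure_metric(u - du)
        grad = dJ * du

        # Adam moment updates with bias correction
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)

        # Ascend the metric with a per-actuator adaptive gain
        u += lr * m_hat / (np.sqrt(v_hat) + eps)

    return u
```

In this sketch the fixed gain of basic SPGD is replaced by `lr / (sqrt(v_hat) + eps)`, so actuators whose gradient estimates fluctuate strongly receive a smaller effective step, which is the mechanism the abstract credits for faster convergence and fewer local extrema.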