High-accuracy key-parameter extraction algorithm for the Brillouin scattering spectrum using the Voigt profile

Abstract: The Brillouin scattering spectrum follows the Voigt profile, and the key parameters extracted by existing fitting algorithms are prone to error. To ensure accurate extraction of the key parameters of the Brillouin spectrum, and thereby improve the accuracy of temperature and strain measurements, a key-parameter extraction algorithm for the Brillouin scattering spectrum based on the Voigt profile was proposed. The Voigt profile was evaluated by Gauss-Hermite quadrature, the objective function was constructed from the least-squares criterion and the Voigt profile, and a method for obtaining the initial guesses of the key parameters was presented. The objective function was then minimized with the Levenberg-Marquardt algorithm; once it was minimized, the key parameters were obtained. For comparison, another algorithm was implemented, in which the initial guesses were set to random values within a certain range and the objective function was then optimized with the Levenberg-Marquardt algorithm. A large number of Brillouin scattering spectra with different signal-to-noise ratios were generated numerically, and measured spectra were also used. The results of the two algorithms show that the convergence probability of the random-initialization algorithm is 80%-90%, whereas the proposed algorithm converges in all cases. The errors of the proposed algorithm are only 1/1011-1/7 of those of the random-initialization algorithm, and its computation time is only 1/8-1/3 of that of the random-initialization algorithm.
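The abstract outlines a pipeline of evaluating the Voigt profile by Gauss-Hermite quadrature, building a least-squares objective, extracting data-driven initial guesses, and minimizing with the Levenberg-Marquardt algorithm. The sketch below illustrates one way such a pipeline could be assembled in Python; it is not the authors' implementation. The 20-point quadrature order, the peak-normalized model, the simple FWHM-based initial-guess heuristic (the paper's own initial-value extraction method is not detailed in the abstract), and all function and parameter names (voigt, fit_bgs, nu_b, sigma, gamma, etc.) are illustrative assumptions.

```python
# Minimal sketch of the fitting pipeline described in the abstract (not the
# authors' code). Assumptions: 20-point Gauss-Hermite quadrature, a
# peak-normalized Voigt model, and a simple FWHM-based initial guess in place
# of the paper's initial-value extraction method.
import numpy as np
from scipy.optimize import least_squares

# Gauss-Hermite nodes and weights: integral of exp(-t^2) f(t) dt ~ sum_i w_i f(t_i)
T_NODES, T_WEIGHTS = np.polynomial.hermite.hermgauss(20)


def voigt(nu, nu_b, sigma, gamma):
    """Voigt profile (Gaussian width sigma convolved with Lorentzian HWHM gamma),
    approximated as gamma/pi^1.5 * sum_i w_i / ((nu - nu_b - sqrt(2)*sigma*t_i)^2 + gamma^2)."""
    x = (nu - nu_b)[:, None] - np.sqrt(2.0) * sigma * T_NODES[None, :]
    return gamma / np.pi ** 1.5 * np.sum(T_WEIGHTS / (x ** 2 + gamma ** 2), axis=1)


def model(params, nu):
    """Peak-normalized Voigt model: 'amp' is the peak gain of the spectrum."""
    amp, nu_b, sigma, gamma = params
    v = voigt(nu, nu_b, sigma, gamma)
    return amp * v / v.max()


def residuals(params, nu, g):
    """Least-squares objective: difference between model and measured spectrum."""
    return model(params, nu) - g


def initial_guess(nu, g):
    """Data-driven starting point (a simple heuristic, not the paper's method):
    peak location, peak value, and the FWHM split between the two widths."""
    nu_b0 = nu[np.argmax(g)]
    above_half = nu[g > 0.5 * g.max()]
    fwhm = above_half[-1] - above_half[0] if above_half.size > 1 else (nu[-1] - nu[0]) / 10
    return np.array([g.max(), nu_b0, fwhm / 2.355, fwhm / 2.0])


def fit_bgs(nu, g):
    """Fit a measured Brillouin spectrum g(nu) with Levenberg-Marquardt and
    return [peak gain, Brillouin frequency shift, sigma, gamma]."""
    result = least_squares(residuals, initial_guess(nu, g), args=(nu, g), method="lm")
    return result.x


if __name__ == "__main__":
    # Synthetic test spectrum (frequencies in GHz, widths chosen arbitrarily)
    nu = np.linspace(10.6, 11.0, 401)
    true_params = np.array([1.0, 10.8, 0.010, 0.015])
    g = model(true_params, nu) + 0.02 * np.random.randn(nu.size)
    print(fit_bgs(nu, g))
```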

     
