Abstract:
Irradiance fluctuation induced by atmospheric turbulence is a major factor degrading the signal-to-noise ratio (SNR) and bit error rate (BER) of laser communication systems. This paper explores the effect of the atmospheric turbulence state on the performance of wireless optical communication systems. First, the channel was modeled with the log-normal distribution and the Gamma-Gamma distribution, respectively. Analysis showed that the former model does not fit the irradiance fluctuation behavior under moderate or strong turbulence, whereas the latter applies over a wider range of conditions. On this basis, the SNR and BER resulting from the phase noise introduced by irradiance fluctuation were studied. Simulations were conducted for the different channel models, displaying the change of the constellation diagram and the degradation of the BER as the irradiance variance grows. Results illustrate that both the phase angle of the constellation diagram and the BER increase as the irradiance variance becomes larger. These findings are of practical value for real optical communication systems.
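As a minimal illustration of the Gamma-Gamma channel model mentioned above, the sketch below draws irradiance samples as the product of two independent unit-mean Gamma variates and checks the resulting scintillation index against the closed form 1/α + 1/β + 1/(αβ). The shape parameters α and β are hypothetical values chosen for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shape parameters (illustrative only, not from the paper);
# smaller alpha/beta correspond to stronger turbulence.
alpha, beta = 4.0, 2.0
n = 1_000_000

# Gamma-Gamma irradiance: product of two independent unit-mean Gamma variates,
# modeling large-scale and small-scale turbulent eddies respectively.
x = rng.gamma(alpha, 1.0 / alpha, n)
y = rng.gamma(beta, 1.0 / beta, n)
irradiance = x * y

# Scintillation index (normalized irradiance variance): sample estimate
# versus the closed-form value 1/alpha + 1/beta + 1/(alpha*beta).
si_sample = irradiance.var() / irradiance.mean() ** 2
si_theory = 1.0 / alpha + 1.0 / beta + 1.0 / (alpha * beta)
print(si_sample, si_theory)
```

A log-normal channel would instead draw `exp` of a Gaussian; its fit deteriorates exactly in the moderate-to-strong turbulence regime the abstract refers to.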