DOI: https://doie.org/10.0612/Jbse.2024946272
Avinash S Joshi, Dr. Gopal A. Bidkar
Keywords: Machine learning; channel estimation; MIMO-OFDM; frequency selective channels
The performance of wireless networks relies heavily on accurate channel estimation. Deep learning has shown substantial promise in improving communication reliability and reducing computational complexity in 5G and beyond networks. Least squares (LS) estimation is widely used to obtain channel estimates because it is computationally inexpensive and requires no prior statistical knowledge of the channel, but it suffers from relatively high estimation error. This paper proposes a deep learning-based channel estimation framework that improves the accuracy of the estimates obtained with the LS approach. The framework is evaluated on a multiple-input multiple-output (MIMO) system with a multipath channel profile that emulates 5G and beyond scenarios, including mobility characterized by Doppler effects. The system model supports an arbitrary number of transmit and receive antennas, and the machine learning module is architecture-agnostic, so different neural network architectures can be employed. Numerical results demonstrate the effectiveness of the proposed deep learning-based channel estimation framework compared with conventional methods widely used in prior research. Among the examined artificial neural network architectures, the bidirectional long short-term memory (BiLSTM) network achieves the highest channel estimation quality and the lowest bit error ratio.
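To make the estimation pipeline concrete, the sketch below illustrates the general idea under assumptions not taken from the paper: a per-subcarrier LS estimate is computed from known QPSK pilots, and a bidirectional LSTM is then used to refine the noisy LS estimates. The subcarrier count, noise level, and network layout are illustrative placeholders, not the authors' configuration.

```python
# Illustrative sketch (not the authors' implementation): LS channel estimation
# on pilot subcarriers, followed by a BiLSTM that refines the noisy estimates.
# All dimensions and the network layout are assumptions for demonstration.
import numpy as np
import torch
import torch.nn as nn

num_subcarriers = 64          # assumed number of OFDM subcarriers
rng = np.random.default_rng(0)

# --- LS estimation: H_ls = Y / X on known pilot symbols (per subcarrier) ---
X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], num_subcarriers) / np.sqrt(2)  # QPSK pilots
H_true = (rng.standard_normal(num_subcarriers)
          + 1j * rng.standard_normal(num_subcarriers)) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal(num_subcarriers)
               + 1j * rng.standard_normal(num_subcarriers))
Y = H_true * X + noise
H_ls = Y / X                  # simple and blind to channel statistics, but noisy

# --- BiLSTM refiner: maps the LS estimate sequence to a cleaner estimate ---
class BiLSTMRefiner(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        # Inputs/outputs are the real and imaginary parts of H per subcarrier.
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 2)

    def forward(self, h_ls):            # h_ls: (batch, subcarriers, 2)
        seq, _ = self.lstm(h_ls)
        return self.out(seq)            # refined estimate, same shape as input

model = BiLSTMRefiner()
h_in = torch.tensor(np.stack([H_ls.real, H_ls.imag], axis=-1),
                    dtype=torch.float32).unsqueeze(0)
h_refined = model(h_in)                 # in practice, trained against the true channel
print(h_refined.shape)                  # torch.Size([1, 64, 2])
```

In a full system the refiner would be trained on pairs of LS estimates and known channel responses (e.g., with a mean squared error loss), and the same idea extends to the per-antenna channel matrices of a MIMO configuration.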