Please use this identifier to cite or link to this item: https://hdl.handle.net/1959.11/63407
Title: Cellular Traffic Prediction using Recurrent Neural Networks
Contributor(s): Jaffry, Shan (author); Hasan, Syed Faraz (author)
Publication Date: 2023
DOI: 10.1109/ISTT50966.2020.9279373
Handle Link: https://hdl.handle.net/1959.11/63407
Abstract: 

Autonomous network traffic prediction will be a key feature of beyond-5G networks. In the past, researchers have used statistical methods such as the Auto-Regressive Integrated Moving Average (ARIMA) model for traffic prediction. However, ARIMA-based models fail to provide accurate predictions in highly dynamic cellular environments. Hence, researchers are exploring deep learning techniques such as Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks to develop autonomous cellular traffic prediction models. This paper proposes an LSTM-based cellular traffic prediction model using real-world call data records. We compare the LSTM-based prediction with an ARIMA model and a vanilla Feed-Forward Neural Network (FFNN). The results show that both LSTM and FFNN can accurately predict cellular traffic; however, the LSTM models converged more quickly during training.
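The one-step-ahead prediction setup described in the abstract relies on framing a traffic time series as supervised (input window, next value) pairs, which both LSTM and FFNN models consume. A minimal sketch of that preprocessing step is below; the paper's actual dataset and window length are not given here, so the synthetic hourly series and the 24-hour window are assumptions for illustration only.

```python
import math

def make_windows(series, window=24):
    """Slice a 1-D series into (input window, next value) training pairs
    for one-step-ahead prediction."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

# Synthetic hourly traffic with a daily cycle -- a placeholder for the
# real call-data-record counts used in the paper.
traffic = [100 + 50 * math.sin(2 * math.pi * h / 24) for h in range(24 * 7)]

pairs = make_windows(traffic, window=24)
print(len(pairs))        # 168 hours minus a 24-hour window -> 144 pairs
x0, y0 = pairs[0]
print(len(x0))           # each input window holds 24 hourly values
```

Each pair feeds the model the previous 24 hours of traffic and asks it to predict hour 25; an LSTM processes the window sequentially, while an FFNN treats it as a flat feature vector.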

Publication Type: Conference Publication
Conference Details: ISTT 2020: 5th International Symposium on Telecommunication Technologies, Shah Alam, Malaysia, 9th - 11th November, 2020
Source of Publication: Proceedings of the 5th International Symposium on Telecommunication Technologies, p. 94-98
Publisher: Institute of Electrical and Electronics Engineers
Place of Publication: United States of America
Fields of Research (FoR) 2020: 4006 Communications engineering
Peer Reviewed: Yes
HERDC Category Description: E1 Refereed Scholarly Conference Publication
Appears in Collections: Conference Publication


Items in Research UNE are protected by copyright, with all rights reserved, unless otherwise indicated.