Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm


Bibliographic Details
Published in: SN Computer Science, Vol. 5, No. 1, p. 68
Main Authors: Rollin, Ndom Francis; Giquel, Sassa; Chantal, Mveh-Abia; Raoul, Ayissi; Remy, Etoua; Yves, Emvudu
Format: Journal Article
Language: English
Published: Singapore: Springer Nature Singapore (Springer Nature B.V.), 01.01.2024
ISSN: 2661-8907, 2662-995X
DOI: 10.1007/s42979-023-02376-x

Summary: The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) algorithm is a new recurrent neural network architecture that combines a Radial Basis Gated Unit (RBGU) with the Long Short-Term Memory (LSTM) network architecture. This unit gives the RBGU-RNN two advantages over the existing LSTM network. First, because the RBGU is purely an activation unit and performs no weighted operations, as a classical neuron unit would, it does not propagate (duplicate) error the way the LSTM does. Second, because the unit sits at the beginning of the network's processing workflow, it standardizes the data before they reach the weighted units, which is not the case in a plain LSTM. This study therefore provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using a real-world call data record, specifically a survey of end-user cell network data traffic, we built a cellular traffic prediction model. We started with an ARIMA model, which allowed us to choose the number of time steps needed to build the RBGU-RNN prediction model, that is, the number of time steps needed to predict the next value in the time series. The results show that the RBGU-RNN predicts cellular data traffic accurately and generalizes better than the LSTM. The R-squared statistics (coefficients of determination) show that 58.31% of user traffic consumption is explained by the LSTM model, versus 96.86% by the RBGU-RNN model, on the training set. Likewise, on the test set, 61.24% of user traffic consumption is explained by the LSTM model and 95.20% by the RBGU-RNN. The RBGU-RNN also exhibits more efficient gradient descent than the standard LSTM, as shown by analysing and plotting the Mean Squared Error (MSE), the Mean Absolute Percentage Error (MAPE) and the Maximum Absolute Error (MAXAE) over the number of iterations.
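
The abstract does not give the RBGU's exact form, but its description (an activation-only stage with no weighted operations, placed ahead of the LSTM to standardize the inputs) suggests a minimal sketch along the following lines in Keras. The Gaussian activation exp(-x^2), the layer name RadialBasisGate, the helper make_rbgu_rnn and the window length of 12 are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of the idea described in the abstract: a parameter-free
# radial-basis activation stage in front of a standard LSTM, so inputs are
# squashed/standardised before any weighted units. Names are invented.
import tensorflow as tf

class RadialBasisGate(tf.keras.layers.Layer):
    """Element-wise Gaussian radial-basis activation exp(-x^2).

    No trainable weights, matching the abstract's description of an
    activation-only unit that performs no weighted operations.
    """
    def call(self, inputs):
        return tf.exp(-tf.square(inputs))

def make_rbgu_rnn(time_steps, n_features, units=32):
    # time_steps would be chosen from the preliminary ARIMA analysis
    # mentioned in the abstract (assumption: lag order -> window length).
    inputs = tf.keras.Input(shape=(time_steps, n_features))
    x = RadialBasisGate()(inputs)          # standardising activation stage
    x = tf.keras.layers.LSTM(units)(x)     # weighted recurrent stage
    outputs = tf.keras.layers.Dense(1)(x)  # one-step-ahead traffic forecast
    return tf.keras.Model(inputs, outputs)

model = make_rbgu_rnn(time_steps=12, n_features=1)
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.MeanAbsolutePercentageError()])
# model.fit(X_train, y_train, ...) on a windowed traffic series would follow,
# with R-squared, MSE, MAPE and MAXAE computed on the held-out test windows.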