Randomising the Simple Recurrent Network: a lightweight, energy-efficient RNN model with application to forecasting problems

Authors: Khennour, M.E., Bouchachia, A., Kherfi, M.L. and Bouanane, K.

Journal: Neural Computing and Applications

Volume: 35

Issue: 27

Pages: 19707-19718

eISSN: 1433-3058

ISSN: 0941-0643

DOI: 10.1007/s00521-023-08775-8

Abstract:

Multi-variate time-series (MTS) forecasting is the prediction of future values of a sequence of data. Analysing such data can benefit the community both financially and in terms of security, for instance by observing stock-exchange trends or predicting when malicious attacks will occur. MTS forecasting models face many problems, including data and model complexity, energy constraints and computational cost, which can affect budget allocation, latency and carbon emissions. Recurrent neural networks are one such family of models, known for their computational complexity: their slow learning process requires more energy to train. Contributing to green AI, in this paper we propose a competitive, energy-efficient and lightweight recurrent neural network based on a hybrid neural architecture that combines the Random Neural Network (RaNN) and the Simple Recurrent Network (SRN), namely the Random Simple Recurrent Network (RSRN). We adopt the RaNN for its distinctive probabilistic properties and the SRN for adding a lightweight recurrent ability to the RaNN so that it can process sequential data. The paper shows how the RSRN is trained using adapted and optimised versions of back-propagation (BP), back-propagation through time (BPTT) and truncated BPTT (TBPTT). The latter two algorithms use penalised gradient descent, employing the average of the total gradient over time to prevent the exploding-gradient problem. Evaluated on several datasets, the RSRN achieves its best performance when trained with TBPTT. Moreover, a comparative study against well-known recurrent models shows its superiority over state-of-the-art models while requiring far fewer training parameters and much less computational time. In addition, we investigate the multi-layer architecture and its properties.
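The abstract's penalised gradient idea (dividing the accumulated BPTT gradient by the sequence length to damp gradient explosion) can be illustrated with a minimal sketch. This is not the authors' RSRN implementation: it uses a plain scalar tanh recurrence with hypothetical parameter names (`w`, `u`), purely to show what averaging the total gradient over time looks like.

```python
import math

def bptt_averaged_grad(w, u, xs, y):
    """Scalar SRN-style cell: h_t = tanh(w*h_{t-1} + u*x_t).
    Returns dL/dw for L = (h_T - y)^2, where the summed per-step
    BPTT contributions are divided by the sequence length T,
    i.e. the average-over-time penalty described in the abstract."""
    # forward pass, storing pre-activations a_t and states h_t
    h = [0.0]
    a = []
    for x in xs:
        a.append(w * h[-1] + u * x)
        h.append(math.tanh(a[-1]))
    T = len(xs)
    # backward pass through time
    delta = 2.0 * (h[-1] - y)          # dL/dh_T
    grad_w = 0.0
    for t in range(T - 1, -1, -1):
        delta_a = delta * (1.0 - math.tanh(a[t]) ** 2)
        grad_w += delta_a * h[t]       # per-timestep contribution
        delta = delta_a * w            # propagate through the recurrence
    return grad_w / T                  # penalise: average over time
```

The division by `T` keeps the update magnitude bounded as sequences grow, whereas the plain summed gradient can blow up with sequence length.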

https://eprints.bournemouth.ac.uk/38885/

Source: Scopus

Source: Web of Science (Lite)

Source: BURO EPrints