Research Specifications

Title
Improving time series forecasting using LSTM and attention models
Type of Research
Article
Keywords
Time series forecasting, LSTM, Multi-head attention, Hybrid model
Abstract
Accurate time series forecasting is an essential task in many application domains. Real-world time series data often exhibit non-linear patterns whose complexity prevents conventional forecasting techniques from producing accurate predictions. To forecast a given time series accurately, this study proposes a hybrid model based on two deep learning methods, i.e., long short-term memory (LSTM) and multi-head attention. The proposed method leverages the representations learned by these two techniques. Its performance is compared with standard time series forecasting techniques as well as hybrid models proposed in the related literature using 16 datasets. Moreover, the individual models based on LSTM and multi-head attention are implemented to provide a comprehensive evaluation. The experimental results indicate that the proposed model outperforms all benchmark methods on most datasets in terms of symmetric mean absolute percentage error (SMAPE) and yields the best average rank (AR) among the evaluated methods. Furthermore, the results reveal that the model based on multi-head attention is the second-best method with regard to AR, which demonstrates the predictive power of the attention mechanism in time series forecasting.
Researchers
Hossein Abbasimehr (First Researcher), Reza Paki (Second Researcher)
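The abstract does not specify the exact architecture, so the following is only a minimal sketch of the general idea it describes: an LSTM branch and a multi-head attention branch whose learned representations are fused before a dense forecasting head, evaluated with SMAPE. Layer sizes, head count, window length, and the pooling/fusion choices are illustrative assumptions, not the authors' published configuration.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_hybrid_model(window_len=24, n_features=1, horizon=1):
    # Input: a sliding window of past observations
    inputs = layers.Input(shape=(window_len, n_features))

    # LSTM branch: sequential representation of the input window
    lstm_seq = layers.LSTM(64, return_sequences=True)(inputs)
    lstm_repr = layers.GlobalAveragePooling1D()(lstm_seq)

    # Multi-head self-attention branch over the same window
    attn_seq = layers.MultiHeadAttention(num_heads=4, key_dim=16)(inputs, inputs)
    attn_repr = layers.GlobalAveragePooling1D()(attn_seq)

    # Fuse the two learned representations and forecast the next value(s)
    fused = layers.Concatenate()([lstm_repr, attn_repr])
    outputs = layers.Dense(horizon)(fused)
    return Model(inputs, outputs)

def smape(actual, forecast):
    # Symmetric mean absolute percentage error, in percent
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(2.0 * np.abs(forecast - actual) /
                           (np.abs(actual) + np.abs(forecast)))

model = build_hybrid_model()
model.compile(optimizer="adam", loss="mse")

Concatenating the two branch outputs is just one plausible way to "leverage the two learned representations"; the paper may combine them differently (e.g., stacking the layers sequentially or weighting the branches).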