TCLN: A Transformer-Based Conv-LSTM Network for Multivariate Time Series Forecasting
Shusen Ma,
Tianhao Zhang,
Yun-Bo Zhao,
Yu Kang,
and Peng Bai
Appl Intell
2023
[Abs]
[doi]
[pdf]
The study of multivariate time series forecasting (MTSF) problems has high significance in many areas, such as industrial forecasting and traffic flow forecasting. Traditional forecasting models pay more attention to the temporal features of variables and lack depth in extracting spatial and spatiotemporal features between variables. In this paper, a novel model based on the Transformer, convolutional neural network (CNN), and long short-term memory (LSTM) network is proposed to address these issues. The model first extracts spatial feature vectors through the proposed Multi-kernel CNN. It then fully extracts temporal information through the Encoder layer, which consists of the Transformer encoder layer and the LSTM network and can also capture the potential spatiotemporal correlation. To extract more feature information, we stack multiple Encoder layers. Finally, the output is decoded by the Decoder layer, composed of the ReLU activation function and the Linear layer. To further improve the model's robustness, we also integrate an autoregressive model. In model evaluation, the proposed model achieves significant performance improvements over current benchmark methods for MTSF tasks on four datasets. Further experiments demonstrate that the model can be used for long-horizon forecasting and achieves satisfactory results on the yield forecasting of test items (our private dataset, TIOB).
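As a rough illustration of the pipeline the abstract describes, below is a minimal PyTorch sketch of a TCLN-style model: parallel multi-kernel convolutions for spatial features, stacked Transformer-encoder-plus-LSTM blocks, a ReLU/Linear decoder, and an additive autoregressive branch. All layer sizes, kernel sizes, and names (`TCLNSketch`, `conv_channels`, `ar_window`, and so on) are assumptions made for illustration, not the authors' published implementation.

```python
# Hypothetical sketch of the TCLN pipeline described in the abstract (PyTorch).
# Hyperparameters and the way the branches are combined are assumptions.
import torch
import torch.nn as nn


class TCLNSketch(nn.Module):
    def __init__(self, n_vars, conv_channels=32, n_layers=2, horizon=1,
                 kernel_sizes=(3, 5, 7), ar_window=8):
        super().__init__()
        # Multi-kernel CNN: parallel 1D convolutions with different kernel sizes,
        # concatenated along the channel axis to form the spatial feature vector.
        self.convs = nn.ModuleList([
            nn.Conv1d(n_vars, conv_channels, k, padding=k // 2)
            for k in kernel_sizes
        ])
        conv_dim = conv_channels * len(kernel_sizes)
        # Stacked Encoder layers: each is a Transformer encoder layer followed by
        # an LSTM, capturing temporal and potential spatiotemporal correlations.
        self.encoders = nn.ModuleList([
            nn.ModuleDict({
                "attn": nn.TransformerEncoderLayer(conv_dim, nhead=4, batch_first=True),
                "lstm": nn.LSTM(conv_dim, conv_dim, batch_first=True),
            })
            for _ in range(n_layers)
        ])
        # Decoder layer: ReLU activation followed by a Linear projection.
        self.decoder = nn.Sequential(nn.ReLU(), nn.Linear(conv_dim, n_vars * horizon))
        # Autoregressive branch over the most recent window, added for robustness.
        self.ar = nn.Linear(ar_window, horizon)
        self.ar_window = ar_window
        self.horizon = horizon
        self.n_vars = n_vars

    def forward(self, x):
        # x: (batch, seq_len, n_vars)
        h = torch.cat([conv(x.transpose(1, 2)) for conv in self.convs], dim=1)
        h = h.transpose(1, 2)                      # (batch, seq_len, conv_dim)
        for layer in self.encoders:
            h = layer["attn"](h)
            h, _ = layer["lstm"](h)
        out = self.decoder(h[:, -1])               # decode the last time step
        out = out.view(-1, self.horizon, self.n_vars)
        ar_in = x[:, -self.ar_window:, :].transpose(1, 2)  # (batch, n_vars, ar_window)
        ar_out = self.ar(ar_in).transpose(1, 2)            # (batch, horizon, n_vars)
        return out + ar_out


if __name__ == "__main__":
    model = TCLNSketch(n_vars=8, horizon=12)
    y = model(torch.randn(4, 96, 8))
    print(y.shape)  # torch.Size([4, 12, 8])
```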