This paper introduces two deep learning models for multivariate time series traffic prediction: a network-temporal graph attention network (GAT) and a fine-tuned multi-modal large language model (LLM) with a clustering pre-processing step. The LLM-based model achieves superior prediction and generalization performance on a real-world network dataset compared to an LSTM baseline. The GAT model excels at reducing prediction variance across different time series and horizons.
Fine-tuning a multi-modal LLM, combined with a clustering pre-processing step, outperforms specialized GAT and LSTM models at multivariate time series traffic prediction.
Time series analysis is critical for emerging network intelligent control and management functions. However, existing statistics-based and shallow machine learning models have shown limited prediction capability on multivariate time series: the intricate topological interdependencies and complex temporal patterns in network data demand new modeling approaches. In this paper, building on a systematic study of multivariate time series models, we present two deep learning models that learn temporal patterns and network topological correlations simultaneously: a customized network-temporal graph attention network (GAT) model and a fine-tuned multi-modal large language model (LLM) with a clustering pre-processing step. Both models are evaluated against an LSTM baseline that already outperforms statistical methods. Through extensive training and performance studies on a real-world network dataset, the LLM-based model demonstrates superior overall prediction and generalization performance, while the GAT model shows its strength in reducing prediction variance across time series and prediction horizons. More detailed analysis also reveals important insights into correlation variability and prediction distribution discrepancies across time series and prediction horizons.
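The abstract does not specify the internals of the network-temporal GAT, but the topological-correlation component it describes is typically built from graph attention layers in the style of Velickovic et al. The sketch below is a minimal, single-head graph attention layer in NumPy, illustrating how per-node traffic features are aggregated from neighbors with learned attention weights; the function name, shapes, and weight layout are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def gat_layer(X, A, W, a, leaky_slope=0.2):
    """Single-head graph attention layer (illustrative sketch).

    X: (N, F) node feature matrix (e.g. per-node traffic features)
    A: (N, N) adjacency matrix with self-loops (A[i, j] > 0 means edge j -> i)
    W: (F, Fp) shared linear transform
    a: (2 * Fp,) attention vector
    Returns (N, Fp) aggregated node features.
    """
    H = X @ W                     # transform node features: (N, Fp)
    N = H.shape[0]
    # Attention logits e[i, j] = LeakyReLU(a^T [h_i || h_j])
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([H[i], H[j]])
            e[i, j] = s if s > 0 else leaky_slope * s
    # Mask non-edges, then softmax over each node's neighborhood
    e = np.where(A > 0, e, -1e9)
    e -= e.max(axis=1, keepdims=True)       # numerical stability
    alpha = np.exp(e)
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ H              # attention-weighted neighbor aggregation

# Toy usage: 4 nodes in a ring topology, 3 input features, 2 output features
rng = np.random.default_rng(0)
A = np.eye(4) + np.roll(np.eye(4), 1, axis=1) + np.roll(np.eye(4), -1, axis=1)
out = gat_layer(rng.normal(size=(4, 3)), A,
                rng.normal(size=(3, 2)), rng.normal(size=(4,)))
```

In the paper's setting, stacking such layers with a temporal module (e.g. recurrent or attention-based) would let the model capture both axes of correlation at once; the clustering step used with the LLM serves a related purpose by grouping similar time series before fine-tuning.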