基于图神经网络的复杂时空数据挖掘方法综述
作者: 邹慧琪, 史彬泽, 宋凌云, 韩笑琳, 尚学群
基金项目: 国家重点研发计划(2020AAA0108504); 国家自然科学基金(62302397, 62102321); 中央高校基本科研业务费专项资金(D5000230191, D5000230095); 陕西省创新能力支撑计划(2021TD-06); 先进计算与智能工程(国家级)实验室基金(2023-LYJJ-01-021)


Survey on Complex Spatio-temporal Data Mining Methods Based on Graph Neural Network
Author: Zou Huiqi, Shi Binze, Song Lingyun, Han Xiaolin, Shang Xuequn
摘要:

随着传感技术的发展, 不同领域产生了大量时空数据. 时空图是其中一种主要的时空数据类型, 具有复杂的结构、时空特征和时空关系. 如何从复杂的时空图数据中挖掘关键模式, 并应用于不同的下游任务, 成为复杂时空数据挖掘的主要问题. 目前, 日渐成熟的时序图神经网络为该研究领域的发展提供了有力的工具. 此外, 新兴的时空大模型在现有时空图神经网络方法的基础上提供了新的研究视角. 然而, 现有的大多数综述对该领域方法的分类框架较为粗略, 对复杂数据类型(如动态异质图和动态超图)缺乏全面和深入的介绍, 并且没有对时空图大模型相关的最新研究进展进行详细总结. 因此, 本文将基于图神经网络的复杂时空数据挖掘方法分成时空融合架构和时空大模型两大类, 旨在从传统和新兴两个角度进行介绍: 将时空融合架构根据具体的复杂数据类型划分成动态图、动态异质图和动态超图; 将时空大模型根据时间维度和空间维度划分成时间序列和图, 并在基于图的大模型中列举时空图相关的最新研究. 本文详细介绍不同关键算法的核心细节, 对比不同方法的优缺点, 列举基于图神经网络的复杂时空数据挖掘的应用领域和常用数据集, 并对未来可能的研究方向进行展望.

    Abstract:

With the development of sensing technology, large volumes of spatio-temporal data have emerged in various fields. The spatio-temporal graph is a major type of spatio-temporal data, characterized by a complex structure, spatio-temporal features, and spatio-temporal relationships. How to mine key patterns from complex spatio-temporal graph data and apply them to various downstream tasks has become the central problem of complex spatio-temporal data mining. Currently, increasingly mature temporal graph neural networks provide powerful tools for this research field. In addition, emerging spatio-temporal large models offer a new research perspective on top of existing spatio-temporal graph neural network methods. However, most existing surveys adopt relatively coarse classification frameworks, lack a comprehensive and in-depth introduction to complex data types (e.g., dynamic heterogeneous graphs and dynamic hypergraphs), and do not summarize the latest research progress on spatio-temporal graph large models in detail. Therefore, this study divides complex spatio-temporal data mining methods based on graph neural networks into spatio-temporal fusion architectures and spatio-temporal large models, introducing them from traditional and emerging perspectives, respectively. According to the specific complex data types, spatio-temporal fusion architectures are divided into dynamic graphs, dynamic heterogeneous graphs, and dynamic hypergraphs. Spatio-temporal large models are divided into time series and graphs according to the temporal and spatial dimensions, and the latest research on spatio-temporal graphs is listed under graph-based large models. The core details of key algorithms are introduced, and the pros and cons of different methods are compared. Finally, the application fields and commonly used datasets of complex spatio-temporal data mining based on graph neural networks are listed, and possible future research directions are outlined.
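The "spatio-temporal fusion architecture" named in the abstract typically interleaves a spatial module (a graph convolution over the sensor graph) with a temporal module (a recurrent cell over time steps). The following is a minimal illustrative sketch of that idea only, not the method of any surveyed paper; all shapes, names, and the toy line graph are assumptions made for the example.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalize A with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def graph_conv(A_norm, X, W):
    """One GCN-style layer: propagate node features over the graph, then project."""
    return np.tanh(A_norm @ X @ W)

class GRUCell:
    """A plain GRU cell applied to each node's feature vector independently."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = lambda *shape: rng.standard_normal(shape) * 0.1
        self.Wz, self.Uz = s(in_dim, hid_dim), s(hid_dim, hid_dim)
        self.Wr, self.Ur = s(in_dim, hid_dim), s(hid_dim, hid_dim)
        self.Wh, self.Uh = s(in_dim, hid_dim), s(hid_dim, hid_dim)

    def step(self, x, h):
        sig = lambda v: 1.0 / (1.0 + np.exp(-v))
        z = sig(x @ self.Wz + h @ self.Uz)            # update gate
        r = sig(x @ self.Wr + h @ self.Ur)            # reset gate
        h_tilde = np.tanh(x @ self.Wh + (r * h) @ self.Uh)
        return (1 - z) * h + z * h_tilde

# Toy setting: 4 sensors on a line graph, 5 time steps, 3 raw features per node.
rng = np.random.default_rng(42)
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
A_norm = normalized_adjacency(A)
W = rng.standard_normal((3, 8)) * 0.1
gru = GRUCell(in_dim=8, hid_dim=8)

h = np.zeros((4, 8))                     # one hidden state per node
for t in range(5):                       # fuse "space first, then time" per step
    X_t = rng.standard_normal((4, 3))    # features of all nodes at time step t
    h = gru.step(graph_conv(A_norm, X_t, W), h)

print(h.shape)  # (4, 8): one spatio-temporal embedding per node
```

The final `h` can feed a downstream head (e.g., a linear layer for traffic forecasting); surveyed methods differ mainly in how the spatial operator, temporal operator, and their interleaving are designed.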

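Among the complex data types the abstract lists, dynamic hypergraphs are the least standard; the basic building block is a hypergraph convolution over an incidence matrix. Below is a minimal sketch of that classic operator (with unit hyperedge weights), assuming illustrative variable names and a toy hypergraph; it is not the implementation of any specific surveyed method.

```python
import numpy as np

# A hypergraph is represented by an incidence matrix H, where H[v, e] = 1
# means node v belongs to hyperedge e. The classic hypergraph convolution
# (with unit hyperedge weights) is X' = Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Θ:
# features are first aggregated from nodes to hyperedges, then scattered back.

def hypergraph_conv(H, X, Theta):
    Dv = np.diag(1.0 / np.sqrt(H.sum(axis=1)))  # node degrees (hyperedges per node)
    De = np.diag(1.0 / H.sum(axis=0))           # hyperedge degrees (nodes per edge)
    A = Dv @ H @ De @ H.T @ Dv                  # normalized node-edge-node operator
    return np.tanh(A @ X @ Theta)

# Toy hypergraph: 5 nodes, 2 hyperedges e0 = {0, 1, 2} and e1 = {2, 3, 4}.
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1], [0, 1]], dtype=float)
X = np.eye(5)                                   # one-hot features for illustration
Theta = np.full((5, 3), 0.2)
out = hypergraph_conv(H, X, Theta)
print(out.shape)  # (5, 3)
```

Dynamic hypergraph methods then rebuild or reweight `H` at each time step, so group-wise relations (e.g., regions sharing a passenger flow pattern) can evolve over time.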
引用本文

邹慧琪, 史彬泽, 宋凌云, 韩笑琳, 尚学群. 基于图神经网络的复杂时空数据挖掘方法综述. 软件学报, 2025, 36(4): 1811–1843.

历史
  • 收稿日期:2024-05-09
  • 最后修改日期:2024-06-27
  • 在线发布日期: 2025-01-08
版权所有: 中国科学院软件研究所