Progress of Graph Neural Networks on Complex Graph Mining

Special Topic: New Technologies for Intelligent Information Systems

Corresponding authors: SHANG Xuequn (shang@nwpu.edu.cn); SONG Lingyun (lysong@nwpu.edu.cn)

Funding: National Key Research and Development Program of China (2020AAA0108504); National Natural Science Foundation of China (62102321, 61772426, U1811262); Fundamental Research Funds for the Central Universities (D5000200146); China Postdoctoral Science Foundation (2020M673487)

    Abstract:

    Graph neural networks (GNNs) establish a deep learning framework for non-Euclidean data. Compared with traditional network embedding methods, they perform deeper information aggregation on graph structures. In recent years, GNNs have been extended to complex graphs, producing a series of GNN models built on complex graph structures. Nevertheless, existing surveys lack a comprehensive and systematic classification and summary of GNNs on complex graphs. This study divides complex graphs into three categories: heterogeneous graphs, dynamic graphs, and hypergraphs. Heterogeneous-graph GNNs are divided into two types, relation-type aware and meta-path aware, according to how information is aggregated; on this basis, plain heterogeneous graphs and knowledge graphs are introduced separately. Dynamic-graph GNNs are divided into three categories according to how temporal information is handled: RNN-based methods, autoencoder-based methods, and spatio-temporal graph neural networks. Hypergraph GNNs are divided into expansion and non-expansion methods according to whether the hypergraph is expanded into a pairwise graph, and the expansion methods are further divided into star expansion, clique expansion, and line expansion according to the expansion mode they use. The core idea of each method is illustrated in detail, the advantages and disadvantages of different algorithms are compared, the key algorithms, (cross-)application fields, and commonly used datasets of each class of complex-graph GNNs are systematically listed, and possible future research directions are discussed.
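    The expansion-based hypergraph taxonomy above can be made concrete with a short sketch. The following minimal Python functions are illustrative only (they are not code from any surveyed paper, and the function names are our own): clique expansion replaces each hyperedge with a clique over its vertices, while star expansion introduces one fresh vertex per hyperedge and links it to that hyperedge's members, yielding a bipartite pairwise graph.

```python
from itertools import combinations

def clique_expansion(hyperedges):
    """Clique expansion: each hyperedge {v1, ..., vk} becomes a
    clique, i.e. all k*(k-1)/2 pairwise edges among its vertices."""
    edges = set()
    for he in hyperedges:
        edges.update(combinations(sorted(he), 2))
    return edges

def star_expansion(hyperedges):
    """Star expansion: each hyperedge is replaced by a fresh vertex
    ('e0', 'e1', ...) connected to every vertex it contains,
    producing a bipartite pairwise graph."""
    edges = set()
    for i, he in enumerate(hyperedges):
        for v in he:
            edges.add((f"e{i}", v))
    return edges

# Two hyperedges sharing vertex 2
hyper = [{0, 1, 2}, {2, 3}]
print(sorted(clique_expansion(hyper)))  # [(0, 1), (0, 2), (1, 2), (2, 3)]
```

    A standard pairwise GNN can then be run on the expanded graph; the trade-off, which motivates the non-expansion methods surveyed here, is that expansion blurs the distinction between one k-way relation and many independent pairwise ones.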

    参考文献
    [1] Perozzi B, Al-Rfou R, Skiena S. Deepwalk: Online learning of social representations. In: Macskassy SA, ed. Proc. of the Int’l Conf. on Knowledge Discovery and Data Mining (SIGKDD). New York: ACM, 2014. 701-710.
    [2] Grover A, Leskovec J. node2vec: Scalable feature learning for networks. In: Proc. of the Int’l Conf. on Knowledge Discovery and Data Mining. New York: ACM, 2016. 855-864.
    [3] Sperduti A, Starita A. Supervised neural networks for the classification of structures. IEEE Trans. on Neural Networks, 1997, 8(3): 714-735.
    [4] Gori M, Monfardini G, Scarselli F. A new model for learning in graph domains. In: Jiang T, ed. Proc. of the IEEE Int’l Joint Conf. on Neural Networks. Piscataway: IEEE, 2005. 729-734.
    [5] Gallicchio C, Micheli A. Graph echo state networks. In: Jiang T, ed. Proc. of the IEEE Int’l Joint Conf. on Neural Networks (IJCNN). Piscataway: IEEE, 2010. 1-8.
    [6] Scarselli F, Gori M, Tsoi AC, Hagenbuchner M, Monfardini G. The graph neural network model. IEEE Trans. on Neural Networks, 2008, 20(1): 61-80.
    [7] Xu B, Shen H, Cao Q, Qiu Y, Cheng X. Graph wavelet neural network. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New Orleans: OpenReview, 2019.
    [8] Defferrard M, Bresson X, Vandergheynst PJ. Convolutional neural networks on graphs with fast localized spectral filtering. In: Lee DD, ed. Proc. of the Conf. on Neural Information Processing Systems. Cambridge: MIT Press, 2016.
    [9] Kipf TN, Welling M. Semi-supervised classification with graph convolutional networks. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New York: OpenReview, 2017. 1-14.
    [10] Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE. Neural message passing for quantum chemistry. In: Jebara T, ed. Proc. of the Int’l Conf. on Machine Learning (ICML). New York: ACM, 2017. 1263-1272.
    [11] Hamilton WL, Ying R, Leskovec J. Inductive representation learning on large graphs. In: Guyon I, ed. Proc. of the Conf. on Neural Information Processing Systems (NIPS). Cambridge: MIT Press, 2017. 1024-1034.
    [12] Schlichtkrull M, Kipf TN, Bloem P, Van Den Berg R, Titov I, Welling M. Modeling relational data with graph convolutional networks. In: Gangemi A, ed. Proc. of the European Semantic Web Conf. Springer, 2018. 593-607.
    [13] Hu Z, Dong Y, Wang K, Sun Y. Heterogeneous graph transformer. In: Huang Y, ed. Proc. of the Web Conf. (WWW). New York: ACM, 2020. 2704-2710.
    [14] Wang X, Ji H, Shi C, Wang B, Ye Y, Cui P, Yu PS. Heterogeneous graph attention network. In: Huang Y, ed. Proc. of the Web Conf. (WWW). New York: ACM, 2019. 2022-2032.
    [15] Jain A, Zamir AR, Savarese S, Saxena A. Structural-RNN: Deep learning on spatio-temporal graphs. In: Bajcsy R, ed. Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition (CVPR). Piscataway: IEEE, 2016. 5308-5317.
    [16] Mohanty S, Pozdnukhov A. Graph CNN+ LSTM framework for dynamic macroscopic traffic congestion prediction. In: Fakhraei S, ed. Proc. of the Int’l Workshop on Mining and Learning with Graphs. New York: ACM, 2018. 1-14.
    [17] García-Durán A, Dumančić S, Niepert M. Learning sequence encoders for temporal knowledge graph completion. In: Riloff E, ed. Proc. of the Conf. on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg: ACL, 2018.
    [18] Yan S, Xiong Y, Lin D. Spatial temporal graph convolutional networks for skeleton-based action recognition. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2018. 1-9.
    [19] Sun X, Yin H, Liu B, Chen H, Cao J, Shao Y, Viet Hung NQ. Heterogeneous hypergraph embedding for graph classification. In: Carmel D, ed. Proc. of the ACM Int’l Conf. on Web Search and Data Mining. New York: ACM, 2021. 725-733.
    [20] Bai S, Zhang F, Torr PH. Hypergraph convolution and hypergraph attention. Pattern Recognition, 2021, 110(1): 107637.
    [21] Yadati N, Nimishakavi M, Yadav P, Nitin V, Louis A, Talukdar P. HyperGCN: A new method of training graph convolutional networks on hypergraphs. In: Guyon I, ed. Proc. of the Conf. on Neural Information Processing Systems (NIPS). Cambridge: MIT Press, 2019.
    [22] Arya D, Gupta DK, Rudinac S, Worring M. HyperSAGE: Generalizing inductive representation learning on hypergraphs. CoRR, 2020.
    [23] Tu K, Cui P, Wang X, Wang F, Zhu W. Structural deep embedding for hyper-networks. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2018. 1-10.
    [24] Wu Z, Pan S, Chen F, Long G, Zhang C, Philip SY, Systems l. A comprehensive survey on graph neural networks. IEEE Trans. on Neural Networks, 2021, 32(1): 4-24.
    [25] Zhang Z, Cui P, Zhu W, Engineering D. Deep learning on graphs: A survey. IEEE Trans. on Knowledge, 2022, 34(1): 249-270.
    [26] Zhou J, Cui G, Hu S, Zhang Z, Yang C, Liu Z, Wang L, Li C, Sun MJAO. Graph neural networks: A review of methods and applications. AI Open, 2020, 1: 57-81.
    [27] Jiang W, Luo JJ. Graph neural network for traffic forecasting: A survey. arXiv preprint arXiv: 2101.11174, 2021.
    [28] Wu S, Sun F, Zhang W, Cui BJ. Graph neural networks in recommender systems: A survey. arXiv preprint arXiv: 2011.02260, 2020.
    [29] Frasconi P, Gori M, Sperduti A. A general framework for adaptive processing of data structures. IEEE Trans. on Neural Networks, 1998, 9(5): 768-786.
    [30] Powell MJ. An efficient method for finding the minimum of a function of several variables without calculating derivatives. The Computer Journal, 1964, 7(2): 155-162.
    [31] Pineda FJ. Generalization of back-propagation to recurrent neural networks. Physical Review Letters, 1987, 59(19): 2229.
    [32] Li Y, Tarlow D, Brockschmidt M, Zemel RJ. Gated graph sequence neural networks. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New York: Open Review, 2015. 1-20.
    [33] Cho K, Van Merriënboer B, Gulcehre C, Bahdanau D, Bougares F, Schwenk H, Bengio Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. In: Riloff E, ed. Proc. of the Empirical Methods Natural Language Processing (EMNLP). Stroudsburg: ACL, 2014. 1724-1734.
    [34] Dai H, Kozareva Z, Dai B, Smola A, Song L. Learning steady-states of iterative algorithms over graphs. In: Jebara T, ed. Proc. of the Int’l Conf. on Machine Learning (ICML). New York: ACM, 2018. 1106-1114.
    [35] Pareja A, Domeniconi G, Chen J, Ma T, Suzumura T, Kanezashi H, Kaler T, Schardl T, Leiserson C. Evolvegcn: Evolving graph convolutional networks for dynamic graphs. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2020. 5363-5370.
    [36] Qiu J, Dong Y, Ma H, Li J, Wang K, Tang J. Network embedding as matrix factorization: Unifying deepwalk, line, PTE and node2vec. In: Carmel D, ed. Proc. of the ACM Int’l Conf. on Web Search and Data Mining. New York: ACM, 2018. 459-467.
    [37] Wang D, Cui P, Zhu W. Structural deep network embedding. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery and Data Mining (KDD). New York: ACM, 2016. 1225-1234.
    [38] Cao S, Lu W, Xu Q. Deep neural networks for learning graph representations. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2016. 1-10.
    [39] Berg RVD, Kipf TN, Welling M. Graph convolutional matrix completion. arXiv preprint arXiv: 02263, 2017.
    [40] Kingma DP, Welling M. Auto-encoding variational bayes. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New York: Open Review, 2014. 1-10.
    [41] Kipf TN, Welling M. Variational graph auto-encoders. In: Gal Y, ed. Proc. of the NIPS Workshop on Bayesian Deep Learning. Cambridge: MIT Press, 2016. 1-14.
    [42] Pan S, Hu R, Long G, Jiang J, Yao L, Zhang C. Adversarially regularized graph autoencoder for graph embedding. In: Rosenschein JS, ed. Proc. of the Int’l Joint Conf. on Artificial Intelligence. MA: Morgan Kaufmann Publishers, 2018.
    [43] Goodfellow IJ, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, Courville A, Bengio Y. Generative adversarial networks. In: Lee DD, ed. Proc. of the Conf. on Neural Information Processing Systems (NIPS). Cambridge: MIT Press, 2014. 1-14.
    [44] Tu K, Cui P, Wang X, Yu PS, Zhu W. Deep recursive network embedding with regular equivalence. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2018. 2357-2366.
    [45] Hochreiter S, Schmidhuber J. Long short-term memory. Neural Computation, 1997, 9(8): 1735-1780.
    [46] Yu W, Zheng C, Cheng W, Aggarwal CC, Song D, Zong B, Chen H, Wang W. Learning deep network representations with adversarially regularized autoencoders. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2018. 2663-2671.
    [47] Bruna J, Zaremba W, Szlam A, LeCun YJ. Spectral networks and locally connected networks on graphs. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representation (ICLR). New York: OpenReview, 2014. 1-10.
    [48] Levie R, Monti F, Bresson X, Bronstein MM. Cayleynets: Graph convolutional neural networks with complex rational spectral filters. IEEE Trans. on Signal Processing, 2018, 67(1): 97-109.
    [49] Li R, Wang S, Zhu F, Huang J. Adaptive graph convolutional neural networks. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2018. 1-10.
    [50] Micheli A. Neural network for graphs: A contextual constructive approach. IEEE Trans. on Neural Networks, 2009, 20(3): 498-511.
    [51] Duvenaud D, Maclaurin D, Aguilera-Iparraguirre J, Gómez-Bombarelli R, Hirzel T, Aspuru-Guzik A, Adams RPJ. Convolutional networks on graphs for learning molecular fingerprints. In: Lee DD, ed. Proc. of the Conf. on Neural Information Processing Systems (NIPS). Cambridge: MIT Press, 2015. 2224-2232.
    [52] Kearnes S, McCloskey K, Berndl M, Pande V, Riley P. Molecular graph convolutions: Moving beyond fingerprints. Journal of Computer-aided Molecular Design, 2016, 30(8): 595-608.
    [53] Li Q, Han Z, Wu XM. Deeper insights into graph convolutional networks for semi-supervised learning. In: Proc. of the AAAI Conf. on Artificial Intelligence. New York: AAAI, 2018. 1-14.
    [54] Veličković P, Cucurull G, Casanova A, Romero A, Lio P, Bengio Y. Graph attention networks. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New York: Open Review, 2018. 1-10.
    [55] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I. Attention is all you need. In: Lee DD, ed. Proc. of the Conf. on Neural Information Processing Systems (NIPS). Cambridge: MIT Press, 2017. 1-10.
    [56] Xu K, Hu W, Leskovec J, Jegelka S. How powerful are graph neural networks? In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New York: Open Review, 2019. 1-10.
    [57] Qu Y, Bai T, Zhang W, Nie J, Tang J. An end-to-end neighborhood-based interaction model for knowledge-enhanced recommendation. In: Proc. of the 1st Int’l Workshop on Deep Learning Practice for High-dimensional Sparse Data. New York: ACM, 2019. 1-9.
    [58] Wang X, He X, Cao Y, Liu M, Chua TS. KGAT: Knowledge graph attention network for recommendation. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2019. 950-958.
    [59] Yang B, Yih WT, He X, Gao J, Deng L. Embedding entities and relations for learning and inference in knowledge bases. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations (ICLR). New York: Open Review, 2015. 1-10.
    [60] Zheng VW, Sha M, Li Y, Yang H, Fang Y, Zhang Z, Tan KL, Chang KCC. Heterogeneous embedding propagation for large-scale e-commerce user alignment. In: Aggarwal C, ed. Proc. of the 2018 IEEE Int’l Conf. on Data Mining (ICDM). Piscataway: IEEE, 2018. 1434-1439.
    [61] Hong H, Guo H, Lin Y, Yang X, Li Z, Ye J. An attention-based graph neural network for heterogeneous structural learning. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2020. 4132-4139.
    [62] Zhang C, Song D, Huang C, Swami A, Chawla NV. Heterogeneous graph neural network. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2019. 793-803.
    [63] Dettmers T, Minervini P, Stenetorp P, Riedel S. Convolutional 2D knowledge graph embeddings. In: Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2018. 1-10.
    [64] Zhao J, Wang X, Shi C, Hu B, Song G, Ye Y. Heterogeneous graph structure learning for graph neural networks. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence (AAAI). Menlo Park: AAAI, 2021. 1-12.
    [65] Jiang P, Han Y. Reasoning with heterogeneous graph alignment for video question answering. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2020. 11109-11116.
    [66] Bordes A, Usunier N, Garcia-Duran A, Weston J, Yakhnenko O. Translating embeddings for modeling multi-relational data. In: Neural Information Processing Systems (NIPS). 2013. 1-9.
    [67] Wang Z, Zhang J, Feng J, Chen Z. Knowledge graph embedding by translating on hyperplanes. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2014. 1112-1119.
    [68] Lin Y, Liu Z, Sun M, Liu Y, Zhu X. Learning entity and relation embeddings for knowledge graph completion. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2015. 2181-2187.
    [69] Nickel M, Tresp V, Kriegel HP. A three-way model for collective learning on multi-relational data. In: Jebara T, ed. Proc. of the ICML. New York: ACM, 2011. 809-816.
    [70] Kazemi SM, Poole DJ. Simple embedding for link prediction in knowledge graphs. In: Lee DD, ed. Neural Information Processing Systems (NIPS). Cambridge, 2018. 4289-4300.
    [71] Wang H, Zhao M, Xie X, Li W, Guo M. Knowledge graph convolutional networks for recommender systems. In: Liu L, ed. Proc. of the Web Conf. (WWW). New York: ACM, 2019. 3307-3313.
    [72] Zhu Z, Fan X, Chu X, Bi J. HGCN: A heterogeneous graph convolutional network-based deep learning model toward collective classification. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2020. 1161-1171.
    [73] Fu X, Zhang J, Meng Z, King I. MAGNN: metapath aggregated graph neural network for heterogeneous graph embedding. In: Liu L, ed. Proc. of the Web Conf. (WWW). New York: ACM, 2020. 2331-2341.
    [74] Yun S, Jeong M, Kim R, Kang J, Kim HJJ. Graph transformer networks. In: Lee DD, ed. Proc. of the Conf. on Neural Information Processing Systems (NIPS). Cambridge: MIT Press, 2019. 11983-11993.
    [75] Hu B, Shi C, Zhao WX, Yu PS. Leveraging meta-path based context for top-n recommendation with a neural co-attention model. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2018. 1531-1540.
    [76] Sun Z, Yang J, Zhang J, Bozzon A, Huang LK, Xu C. Recurrent knowledge graph embedding for effective recommendation. In: Proc. of the ACM Conf. on Recommender Systems. New York: ACM, 2018. 297-305.
    [77] Dong Y, Chawla NV, Swami A. metapath2vec: Scalable representation learning for heterogeneous networks. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery and Data Mining (KDD). New York: ACM, 2017. 135-144.
    [78] Fu TY, Lee WC, Lei Z. Hin2vec: Explore meta-paths in heterogeneous information networks for representation learning. In: Winslett MS, ed. Proc. of the Conf. on Information and Knowledge Management (CIKM). New York: ACM, 2017. 1797-1806.
    [79] Wang X, Zhang Y, Shi C. Hyperbolic heterogeneous information network embedding. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI, 2019. 5337-5344.
    [80] Chen T, Sun Y. Task-guided and path-augmented heterogeneous network embedding for author identification. In: Carmel D, ed. Proc. of the ACM Int’l Conf. on Web Search and Data Mining. New York: ACM, 2017. 295-304.
    [81] Yu X, Ren X, Sun Y, Sturt B, Khandelwal U, Gu Q, Norick B, Han J. Recommendation in heterogeneous information networks with implicit user feedback. In: Proc. of the ACM Conf. on Recommender Systems. New York: ACM, 2013. 347-350.
    [82] Yu X, Ren X, Sun Y, Gu Q, Sturt B, Khandelwal U, Norick B, Han J. Personalized entity recommendation: A heterogeneous information network approach. In: Carmel D, ed. Proc. of the ACM Int’l Conf. on Web Search and Data Mining. New York: ACM, 2014. 283-292.
    [83] Zhang J, Shi X, Xie J, Ma H, King I, Yeung DYJ. GAAN: Gated attention networks for learning on large and spatiotemporal graphs. In: Proc. of the UAI. Arlington: AUAI, 2018. 339-349.
    [84] Seo Y, Defferrard M, Vandergheynst P, Bresson X. Structured sequence modeling with graph convolutional recurrent networks. In: Lee DD, ed. Proc. of the Int’l Conf. on Neural Information Processing. Berlin: Springer, 2018. 362-373.
    [85] Narayan A, Roe PHNJIP. Learning graph dynamics using deep neural networks. IFAC-PapersOnLine, 2018, 51(2): 433-438.
    [86] Niepert M, Ahmed M, Kutzkov K. Learning convolutional neural networks for graphs. In: Jebara T, ed. Proc. of the Int’l Conf. on Machine Learning (ICML). New York: ACM, 2016. 2014-2023.
    [87] Li Y, Yu R, Shahabi C, Liu YJ. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. In: CoRR. 2017.
    [88] Goyal P, Kamra N, He X, Liu Y. Dyngem: Deep embedding method for dynamic graphs. In: Rosenschein JS, ed. Proc. of the IJCAI Int’l Workshop on Representation Learning for Graphs. Burlington: Morgan Kaufmann Publishers, 2017. 1-10.
    [89] Chen T, Goodfellow I, Shlens JJ. Net2net: Accelerating learning via knowledge transfer. In: Sainath T, ed. Proc. of the ICLR. New York: Open Review, 2016. 1-10.
    [90] Goyal P, Chhetri SR, Canedo A. dyngraph2vec: Capturing network dynamics using dynamic graph representation learning. Knowledge-based Systems, 2020, 187: 104816.
    [91] Rahman M, Al Hasan M. Link prediction in dynamic networks using graphlet. In: Proc. of the Joint European Conf. on Machine Learning and Knowledge Discovery in Databases. Berlin: Springer, 2016. 394-409.
    [92] Bonner S, Brennan J, Kureshi I, Theodoropoulos G, McGough AS, Obara B. Temporal graph offset reconstruction: Towards temporally robust graph representation learning. In: Proc. of the 2018 IEEE Int’l Conf. on Big Data (Big Data). IEEE, 2018. 3737-3746.
    [93] Wu T, Khan A, Gao H, Li CJ. Efficiently embedding dynamic knowledge graphs. In: CoRR. 2019. 1-14.
    [94] Ghosh P, Yao Y, Davis L, Divakaran A. Stacked spatio-temporal graph convolutional networks for action segmentation. In: Proc. of the IEEE/CVF Winter Conf. on Applications of Computer Vision. Piscataway: IEEE, 2020. 576-585.
    [95] Yu B, Yin H, Zhu ZJ. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. In: Rosenschein JS, ed. Proc. of the 27th Int’l Joint Conf. on Artificial Intelligence Burlington: Morgan Kaufmann Publishers, 2018. 3634-3640.
    [96] Guo S, Lin Y, Feng N, Song C, Wan H. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Palo Alto: AAAI Press, 2019. 922-929.
    [97] Heidari N, Iosifidis A. Progressive spatio-temporal graph convolutional network for skeleton-based human action recognition. In: Proc. of the ICASSP 2021-2021 IEEE Int’l Conf. on Acoustics, Speech and Signal Processing (ICASSP). Piscataway: IEEE, 2021. 3220-3224.
    [98] Bolla MJDM. Spectra, euclidean representations and clusterings of hypergraphs. Discrete Mathematics, 1993, 117(1-3): 19-39.
    [99] Rodríguez JA. On the Laplacian eigenvalues and metric parameters of hypergraphs. Linear Multilinear Algebra, 2002, 50(1): 1-14.
    [100] Zhou D, Huang J, Schölkopf B. Beyond pairwise classification and clustering using hypergraphs. 2005. https://citeseerx.ist.psu.edu/ viewdoc/download?doi=10.1.1.113.6516&rep=rep1&type=pdf
    [101] Tan S, Guan Z, Cai D, Qin X, Bu J, Chen C. Mapping users across networks by manifold alignment on hypergraph. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI Press, 2014. 159-165.
    [102] Zhu Y, Guan Z, Tan S, Liu H, Cai D, He XJN. Heterogeneous hypergraph embedding for document recommendation. Neurocomputing, 2016, 216: 150-162.
    [103] Tan S, Bu J, Chen C, Xu B, Wang C, He X. Using rich social media information for music recommendation via hypergraph model. ACM Trans. on Multimedia Computing, Communications, 2011, 1-22.
    [104] Fatemi B, Taslakian P, Vazquez D, Poole DJ. Knowledge hypergraphs: Extending knowledge graphs beyond binary relations. arXiv: 1906.00137v1, 2019.
    [105] Hwang T, Tian Z, Kuangy R, Kocher JP. Learning on weighted hypergraphs to integrate protein interactions and gene expressions for cancer outcome prediction. In: Proc. of the IEEE Int’l Conf. on Data Mining. Piscataway: IEEE, 2008. 293-302.
    [106] Klamt S, Haus UU, Theis F. Hypergraphs and cellular networks. PLoS Computational Biology, 2009, 5(5): e1000385.
    [107] Huang Y, Liu Q, Zhang S, Metaxas DN. Image retrieval via probabilistic hypergraph ranking. In: Proc. of the 2010 IEEE Computer Society Conf. on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2010. 3376-3383.
    [108] Agarwal S, Branson K, Belongie S. Higher order learning with graphs. In: Jebara T, ed. Proc. of the 23rd Int’l Conf. on Machine Learning. New York: ACM, 2006. 17-24.
    [109] Shi J, Malik J. Normalized cuts and image segmentation. IEEE Trans. on Pattern Analysis Machine Intelligence, 2000, 22(8): 888-905.
    [110] Feng Y, You H, Zhang Z, Ji R, Gao Y. Hypergraph neural networks. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI Press, 2019. 3558-3565.
    [111] Jiang J, Wei Y, Feng Y, Cao J, Gao Y. Dynamic hypergraph neural networks. In: Rosenschein JS, ed. Proc. of the IJCAI. Burlington: Morgan Kaufmann Publishers, 2019. 2635-2641.
    [112] Chen H, Yin H, Sun X, Chen T, Gabrys B, Musial K. Multi-level graph convolutional networks for cross-platform anchor link prediction. In: Krishnapuram B, ed. Proc. of the 26th ACM SIGKDD Int’l Conf. on Knowledge Discovery & Data Mining. New York: ACM, 2020. 1503-1511.
    [113] Yi J, Park J. Hypergraph convolutional recurrent neural network. In: Krishnapuram B, ed. Proc. of the 26th ACM SIGKDD Int’l Conf. on Knowledge Discovery & Data Mining. New York: ACM, 2020. 3366-3376.
    [114] Chan THH, Liang Z. Generalizing the hypergraph laplacian via a diffusion process with mediators. Theoretical Computer Science, 2020, 806: 416-428.
    [115] Yadati N, Nitin V, Nimishakavi M, Yadav P, Louis A, Talukdar P. NHP: Neural hypergraph link prediction. In: Winslett MS, ed. Proc. of the 29th ACM Int’l Conf. on Information & Knowledge Management. New York: ACM, 2020. 1705-1714.
    [116] Scheinerman ER, Ullman DH. Fractional Graph Theory: A Rational Approach to the Theory of Graphs. Courier Corporation, 2011.
    [117] Yang C, Wang R, Yao S, Abdelzaher T. Hypergraph learning with line expansion. arXiv preprint arXiv: 04843, 2020.
    [118] Ding K, Wang J, Li J, Li D, Liu HJ. Be More with less: Hypergraph attention networks for inductive text classification. In: Proc. of the EMNLP. Stroudsburg PA: ACL, 2020. 4927-4936.
    [119] Zhang R, Zou Y, Ma JJ. Hyper-SAGNN: A self-attention based graph neural network for hypergraphs. In: Proc. of the Int’l Conf. on Learning Representations (ICLR). Open Review, 2019. 1-10.
    [120] Liu Z, Chen C, Yang X, Zhou J, Li X, Song L. Heterogeneous graph neural networks for malicious account detection. In: Proc. of the 27th ACM Int’l Conf. on Information and Knowledge Management. New York: ACM, 2018. 2077-2085.
    [121] Chen F, Gao Y, Cao D, Ji R. Multimodal hypergraph learning for microblog sentiment prediction. In: Proc. of the 2015 IEEE Int’l Conf. on Multimedia and Expo (ICME). Piscataway: IEEE, 2015. 1-6.
    [122] Gao Y, Wang M, Tao D, Ji R, Dai Q. 3-D object retrieval and recognition with hypergraph analysis. IEEE Trans. on Image Processing, 2012, 21(9): 4290-4303.
    [123] Huang X, Fang Q, Qian S, Sang J, Li Y, Xu C. Explainable interaction-driven user modeling over knowledge graph for sequential recommendation. In: Proc. of the 27th ACM Int’l Conf. on Multimedia. New York: ACM, 2019. 548-556.
    [124] Ma W, Zhang M, Cao Y, Jin W, Wang C, Liu Y, Ma S, Ren X. Jointly learning explainable rules for recommendation with knowledge graph. In: Liu L, ed. Proc. of the World Wide Web Conf. New York: ACM, 2019. 1210-1221.
    [125] Liu Z, Chen C, Li L, Zhou J, Li X, Song L, Qi Y. Geniepath: Graph neural networks with adaptive receptive paths. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI Press, 2019. 4424-4431.
    [126] Bahnsen AC, Stojanovic A, Aouada D, Ottersten B. Cost sensitive credit card fraud detection using Bayes minimum risk. In: Proc. of the 12th Int’l Conf. on Machine Learning and Applications. Piscataway: IEEE, 2013. 333-338.
    [127] Hooi B, Shah N, Beutel A, Günnemann S, Akoglu L, Kumar M, Makhija D, Faloutsos C. Birdnest: Bayesian inference for ratings-fraud detection. In: Proc. of the 2016 SIAM Int’l Conf. on Data Mining. Philadelphia: SIAM, 2016. 495-503.
    [128] Jia Z, Abujabal A, Saha RR, Strötgen J, Weikum G. Tequila: Temporal question answering over knowledge bases. In: Proc. of the 27th ACM Int’l Conf. on Information and Knowledge Management. New York: ACM, 2018. 1807-1810.
    [129] Teney D, Liu L, van Den Hengel A. Graph-structured representations for visual question answering. In: Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2017. 1-9.
    [130] Liang Z, Yang M, Deng L, Wang C, Wang B. Hierarchical depthwise graph convolutional neural network for 3D semantic segmentation of point clouds. In: Proc. of the 2019 Int’l Conf. on Robotics and Automation (ICRA). Piscataway: IEEE, 2019. 8152-8158.
    [131] Gao D, Li K, Wang R, Shan S, Chen X. Multi-modal graph neural network for joint reasoning on vision and scene text. In: Proc. of the IEEE/CVF Conf. on Computer Vision and Pattern Recognition. IEEE, 2020. 12746-12756.
    [132] Te G, Hu W, Zheng A, Guo Z. RGCNN: Regularized graph CNN for point cloud segmentation. In: Proc. of the 26th ACM Int’l Conf. on Multimedia. New York: ACM, 2018. 746-754.
    [133] Landrieu L, Simonovsky M. Large-scale point cloud semantic segmentation with superpoint graphs. In: Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2018. 4558-4567.
    [134] Chen J, Zhang J, Xu X, Fu C, Zhang D, Zhang Q, Xuan Q. E-LSTM-D: A deep learning framework for dynamic network link prediction. IEEE Trans. on Systems, Man, Cybernetics: Systems, 2021, 51(6): 3699-3712.
    [135] Lei K, Qin M, Bai B, Zhang G, Yang M. GCN-gan: A non-linear temporal link prediction model for weighted dynamic networks. In: Proc. of the IEEE INFOCOM 2019-IEEE Conf. on Computer Communications. Piscataway: IEEE, 2019. 388-396.
    [136] Yang M, Liu J, Chen L, Zhao Z, Chen X, Shen Y. An advanced deep generative framework for temporal link prediction in dynamic networks. IEEE Trans. on Cybernetics, 2019, 50(12): 4946-4957.
    [137] Ahmed NM, Chen L. An efficient algorithm for link prediction in temporal uncertain social networks. Information Sciences, 2016, 331: 120-136.
    [138] Divakaran A, Mohan A. Temporal link prediction: A survey. New Generation Computing, 2019, 38(1): 213-258.
    [139] Yu L, Du B, Hu X, Sun L, Han L, Lv W. Deep spatio-temporal graph convolutional network for traffic accident prediction. Neurocomputing, 2021, 423: 135-147.
    [140] Xiao G, Wang R, Zhang C, Ni A. Demand prediction for a public bike sharing program based on spatio-temporal graph convolutional networks. Multimedia Tools Applications, 2020, 80(15): 22907-22925.
    [141] Dai R, Xu S, Gu Q, Ji C, Liu K. Hybrid spatio-temporal graph convolutional network: Improving traffic prediction with navigation data. In: Krishnapuram B, ed. Proc. of the 26th ACM SIGKDD Int’l Conf. on Knowledge Discovery & Data Mining. New York: ACM, 2020. 3074-3082.
    [142] Li B, Li X, Zhang Z, Wu F. Spatio-temporal graph routing for skeleton-based action recognition. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. Menlo Park: AAAI Press, 2019. 8561-8568.
    [143] Konstantinova EV, Skorobogatov VA. Application of hypergraph theory in chemistry. Discrete Mathematics, 2001, 235(1-3): 365-383.
    [144] Zhang Z, Lin H, Zhao X, Ji R, Gao Y. Inductive multi-hypergraph learning and its application on view-based 3D object classification. IEEE Trans. on Image Processing, 2018, 27(12): 5957-5968.
    [145] Kim ES, Kang WY, On KW, Heo YJ, Zhang BT. Hypergraph attention networks for multimodal learning. In: Proc. of the IEEE/ CVF Conf. on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2020. 14581-14590.
    [146] Zhang S, Cui S, Ding Z. Hypergraph spectral analysis and processing in 3D point cloud. IEEE Trans. on Image Processing, 2020, 30: 1193-1206.
    [147] Shao W, Peng Y, Zu C, Wang M, Zhang D. Hypergraph based multi-task feature selection for multimodal classification of Alzheimer’s disease. Computerized Medical Imaging Graphics, 2020, 80: 101663.
    [148] Carlson A, Betteridge J, Kisiel B, Settles B, Hruschka E, Mitchell T. Toward an architecture for never-ending language learning. In: Zilberstein S, ed. Proc. of the AAAI Conf. on Artificial Intelligence. New York: AAAI, 2010. 1-12.
    [149] Dong X, Gabrilovich E, Heitz G, Horn W, Lao N, Murphy K, Strohmann T, Sun S, Zhang W. Knowledge vault: A Web-scale approach to probabilistic knowledge fusion. In: Krishnapuram B, ed. Proc. of the 20th ACM SIGKDD Int’l Conf. on Knowledge Discovery and Data Mining. New York: ACM, 2014. 601-610.
    [150] Tang X, Yuan R, Li Q, Wang T, Yang H, Cai Y, Song H. Timespan-aware dynamic knowledge graph embedding by incorporating temporal evolution. IEEE Access, 2020, 8: 6849-6860.
    [151] Fatemi B, Taslakian P, Vazquez D, Poole D. Knowledge hypergraphs: Extending knowledge graphs beyond binary relations. arXiv preprint, 2019.
    [152] https://dblp.uni-trier.de
    [153] http://dl.acm.org/
    [154] https://ericdongyx.github.io/metapath2vec/m2v.html
    [155] Yang Z, Cohen W, Salakhudinov R. Revisiting semi-supervised learning with graph embeddings. In: Proc. of the Int’l Conf. on Machine Learning. PMLR, 2016. 40-48.
    [156] Goldberg Y, Levy O. word2vec Explained: deriving Mikolov et al.’s negative-sampling word-embedding method. arXiv preprint arXiv: 02263, 2014.
    [157] https://www.imdb.com/
    [158] https://grouplens.org/datasets/movielens/10m/
    [159] http://socialcomputing.asu.edu
    [160] https://www.yelp.com/dataset/challenge
    [161] https://www.ncbi.nlm.nih.gov/pubmed/
    [162] http://www.freebase.com/
    [163] Socher R, Chen D, Manning CD, Ng A. Reasoning with neural tensor networks for knowledge base completion. In: Lee DD, ed. Proc. of the NIPS. Cambridge: MIT Press, 2013. 926-934.
    [164] Bordes A, Usunier N, Garcia-Duran A, Weston J, Yakhnenko O. Translating embeddings for modeling multi-relational data. In: Lee DD, ed. Proc. of the NIPS. Cambridge: MIT Press, 2013. 2787-2795.
    [165] Vrandečić D, Krötzsch M. Wikidata: A free collaborative knowledgebase. Communications of the ACM, 2014, 57(10): 78-85.
    [166] Suchanek FM, Kasneci G, Weikum G. Yago: A core of semantic knowledge. In: Proc. of the 16th Int’l Conf. on World Wide Web. New York: ACM, 2007. 697-706.
    [167] http://realitycommons.media.mit.edu/socialevolution.html
    [168] Trivedi R, Farajtabar M, Biswal P, Zha H. DyRep: Learning representations over dynamic graphs. In: Sainath T, ed. Proc. of the Int’l Conf. on Learning Representations. New York: Open Review, 2019.
    [169] http://konect.uni-koblenz.de/networks/opsahl-ucsocial
    [170] Chen C, Petty K, Skabardonis A, Varaiya P, Jia Z. Freeway performance measurement system: mining loop detector data. Transportation Research Record, 2001, 1748(1): 96-102.
    [171] Kay W, Carreira J, Simonyan K, Zhang B, Hillier C, Vijayanarasimhan S, Viola F, Green T, Back T, Natsev P. The kinetics human action video dataset. arXiv preprint arXiv: 06950, 2017.
    [172] https://archive.ics.uci.edu/ml/datasets/Yeast
    [173] Chen DY, Tian XP, Shen YT, Ouhyoung M. On visual similarity based 3D model retrieval. Computer Graphics Forum, 2003, 22(3): 223-232.
    [174] Wu Z, Song S, Khosla A, Yu F, Zhang L, Tang X, Xiao J. 3D shapenets: A deep representation for volumetric shapes. In: Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition. Piscataway: IEEE, 2015. 1912-1920.
    [175] https://grouplens.org/datasets/hetrec-2011/
    [176] http://jmcauley.ucsd.edu/data/amazon
    [177] http://snap.stanford.edu/data/soc-RedditHyperlinks.html
    [178] Chiang WL, Liu X, Si S, Li Y, Bengio S, Hsieh CJ. Cluster-GCN: An efficient algorithm for training deep and large graph convolutional networks. In: Krishnapuram B, ed. Proc. of the Int’l Conf. on Knowledge Discovery & Data Mining (KDD). New York: ACM, 2019. 257-266.
    [179] Yuan H, Yu H, Gui S, Ji S. Explainability in graph neural networks: A taxonomic survey. arXiv preprint arXiv: 15445, 2020.
Cite this article:

Liu J, Shang XQ, Song LY, Tan YC. Progress of graph neural networks on complex graph mining. Ruan Jian Xue Bao/Journal of Software, 2022, 33(10): 3582-3618 (in Chinese).

History
  • Received: 2021-07-19
  • Revised: 2021-08-30
  • Published online: 2022-02-22
  • Published in issue: 2022-10-06