A Network Representation Learning Model Integrating Accompanying Information

Author biographies:

Du Hangyuan (1985-), male, PhD, associate professor, CCF professional member; main research interest: graph machine learning. Wang Wenjian (1968-), female, PhD, professor, doctoral supervisor, CCF distinguished member; main research interests: machine learning and data mining. Bai Liang (1982-), male, PhD, professor, doctoral supervisor, CCF professional member; main research interests: unsupervised learning and social network mining.

Corresponding author:

Wang Wenjian, wjwang@sxu.edu.cn

CLC number:

TP181

Funding:

National Natural Science Foundation of China (61902227, 62076154, U1805263, 61773247); Natural Science Foundation of Shanxi Province (201901D211192); Scientific and Technological Innovation Project of Higher Education Institutions in Shanxi Province (2019L0039)



    Abstract:

    Network representation learning is regarded as a key technique for improving the efficiency of information network analysis. It aims to map each node of a network to a low-dimensional vector in a latent space such that these vectors efficiently preserve the structure and characteristics of the original network. In recent years, many studies have mined network topology and node attributes in depth and achieved good results in a number of network analysis tasks. In fact, beyond these two kinds of key information, the accompanying information that widely exists in real-world networks reflects various complex and subtle relationships in the network and plays an important role in its formation and evolution. To improve the effectiveness of network representation learning, this study proposes a model named NRLIAI that integrates accompanying information. The model employs a variational auto-encoder (VAE) as the framework for propagating and processing information: in the encoder, graph convolutional operators aggregate and map network topology and node attributes; in the decoder, the network is reconstructed; and the accompanying information is integrated to guide the representation learning process. The model overcomes the inability of existing methods to exploit accompanying information effectively, and it also possesses a certain generative ability, which alleviates overfitting during representation learning. On real-world network datasets, NRLIAI is compared with several existing methods on node classification and link prediction tasks, and the experimental results verify the effectiveness of the proposed model.
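The abstract describes NRLIAI's backbone as a VAE whose encoder aggregates topology and node attributes with graph convolutional operators and whose decoder reconstructs the network. As a minimal, hedged sketch of that VGAE-style forward pass (in the spirit of refs. [22, 29]), not the authors' implementation, the NumPy code below uses hypothetical toy shapes and untrained random weights; the accompanying-information guidance term is omitted because the abstract does not specify its exact form:

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n, f, h, d = 5, 4, 8, 2  # nodes, input features, hidden dim, latent dim

# Toy symmetric adjacency and node-attribute matrix (hypothetical data).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(n, f))

# Untrained weights, scaled small to keep activations in a stable range.
W0 = 0.1 * rng.normal(size=(f, h))
W_mu = 0.1 * rng.normal(size=(h, d))
W_logvar = 0.1 * rng.normal(size=(h, d))

A_norm = normalize_adj(A)
H = np.maximum(A_norm @ X @ W0, 0.0)                       # GCN layer + ReLU
mu = A_norm @ H @ W_mu                                     # encoder mean
logvar = A_norm @ H @ W_logvar                             # encoder log-variance
Z = mu + rng.normal(size=mu.shape) * np.exp(0.5 * logvar)  # reparameterization
A_rec = sigmoid(Z @ Z.T)                                   # inner-product decoder
```

In the full model, training would jointly optimize the reconstruction and KL terms while the accompanying information steers where the node representations Z land in the latent space.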

    References
    [1] Cui P, Wang X, Pei J, Zhu WW. A survey on network embedding. IEEE Transactions on Knowledge and Data Engineering, 2019, 31(5): 833–852. [doi: 10.1109/TKDE.2018.2849727]
    [2] Tu CC, Yang C, Liu ZY, Sun MS. Network representation learning: An overview. Scientia Sinica Informationis, 2017, 47(8): 980–996 (in Chinese with English abstract). [doi: 10.1360/N112017-00145]
    [3] Goyal P, Ferrara E. Graph embedding techniques, applications, and performance: A survey. Knowledge-Based Systems, 2018, 151: 78–94. [doi: 10.1016/j.knosys.2018.03.022]
    [4] Perozzi B, Al-Rfou R, Skiena S. DeepWalk: Online learning of social representations. In: Proc. of the 20th ACM SIGKDD Int’l Conf. on Knowledge Discovery and Data Mining. New York: ACM, 2014. 701–710.
    [5] Tang J, Qu M, Wang MZ, Zhang M, Yan J, Mei QZ. LINE: Large-scale information network embedding. In: Proc. of the 24th Int’l Conf. on World Wide Web. Florence: Int’l World Wide Web Conf. Steering Committee, 2015. 1067–1077.
    [6] Berg RVD, Kipf TN, Welling M. Graph convolutional matrix completion. In: Proc. of the 24th ACM SIGKDD Int’l Conf. on Knowledge Discovery and Data Mining. New York: ACM, 2018.
    [7] Jin D, Wang KZ, Zhang G, Jiao PF, He DX, Fogelman-Soulié F, Huang X. Detecting communities with multiplex semantics by distinguishing background, general, and specialized topics. IEEE Transactions on Knowledge and Data Engineering, 2020, 32(11): 2144–2158. [doi: 10.1109/TKDE.2019.2937298]
    [8] Hamilton WL, Ying R, Leskovec J. Representation learning on graphs: Methods and applications. IEEE Data Engineering Bulletin, 2017, 40(3): 52–74.
    [9] Tang L, Liu H. Relational learning via latent social dimensions. In: Proc. of the 15th ACM SIGKDD Int’l Conf. on Knowledge Discovery and Data Mining. Paris: ACM, 2009. 817–826.
    [10] Yang C, Liu ZY, Zhao DL, Sun MS, Chang EY. Network representation learning with rich text information. In: Proc. of the 24th Int’l Conf. on Artificial Intelligence. Buenos Aires: AAAI Press, 2015. 2111–2117.
    [11] Grover A, Leskovec J. Node2vec: Scalable feature learning for networks. In: Proc. of the 22nd ACM SIGKDD Int’l Conf. on Knowledge Discovery and Data Mining. San Francisco: ACM, 2016. 855–864.
    [12] Li JZ, Zhu J, Zhang B. Discriminative deep random walk for network classification. In: Proc. of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin: Association for Computational Linguistics, 2016. 1004–1013.
    [13] Pan SR, Wu J, Zhu XQ, Zhang CQ, Wang Y. Tri-party deep network representation. In: Proc. of the 25th Int’l Joint Conf. on Artificial Intelligence. New York: AAAI Press, 2016. 1895–1901.
    [14] Chen L, Zhu PS, Qian TY, Zhu H, Zhou J. Edge sampling based network embedding model. Ruan Jian Xue Bao/Journal of Software, 2018, 29(3): 756–771 (in Chinese with English abstract). http://www.jos.org.cn/1000-9825/5435.htm
    [15] Wang HW, Wang J, Wang JL, Zhao M, Zhang WN, Zhang FZ, Xie X, Guo MY. GraphGAN: Graph representation learning with generative adversarial nets. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence. New Orleans: AAAI, 2018. 2508–2515.
    [16] Wang SH, Aggarwal C, Tang JL, Liu H. Attributed signed network embedding. In: Proc. of the 2017 ACM on Conf. on Information and Knowledge Management. Singapore: ACM, 2017. 137–146.
    [17] Cao SS, Lu W, Xu QK. Deep neural networks for learning graph representations. In: Proc. of the 30th AAAI Conf. on Artificial Intelligence. Phoenix: AAAI, 2016. 1145–1152.
    [18] Wang DX, Cui P, Zhu WW. Structural deep network embedding. In: Proc. of the 22nd ACM SIGKDD Int’l Conf. on Knowledge Discovery and Data Mining. San Francisco: ACM, 2016. 1225–1234.
    [19] Bruna J, Zaremba W, Szlam A, LeCun Y. Spectral networks and locally connected networks on graphs. In: Proc. of the 2nd Int’l Conf. on Learning Representations. Banff, 2014.
    [20] Hamilton WL, Ying R, Leskovec J. Inductive representation learning on large graphs. In: Proc. of the 31st Int’l Conf. on Neural Information Processing Systems. Long Beach: Curran Associates Inc., 2017. 1025–1035.
    [21] Kipf TN, Welling M. Semi-supervised classification with graph convolutional networks. In: Proc. of the 5th Int’l Conf. on Learning Representations. Toulon, 2017.
    [22] Kipf TN, Welling M. Variational graph auto-encoders. In: Proc. of the NIPS 2016 Workshop on Bayesian Deep Learning. Barcelona, 2016.
    [23] Feng R, Yang Y, Hu WJ, Wu F, Zhang YT. Representation learning for scale-free networks. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence. New Orleans: AAAI, 2018. 282–289.
    [24] Chen HC, Perozzi B, Hu YF, Skiena S. HARP: Hierarchical representation learning for networks. In: Proc. of the 32nd AAAI Conf. on Artificial Intelligence. New Orleans: AAAI, 2018. 2127–2134.
    [25] Li CZ, Li ZJ, Wang SZ, Yang Y, Zhang XM, Zhou JS. Semi-supervised network embedding. In: Proc. of the 22nd Int’l Conf. on Database Systems for Advanced Applications. Suzhou: Springer, 2017. 131–147.
    [26] Huang X, Li JD, Hu X. Label informed attributed network embedding. In: Proc. of the 10th ACM Int’l Conf. on Web Search and Data Mining. Cambridge: ACM, 2017. 731–739.
    [27] Yang ZL, Cohen WW, Salakhutdinov R. Revisiting semi-supervised learning with graph embeddings. In: Proc. of the 33rd Int’l Conf. on Int’l Conf. on Machine Learning. New York: JMLR.org, 2016. 40–48.
    [28] He T, Gao LL, Song JK, Wang X, Huang KJ, Li YF. SNEQ: Semi-supervised attributed network embedding with attention-based quantisation. In: Proc. of the 34th AAAI Conf. on Artificial Intelligence. New York: AAAI, 2020. 4091–4098.
    [29] Kingma DP, Welling M. Auto-encoding variational Bayes. In: Proc. of the 2nd Int’l Conf. on Learning Representations. Banff, 2014.
Cite this article:

Du HY, Wang WJ, Bai L. A network representation learning model integrating accompanying information. Ruan Jian Xue Bao/Journal of Software, 2023, 34(6): 2749–2764 (in Chinese with English abstract).
History
  • Received: 2021-03-09
  • Revised: 2021-07-16
  • Published online: 2022-07-22
  • Published in print: 2023-06-06
Copyright © Institute of Software, Chinese Academy of Sciences
Address: 4 South Fourth Street, Zhongguancun, Haidian District, Beijing 100190
Tel: 010-62562563; Fax: 010-62562533; Email: jos@iscas.ac.cn