Temporal Knowledge Graph Reasoning Based on Diffusion Probability Distribution
Authors:
Author profiles:

Zhou Guangyou (1983-), male, Ph.D., professor, Ph.D. supervisor, CCF professional member, research interests: natural language processing and information retrieval; Li Pengfei (1997-), male, master's degree, research interests: natural language processing and computer science and technology; Xie Penghui (1998-), male, master's degree, research interests: natural language processing and computer science and technology; Luo Changyin (1983-), male, Ph.D., associate professor, CCF professional member, research interests: big data management and data mining.

Corresponding author:

Zhou Guangyou, E-mail: gyzhou@mail.ccnu.edu.cn

CLC number:

TP18

Funding:

National Natural Science Foundation of China (61972173); Fundamental Research Funds for the Central Universities (CCNU22QN015); Wuhan Knowledge Innovation Program Basic Research Project (2022010801010278)


Temporal Knowledge Graph Reasoning Based on Diffusion Probability Distribution
    Abstract:

    Temporal knowledge graph reasoning aims to complete the missing links (facts) in a knowledge graph, where every fact is bound to a timestamp. The dynamic variational framework based on the variational autoencoder (VAE) shows unique advantages on this task: by jointly modeling entities and relations with Gaussian distributions, it not only offers strong interpretability but also handles complex probability distributions. However, traditional VAE-based methods are prone to overfitting during training and therefore cannot accurately capture the semantic evolution of entities over time. To address this problem, this study proposes a temporal knowledge graph reasoning model based on a diffusion probability distribution. Specifically, the model builds a bi-directional iterative process that divides entity semantic modeling into multiple sub-modules. Each sub-module consists of a forward noising transformation and a backward Gaussian sampling step and is responsible for modeling one small step of the semantic evolution of entities. Compared with VAE-based methods, jointly modeling with multiple sub-modules explicitly learns the dynamic representation of entity semantics in the metric space over time and thus yields more accurate modeling. In terms of the MRR metric, the proposed model outperforms the VAE-based method by 4.18% and 1.87% on the Yago11k and Wikidata12k datasets, and by 1.63% and 2.48% on the ICEWS14 and ICEWS05-15 datasets, respectively.
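The forward-noising and backward-sampling pair described in the abstract follows the standard denoising-diffusion formulation. The NumPy sketch below is illustrative only: the function names are hypothetical, and the learned mean predictor of the reverse step (a network conditioned on relations and timestamps in the actual model) is replaced by an identity placeholder. It shows only the Gaussian form of the two directions of one sub-module.

```python
import numpy as np

def forward_noise(x_prev, beta, rng):
    # Forward noising: q(x_t | x_{t-1}) = N(sqrt(1 - beta) * x_{t-1}, beta * I)
    return np.sqrt(1.0 - beta) * x_prev + np.sqrt(beta) * rng.standard_normal(x_prev.shape)

def reverse_sample(mean_pred, beta, rng):
    # Backward Gaussian sampling: x_{t-1} ~ N(mean_pred, beta * I);
    # in the full model mean_pred would come from a learned network.
    return mean_pred + np.sqrt(beta) * rng.standard_normal(mean_pred.shape)

rng = np.random.default_rng(0)
T, dim = 8, 16                       # number of sub-modules (steps), embedding size
betas = np.linspace(1e-4, 0.02, T)   # small noise schedule, one beta per sub-module

x = rng.standard_normal(dim)         # toy entity embedding at one timestamp
for beta in betas:                   # forward pass: perturb step by step
    x = forward_noise(x, beta, rng)
for beta in betas[::-1]:             # backward pass: denoise step by step
    x = reverse_sample(x, beta, rng) # placeholder mean: the current noisy state
```

Because each beta is small, every sub-module models only a slight perturbation, which is what lets the chain of sub-modules track a gradual semantic evolution rather than one large jump.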
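MRR, the metric used in the comparison above, is the mean reciprocal rank: for each test query the correct entity's rank is inverted, and the inverses are averaged. A minimal self-contained sketch (the `mrr` helper is ours, not from the paper):

```python
def mrr(ranks):
    # Mean Reciprocal Rank: average of 1/rank of the correct entity per query.
    return sum(1.0 / r for r in ranks) / len(ranks)

# e.g., the correct entity is ranked 1st, 2nd, and 4th across three queries:
print(round(mrr([1, 2, 4]), 4))  # 0.5833
```

Higher is better, with 1.0 meaning the correct entity is ranked first for every query.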

Cite this article:

Zhou GY, Li PF, Xie PH, Luo CY. Temporal knowledge graph reasoning based on diffusion probability distribution. Ruan Jian Xue Bao/Journal of Software, 2024, 35(11): 5083-5097 (in Chinese with English abstract).

History
  • Received: 2023-02-28
  • Revised: 2023-04-25
  • Published online: 2023-11-15
  • Published: 2024-11-06