Abstract: Knowledge graphs (KGs) store large amounts of structured knowledge and semantic information and have been widely used by knowledge-powered intelligent applications. As these applications develop rapidly, their requirements for knowledge also change. A single KG usually suffers from incompleteness and therefore cannot meet these requirements, which creates an urgent demand for supporting new data sources and fusing multi-sourced knowledge. The conventional paradigm for KG representation learning and application considers only a single KG and ignores knowledge transfer between different sources. Joint representation learning on multi-sourced KGs can improve performance, but it does not support extended representation learning of new KGs. To resolve these issues, this paper presents a new paradigm: lifelong representation learning on multi-sourced KGs. Given a sequence of multi-sourced KGs, lifelong representation learning aims to benefit from previously learned KGs and embedding models when learning a new KG. To this end, this study proposes a lifelong learning framework based on linked entity replay. First, it designs a Transformer-based KG embedding model that leverages relation correlations for link prediction between entities. Second, it proposes a linked subgraph generation method that uses entity alignment between different sources to build the subgraph and replays the linked entities to enable lifelong learning and knowledge transfer. Finally, it uses a dynamic model structure, storing model parameters and embeddings for each KG, to avoid catastrophic forgetting. Experiments on benchmarks show that the proposed KG embedding model achieves state-of-the-art performance in link prediction, and that the lifelong representation learning framework is more effective and efficient in multi-sourced knowledge transfer than the baselines.
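The lifelong training loop described in the abstract can be sketched at a very high level: each new KG in the sequence is trained on its own triples plus a replayed subgraph of triples from earlier KGs that touch aligned (linked) entities, while per-KG parameters are kept separately to avoid catastrophic forgetting. The following is a minimal illustrative sketch; all names (`build_replay_subgraph`, `lifelong_train`, the dictionary-based parameter store) are assumptions for exposition, not the paper's actual implementation, and the real framework trains a Transformer-based embedding model rather than merely collecting triples.

```python
# Hypothetical sketch of lifelong representation learning with
# linked-entity replay. Function and variable names are illustrative
# assumptions, not the paper's implementation.

def build_replay_subgraph(prev_triples, aligned_entities):
    """Keep triples from earlier KGs whose head or tail entity is
    linked (via entity alignment) to the new KG."""
    linked = set(aligned_entities)
    return [(h, r, t) for (h, r, t) in prev_triples
            if h in linked or t in linked]

def lifelong_train(kg_sequence, alignments):
    """Process KGs in order. Each new KG is trained on its own triples
    plus the replayed linked subgraph; parameters/embeddings are stored
    per KG (a stand-in for the paper's dynamic model structure)."""
    per_kg_store = {}   # per-KG parameters/embeddings would live here
    seen_triples = []   # triples accumulated from earlier KGs
    for i, triples in enumerate(kg_sequence):
        replay = build_replay_subgraph(seen_triples, alignments.get(i, set()))
        train_set = triples + replay  # new triples + replayed linked subgraph
        per_kg_store[i] = {"num_train_triples": len(train_set)}
        seen_triples.extend(triples)
    return per_kg_store

# Toy usage: two KGs where entity "a" in KG 1 is aligned with "a" in KG 0,
# so KG 1's training set also replays the earlier triple involving "a".
kg0 = [("a", "r1", "b"), ("c", "r2", "d")]
kg1 = [("a", "r3", "e")]
store = lifelong_train([kg0, kg1], {1: {"a"}})
```

The design point the sketch illustrates is that only the linked subgraph is replayed, so knowledge transfer happens through aligned entities without retraining on every previous triple.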