Abstract: Knowledge graph completion can make a knowledge graph more complete. Unfortunately, most existing knowledge graph completion methods assume that the entities and relations in the knowledge graph have sufficient triple instances. Nevertheless, a large number of long-tail triples exist in general domains, and it is challenging to obtain large amounts of high-quality annotated data in vertical domains. To address these issues, a knowledge collaborative fine-tuning approach is proposed for low-resource knowledge graph completion. Structured knowledge is leveraged to construct the initial prompt templates, and the optimal templates, labels, and model parameters are learned through a collaborative fine-tuning algorithm. The proposed method exploits both the explicit structured knowledge in the knowledge graph and the implicit triple knowledge in the pre-trained language model, and can be applied to the tasks of link prediction and relation extraction. Experimental results show that the proposed approach achieves state-of-the-art performance on three knowledge graph reasoning datasets and five relation extraction datasets.
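To make the prompt-construction step concrete, the following is a minimal sketch of turning a knowledge graph triple query into a cloze-style prompt for a masked language model. The template wording and the `relation_verbalizer` mapping are illustrative assumptions for exposition, not the paper's actual templates or algorithm.

```python
# Illustrative sketch (assumptions, not the paper's method): build a
# cloze-style prompt from a (head, relation, ?) link-prediction query,
# so a masked language model can be fine-tuned to fill in the tail entity.

MASK_TOKEN = "[MASK]"

# Hypothetical verbalizations of KG relations into natural-language templates.
relation_verbalizer = {
    "born_in": "{head} was born in {tail}.",
    "capital_of": "{head} is the capital of {tail}.",
}

def build_prompt(head: str, relation: str) -> str:
    """Construct an initial cloze prompt for the query (head, relation, ?)."""
    template = relation_verbalizer[relation]
    return template.format(head=head, tail=MASK_TOKEN)

if __name__ == "__main__":
    # The query (Paris, capital_of, ?) becomes a fill-in prompt whose
    # [MASK] position the language model is trained to predict.
    print(build_prompt("Paris", "capital_of"))
    # -> "Paris is the capital of [MASK]."
```

In the approach described above, such initial templates would serve only as a starting point; the collaborative fine-tuning stage then jointly optimizes the templates, labels, and model parameters rather than keeping them fixed.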