Research article

MRE: A translational knowledge graph completion model based on multiple relation embedding

  • Academic editor: Haiyan Wang
  • Received: 03 November 2022 · Revised: 13 December 2022 · Accepted: 03 January 2023 · Published: 13 January 2023
  • Knowledge graph completion (KGC) has attracted significant research interest in knowledge graph (KG) applications. Many models have been proposed to solve the KGC problem, including a series of translational and semantic matching models. However, most of these methods suffer from two limitations. First, they consider only a single form of relation and thus fail to simultaneously capture the semantics of multiple relation types (direct, multi-hop and rule-based). Second, the data sparsity of knowledge graphs makes some relations difficult to embed. This paper proposes a novel translational knowledge graph completion model named multiple relation embedding (MRE) to address these limitations. We embed multiple relation types to provide richer semantic information for representing KGs. Specifically, we first leverage PTransE and AMIE+ to extract multi-hop and rule-based relations. We then propose two dedicated encoders to encode the extracted relations and capture the semantic information of multiple relation types. Notably, the proposed encoders model interactions between relations and their connected entities during relation encoding, which is rarely considered in existing methods. Next, we define three energy functions to model KGs under the translational assumption. Finally, a joint training method is adopted to perform KGC. Experimental results show that MRE outperforms other baselines on KGC, demonstrating the effectiveness of embedding multiple relation types for advancing knowledge graph completion.
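The translational assumption the abstract refers to (as in TransE-style models) holds that for a valid triple (h, r, t), the relation embedding acts as a translation in vector space: h + r ≈ t, so a low-energy score indicates a plausible triple. The sketch below illustrates only this scoring idea with toy, hand-picked embeddings; it is not the authors' full MRE model, which combines three such energy functions over direct, multi-hop and rule-based relations.

```python
import math

def transe_energy(h, r, t):
    # L2 distance between the translated head (h + r) and the tail t;
    # under the translational assumption, valid triples score near zero.
    return math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# Toy 3-dimensional embeddings (illustrative values, not learned parameters).
h = [0.2, 0.5, -0.1]      # head entity
r = [0.1, -0.2, 0.3]      # relation
t = [0.3, 0.3, 0.2]       # tail entity: here exactly h + r
t_bad = [1.0, 1.0, 1.0]   # a corrupted tail

print(transe_energy(h, r, t))      # 0.0 — plausible triple
print(transe_energy(h, r, t_bad))  # ~1.27 — implausible triple
```

In training, such energies are typically optimized with a margin-based ranking loss that pushes valid triples below corrupted ones; MRE's joint training extends this to the multi-hop and rule-based relation energies as well.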

    Citation: Xinyu Lu, Lifang Wang, Zejun Jiang, Shizhong Liu, Jiashi Lin. MRE: A translational knowledge graph completion model based on multiple relation embedding[J]. Mathematical Biosciences and Engineering, 2023, 20(3): 5881-5900. doi: 10.3934/mbe.2023253


  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)

Metrics

Article views(1916) PDF downloads(110) Cited by(4)


Figures and Tables

Figures(6)  /  Tables(9)
