Research article

A lightweight CNN-based knowledge graph embedding model with channel attention for link prediction


  • Received: 22 November 2022 Revised: 10 March 2023 Accepted: 19 March 2023 Published: 21 March 2023
  • Knowledge graph (KG) embedding embeds the entities and relations of a KG into a low-dimensional continuous vector space while preserving the intrinsic semantic associations between them. One of the most important applications of knowledge graph embedding (KGE) is link prediction (LP), which aims to predict the missing fact triples in the KG. A promising approach to improving the performance of KGE for LP is to increase the feature interactions between entities and relations so as to express richer semantics between them. Convolutional neural networks (CNNs) have thus become one of the most popular KGE models due to their strong expression and generalization abilities. To further enhance favorable features arising from increased feature interactions, we propose a lightweight CNN-based KGE model called IntSE in this paper. Specifically, IntSE increases the feature interactions between the components of entity and relation embeddings with more efficient CNN components. In addition, it incorporates a channel attention mechanism that adaptively recalibrates channel-wise feature responses by modeling the interdependencies between channels, enhancing useful features while suppressing useless ones to improve LP performance. Experimental results on public datasets confirm that IntSE is superior to state-of-the-art CNN-based KGE models for link prediction in KGs.
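The channel attention the abstract describes follows the squeeze-and-excitation pattern: globally average-pool each convolutional channel, pass the pooled vector through a small bottleneck MLP, and rescale every channel by the resulting sigmoid gate. The NumPy sketch below is only an illustration of that mechanism, not the authors' IntSE code; the array shapes, the reduction ratio implied by the weight sizes, and the function names are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feature_maps, w1, w2):
    """Squeeze-and-excitation-style channel attention (illustrative sketch).

    feature_maps: (C, H, W) convolutional feature maps.
    w1: (C, C // r) and w2: (C // r, C) weights of the bottleneck MLP,
        where r is the channel-reduction ratio.
    Returns feature maps of the same shape, with each channel rescaled
    by a learned gate in (0, 1).
    """
    # Squeeze: global average pooling per channel -> vector of shape (C,)
    z = feature_maps.mean(axis=(1, 2))
    # Excitation: bottleneck MLP (ReLU, then sigmoid) models
    # interdependencies between channels and yields one gate per channel
    s = sigmoid(np.maximum(z @ w1, 0.0) @ w2)
    # Recalibrate: scale each channel by its gate, enhancing useful
    # channels and suppressing useless ones
    return feature_maps * s[:, None, None]
```

In a CNN-based KGE model, `feature_maps` would be the maps produced by convolving the (reshaped) entity and relation embeddings, with the recalibrated maps then flattened and projected to score candidate triples.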

    Citation: Xin Zhou, Jingnan Guo, Liling Jiang, Bo Ning, Yanhao Wang. A lightweight CNN-based knowledge graph embedding model with channel attention for link prediction. Mathematical Biosciences and Engineering, 2023, 20(6): 9607-9624. doi: 10.3934/mbe.2023421



  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
