TY - GEN
T1 - Multi-view Contrastive Multiple Knowledge Graph Embedding for Knowledge Completion
AU - Kurokawa, Mori
AU - Yonekawa, Kei
AU - Haruta, Shuichiro
AU - Konishi, Tatsuya
AU - Asoh, Hideki
AU - Ono, Chihiro
AU - Hagiwara, Masafumi
N1 - Funding Information:
This research was partially supported by JST CREST Grant Number JPMJCR21F2, Japan.
Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Knowledge graphs (KGs) are useful information sources for making machine learning efficient with human knowledge. Since KGs are often incomplete, KG completion has become an important problem of filling in missing facts in KGs. Whereas most KG completion methods operate on a single KG, multiple KGs can be effective in enriching the embedding space for KG completion. However, most recent studies have concentrated on entity alignment prediction and ignored the KG-invariant semantics shared across multiple KGs that can improve completion performance. In this paper, we propose a new multiple KG embedding method composed of intra-KG and inter-KG regularization to introduce KG-invariant semantics into the KG embedding space using aligned entities between related KGs. The intra-KG regularization adjusts the local distance between aligned and non-aligned entities using a contrastive loss, while the inter-KG regularization globally correlates aligned entity embeddings between KGs using a multi-view loss. Our experimental results demonstrate that our proposed method, combining both regularization terms, substantially outperforms existing baselines in the KG completion task.
AB - Knowledge graphs (KGs) are useful information sources for making machine learning efficient with human knowledge. Since KGs are often incomplete, KG completion has become an important problem of filling in missing facts in KGs. Whereas most KG completion methods operate on a single KG, multiple KGs can be effective in enriching the embedding space for KG completion. However, most recent studies have concentrated on entity alignment prediction and ignored the KG-invariant semantics shared across multiple KGs that can improve completion performance. In this paper, we propose a new multiple KG embedding method composed of intra-KG and inter-KG regularization to introduce KG-invariant semantics into the KG embedding space using aligned entities between related KGs. The intra-KG regularization adjusts the local distance between aligned and non-aligned entities using a contrastive loss, while the inter-KG regularization globally correlates aligned entity embeddings between KGs using a multi-view loss. Our experimental results demonstrate that our proposed method, combining both regularization terms, substantially outperforms existing baselines in the KG completion task.
KW - Contrastive learning
KW - Embedding
KW - Knowledge graph completion
KW - Multi-view learning
UR - http://www.scopus.com/inward/record.url?scp=85152215253&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85152215253&partnerID=8YFLogxK
U2 - 10.1109/ICMLA55696.2022.00223
DO - 10.1109/ICMLA55696.2022.00223
M3 - Conference contribution
AN - SCOPUS:85152215253
T3 - Proceedings - 21st IEEE International Conference on Machine Learning and Applications, ICMLA 2022
SP - 1412
EP - 1418
BT - Proceedings - 21st IEEE International Conference on Machine Learning and Applications, ICMLA 2022
A2 - Wani, M. Arif
A2 - Kantardzic, Mehmed
A2 - Palade, Vasile
A2 - Neagu, Daniel
A2 - Yang, Longzhi
A2 - Chan, Kit-Yan
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 21st IEEE International Conference on Machine Learning and Applications, ICMLA 2022
Y2 - 12 December 2022 through 14 December 2022
ER -