Knowledge graphs (KGs) are valuable sources of human knowledge for making machine learning more efficient. Because KGs are often incomplete, KG completion, the task of inferring missing facts in a KG, has become an important problem. While most KG completion methods operate on a single KG, multiple KGs can enrich the embedding space for completion. However, most recent studies have concentrated on entity alignment prediction and ignored the KG-invariant semantics shared across multiple KGs, which can improve completion performance. In this paper, we propose a new multiple-KG embedding method composed of intra-KG and inter-KG regularization that introduces KG-invariant semantics into the KG embedding space using aligned entities between related KGs. The intra-KG regularization adjusts the local distance between aligned and unaligned entities using a contrastive loss, while the inter-KG regularization globally correlates aligned entity embeddings across KGs using a multi-view loss. Our experimental results demonstrate that the proposed method, combining both regularization terms, substantially outperforms existing baselines on the KG completion task.
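The abstract does not give the exact formulas for the two regularization terms. As a rough illustration only, a minimal sketch under assumed forms (a margin-based contrastive loss for the intra-KG term and a cosine-correlation loss for the inter-KG multi-view term; both the functional forms and the margin value are assumptions, not the paper's definitions) might look like:

```python
import numpy as np

def contrastive_intra_loss(e1, e2, aligned, margin=1.0):
    """Assumed margin-based contrastive loss: pull aligned entity
    pairs together, push unaligned pairs at least `margin` apart."""
    loss, n = 0.0, 0
    for i, ei in enumerate(e1):
        for j, ej in enumerate(e2):
            d = np.linalg.norm(ei - ej)
            if (i, j) in aligned:
                loss += d ** 2                     # attract aligned pairs
            else:
                loss += max(0.0, margin - d) ** 2  # repel unaligned pairs
            n += 1
    return loss / n

def multiview_inter_loss(e1, e2, aligned):
    """Assumed cosine-based multi-view loss: globally correlate
    aligned entity embeddings across the two KGs."""
    sims = []
    for i, j in aligned:
        a, b = e1[i], e2[j]
        sims.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return 1.0 - float(np.mean(sims))  # 0 when perfectly correlated
```

For example, if the embeddings of aligned entities in the two KGs coincide, both losses are zero; shifting an aligned pair apart increases both terms, which is the behavior the abstract describes (local distance adjustment plus global correlation).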