Knowledgebra: An Algebraic Learning Framework for Knowledge Graph

Authors:

Tong Yang, Yifei Wang, Long Sha, Jan Engelbrecht, and Pengyu Hong

Affiliation:

Department of Physics, Boston College, Chestnut Hill, MA 02467, USA

Department of Computer Science, Brandeis University, Waltham, MA 02453, USA

Description:

Knowledge graph (KG) representation learning aims to encode entities and relations into dense continuous vector spaces such that the knowledge contained in a dataset can be consistently represented. Dense embeddings trained on KG datasets benefit a variety of downstream tasks such as KG completion and link prediction. However, existing KG embedding methods fall short of providing a systematic solution for the global consistency of knowledge representation. We developed a mathematical language for knowledge graphs based on an observation of their inherent algebraic structure, which we term Knowledgebra. By analyzing five distinct algebraic properties, we proved that the semigroup is the most reasonable algebraic structure for the relation embedding of a general knowledge graph. We implemented an instantiation model, SemE, using simple matrix semigroups, which exhibits state-of-the-art performance on standard datasets. Moreover, we proposed a regularization-based method to integrate chain-like logic rules derived from human knowledge into embedding training, which further demonstrates the power of the developed language. To the best of our knowledge, by applying abstract algebra in statistical learning, this work develops the first formal language for general knowledge graphs; it also sheds light on the problem of neural-symbolic integration from an algebraic perspective.
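
Since the paper is summarized here only at the abstract level, the following is a minimal illustrative sketch, not the authors' released implementation, of the core idea: entities as dense vectors, each relation as a square matrix so that relation composition is matrix multiplication (the semigroup operation), and a chain-like logic rule enforced as a regularization penalty on composed relation matrices. The names `score_triple` and `chain_rule_penalty`, the distance-based scoring function, and the toy relations are all assumptions made for illustration.

```python
import numpy as np

# Sketch of a SemE-like semigroup embedding (illustrative assumptions only).
# Entities are dense vectors; each relation is a d x d matrix, so relation
# composition is matrix multiplication -- the semigroup operation.

d = 4  # embedding dimension (assumed)
rng = np.random.default_rng(0)

entity = {name: rng.normal(size=d) for name in ["alice", "bob", "carol"]}
relation = {name: rng.normal(size=(d, d)) for name in ["parent", "grandparent"]}

def score_triple(h, r, t):
    """Plausibility of the triple (h, r, t): higher is better.
    Assumed distance-based score: -|| R h - t ||."""
    return -np.linalg.norm(relation[r] @ entity[h] - entity[t])

def chain_rule_penalty(r1, r2, r3):
    """Regularizer for a chain-like rule r1(x,y) & r2(y,z) -> r3(x,z).
    If r1(x,y) means R1 x ~ y and r2(y,z) means R2 y ~ z, the rule becomes
    R2 R1 ~ R3, so we penalize the Frobenius distance between the composed
    matrix and the target relation matrix."""
    return np.linalg.norm(relation[r2] @ relation[r1] - relation[r3])

# During training (not shown), triple scores would enter a ranking loss and
# the rule penalty would be added as a weighted regularization term.
print(score_triple("alice", "parent", "bob"))
print(chain_rule_penalty("parent", "parent", "grandparent"))
```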

Publications:

  • Yang, Tong, Yifei Wang, Long Sha, Jan Engelbrecht, and Pengyu Hong. "Knowledgebra: An Algebraic Learning Framework for Knowledge Graph." Machine Learning and Knowledge Extraction, 2022.
Tags:

Machine learning
Mathematical methods