torchkge-team / torchkge

TorchKGE: Knowledge Graph embedding in Python and PyTorch.

Missing soft constraints in TransH

thsno02 opened this issue · comments

In the original paper, the authors proposed three soft constraints and added a hyperparameter C to weight their importance. Looking at `class MarginLoss(Module):`, I did not find the C term.
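For reference, the soft-constraint penalty from the TransH paper could be sketched as follows. This is a standalone illustration, not TorchKGE code; the function name and tensor shapes are assumptions. It penalizes entity embeddings with L2 norm above 1 (scale constraint) and normal vectors that are insufficiently orthogonal to their translation vectors, and the result would be added to the margin loss weighted by C.

```python
import torch

def transh_soft_constraints(ent_emb, norm_vect, rel_emb, eps=1e-3):
    """Hypothetical sketch of the TransH soft-constraint penalty.

    ent_emb:   (n_ent, dim) entity embeddings e
    norm_vect: (n_rel, dim) hyperplane normal vectors w_r
    rel_emb:   (n_rel, dim) translation vectors d_r
    """
    # Scale constraint ||e||_2 <= 1, penalized as [||e||^2 - 1]_+
    scale = torch.relu(ent_emb.norm(dim=1) ** 2 - 1).sum()
    # Orthogonality |w_r . d_r| / ||d_r||_2 <= eps,
    # penalized as [(w_r . d_r)^2 / ||d_r||^2 - eps^2]_+
    dot = (norm_vect * rel_emb).sum(dim=1)
    orth = torch.relu(dot ** 2 / rel_emb.norm(dim=1) ** 2 - eps ** 2).sum()
    return scale + orth
```

The total objective would then be something like `margin_loss + C * transh_soft_constraints(...)`, with C and eps as hyperparameters.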

I found a similar issue in OpenKE, "Weight C in TransH missing". Is torchkge ignoring C for the same reason, even though it uses a different normalization method from OpenKE?

Hi @thsno02, TorchKGE does not yet implement the soft-constraint regularization proposed in the original article. Instead, the hard constraints can be enforced using the following:

  • constraint 2 (orthogonality): the `project` static method can be used to project relation embedding vectors onto relation-specific hyperplanes
  • constraints 1 and 3 (scale and unit normal vector): the `normalize_parameters` method brings entity embeddings and normal vectors back to norm 1. This is indeed too strong a constraint, as the scale of entity embeddings should be allowed to become very small.
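To make the two operations above concrete, here is a minimal standalone sketch of what they do conceptually. These are not TorchKGE's actual implementations; the names simply mirror the methods mentioned above, and unit-norm normal vectors are assumed for the projection.

```python
import torch
import torch.nn.functional as F

def project(ent, norm_vect):
    """Project entity embeddings onto the hyperplane with unit normal w:
    e_perp = e - (w . e) w  (conceptually what the `project` method does)."""
    return ent - (ent * norm_vect).sum(dim=1, keepdim=True) * norm_vect

def normalize_parameters(ent_emb, norm_vect):
    """Bring entity embeddings and normal vectors back to unit L2 norm
    (conceptually what `normalize_parameters` does)."""
    return F.normalize(ent_emb, p=2, dim=1), F.normalize(norm_vect, p=2, dim=1)
```

Note that renormalizing entity embeddings to exactly norm 1 is stronger than the paper's soft scale constraint, which only requires the norm to stay at most 1.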

I'll leave the issue open until someone contributes a soft-constraint module. Feel free to do so.