malllabiisc / CompGCN

ICLR 2020: Composition-Based Multi-Relational Graph Convolutional Networks


Random adjacency matrices do not affect the performance

zhanqiuzhang opened this issue · comments

Hi, thanks for sharing the code!

I find that after randomly rewiring the adjacency matrix, the performance of CompGCN remains unchanged (0.334, DistMult + multiplication). The code in run.py that I changed is as follows.

import random

# Replace each object with a random entity; note that randint's upper
# bound is inclusive, so num_ent - 1 keeps the id in range
for sub, rel, obj in self.data['train']:
    obj = random.randint(0, self.p.num_ent - 1)
    edge_index.append((sub, obj))
    edge_type.append(rel)

# Adding inverse edges, again with a random endpoint
for sub, rel, obj in self.data['train']:
    obj = random.randint(0, self.p.num_ent - 1)
    edge_index.append((obj, sub))
    edge_type.append(rel + self.p.num_rel)
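To confirm that this rewiring really destroys the graph structure (rather than accidentally reproducing many original edges), one can measure the overlap between the original and randomized edge sets. A self-contained sketch on toy data; `train` and `num_ent` below are hypothetical stand-ins for `self.data['train']` and `self.p.num_ent`:

```python
import random

random.seed(0)
num_ent = 100

# Toy training triples standing in for self.data['train'].
train = [(random.randrange(num_ent), 0, random.randrange(num_ent))
         for _ in range(500)]

# Original (subject, object) pairs vs. pairs with a random object.
original = {(s, o) for s, _, o in train}
rewired = {(s, random.randrange(num_ent)) for s, _, o in train}

# Fraction of randomized edges that coincide with a real edge;
# this should be near chance level, not near 1.
overlap = len(original & rewired) / len(rewired)
print(f"edge overlap after rewiring: {overlap:.2%}")
```

If the overlap is near chance level, the encoder is effectively seeing noise, which makes the unchanged 0.334 score all the more surprising.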

Did I misunderstand the code?

I think the main contribution to the performance comes from the decoder. I tried removing the convolution layer entirely and still obtained similar results.
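For context, the DistMult decoder mentioned above scores a triple by a trilinear product of subject, relation, and object embeddings, independently of any graph convolution. A minimal sketch in plain NumPy; the embedding tables and dimensions here are hypothetical (in CompGCN they would come from the encoder):

```python
import numpy as np

rng = np.random.default_rng(0)
num_ent, num_rel, dim = 100, 7, 16

# Hypothetical embedding tables standing in for the model's parameters.
ent_emb = rng.normal(size=(num_ent, dim))
rel_emb = rng.normal(size=(num_rel, dim))

def distmult_scores(sub, rel):
    """Score (sub, rel, ?) against all candidate objects: <e_s, w_r, e_o>."""
    return (ent_emb[sub] * rel_emb[rel]) @ ent_emb.T  # shape: (num_ent,)

scores = distmult_scores(3, 2)
```

Because this scoring function trains the embeddings directly, it can reach competitive link-prediction numbers on its own, which is consistent with the observation that removing the convolution layer barely changes the results.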