rikdz / GraphWriter

Code for "Text Generation from Knowledge Graphs with Graph Transformers"

About Position Encoding

wutaiqiang opened this issue · comments

Hey, I noticed that you did not use positional encoding in the model, but the original Transformer used the sinusoidal (triangle) positional encoding. Why didn't you use it? Was the PE useless?

I don't know how to include position encoding in a graph because there is no ordering to the nodes.
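For context, here is a minimal NumPy sketch (not from the GraphWriter codebase) of the sinusoidal encoding the question refers to. Note that every row is a function of the token's integer position `pos` in the sequence, which is exactly the quantity that has no canonical definition for nodes in a graph:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Sinusoidal PE from "Attention Is All You Need":
    PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))
    """
    positions = np.arange(seq_len)[:, None]                      # (seq_len, 1)
    div = np.power(10000.0, np.arange(0, d_model, 2) / d_model)  # (d_model/2,)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / div)  # even dims: sine
    pe[:, 1::2] = np.cos(positions / div)  # odd dims: cosine
    return pe

pe = sinusoidal_positional_encoding(seq_len=50, d_model=16)
# Each row is determined solely by the token index `pos`; graph nodes
# have no such ordering, so this encoding has no direct analogue there.
```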