Why use the output of ReLU for rebuilding the kNN graph?
chenchaoxu opened this issue · comments
chenchao_xu commented
I have a question about rebuilding the kNN graph during GCN-V inference.
Why use the output of the ReLU layer, rather than the output of the linear layer, as the features for rebuilding the kNN graph? After passing the feature map through ReLU, many of the entries are zero.
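To make the concern concrete, here is a minimal NumPy sketch (not the repository's actual code; the array names and dimensions are made up for illustration) showing that ReLU zeroes out a large fraction of the hidden features, and how a kNN graph would be rebuilt from those post-ReLU features via cosine similarity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical GCN hidden activations (pre-ReLU) for 6 nodes, 8 dims each.
linear_out = rng.standard_normal((6, 8))

# ReLU zeroes out roughly half the entries, so the features become sparse.
relu_out = np.maximum(linear_out, 0.0)
sparsity = (relu_out == 0).mean()

def knn_graph(feats, k):
    """Rebuild a kNN graph from L2-normalized features via cosine similarity."""
    normed = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + 1e-12)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)  # exclude self-loops
    return np.argsort(-sim, axis=1)[:, :k]  # indices of the k nearest neighbors

nbrs = knn_graph(relu_out, k=2)
print(f"fraction of zeros after ReLU: {sparsity:.2f}")
print(nbrs)
```

Cosine similarity still works on the sparse post-ReLU features, but since many coordinates are exactly zero, the question is whether the denser pre-ReLU linear output would preserve more neighborhood information.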