bknyaz / graph_nn

Graph Classification with Graph Convolutional Networks in PyTorch (NeurIPS 2018 Workshop)

Home Page: https://arxiv.org/abs/1811.09595

gcn

tonyandsunny opened this issue · comments

Hi! I am very interested in your code and would like to ask how to load other data, such as the Cora dataset. Also, the parameter W in GCN and GCNUNET is not trained in your code.

Looking forward to your reply.

Hi,
This code is for graph classification only, so you will need to adapt it to make it work for node classification data like Cora. You can use PyTorch Geometric functions to load the Cora data: https://github.com/rusty1s/pytorch_geometric/blob/master/examples/cora.py.
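For reference, a minimal sketch of loading Cora with PyTorch Geometric's Planetoid dataset (the ./data/Cora path below is arbitrary):

```python
# Minimal sketch: load the Cora node-classification dataset with PyTorch Geometric.
# The root path is arbitrary; the dataset is downloaded on the first run.
from torch_geometric.datasets import Planetoid

dataset = Planetoid(root='./data/Cora', name='Cora')
data = dataset[0]  # a single graph with node features, edge_index, labels and split masks

print(data.num_nodes, dataset.num_features, dataset.num_classes)
print(int(data.train_mask.sum()), 'training nodes')
```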

How did you decide that those parameters are not trained? GCN and GraphUnet have several trainable parameters; which ones exactly do you mean?
I just checked that the parameters are trained by looking at the dynamics of their values during training.
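For example, here is a generic PyTorch sketch (not code from this repo; the tiny linear model is just a stand-in for GCN/GraphUnet) that checks whether parameters change over a few optimizer steps:

```python
# Generic PyTorch sketch (not code from this repo) for checking that parameters
# are actually trained: their values should change after a few optimizer steps.
import torch
import torch.nn as nn

model = nn.Linear(16, 4)  # stand-in for GCN / GraphUnet
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

before = {n: p.detach().clone() for n, p in model.named_parameters()}

for _ in range(10):  # a few dummy training steps
    loss = model(torch.randn(8, 16)).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

for n, p in model.named_parameters():
    change = (p.detach() - before[n]).norm().item()
    print(f'{n}: requires_grad={p.requires_grad}, change in norm = {change:.4f}')
```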

I'm very happy that you replied to me. Is the parameter W in your GCN's 'DADHW' formula (D^{-1/2} A D^{-1/2} H W) implemented as a fully connected layer?

Do you have a pooling mask for a graph? I don't quite understand the mask.

Is the parameter W in your GCN's 'DADHW' formula implemented as a fully connected layer?

Yes
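To illustrate, a simplified single-graph sketch (my reconstruction, not the exact code in this repo) where W is an nn.Linear, i.e. a fully connected layer applied to the node features after the D^{-1/2} A D^{-1/2} H aggregation:

```python
# Simplified single-graph sketch (not the exact repo code): the W in
# D^{-1/2} A D^{-1/2} H W is an nn.Linear, i.e. a fully connected layer.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)  # this is W (plus a bias)

    def forward(self, H, A_norm):
        # A_norm: precomputed normalized adjacency D^{-1/2} (A + I) D^{-1/2}, shape (N, N)
        # H: node features, shape (N, in_features)
        return self.fc(torch.mm(A_norm, H))  # (N, out_features)

layer = GCNLayer(7, 64)
out = layer(torch.randn(10, 7), torch.eye(10))  # dummy inputs just to show the shapes
```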

Do you have a pooling mask for a graph? I don't quite understand the mask.

Yes, in the pooling mask a value of 1 means that the node exists and 0 means that there is no node. We need this mask because we have to feed batches of equal-sized tensors to the network, so the mask lets the network know where there is actually no data in the tensor. For the first layer you don't need this mask, but for the following layers you need it to avoid propagating the bias term.
I also use this mask for the pooling: if a node is dropped, I just set its mask value to 0 and don't change the size of the tensors. So the nodes are still there after dropping, but all their features are zeros and they are disconnected from the graph, so they do not affect the features of other nodes, which is equivalent to actually dropping them.
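As a rough illustration (a simplified sketch, not the exact repo code), the mask is a binary (batch, max_nodes) tensor that is multiplied into the padded node features after each layer, and pooling a node just zeroes its mask entry:

```python
# Simplified sketch (not the exact repo code) of a binary node mask for
# batches of padded graphs: 1 = real node, 0 = padding or dropped node.
import torch

B, N, C = 2, 5, 8                            # batch size, max nodes, feature channels
X = torch.randn(B, N, C)                     # padded node features
mask = torch.tensor([[1., 1., 1., 0., 0.],   # graph 0 has 3 real nodes
                     [1., 1., 1., 1., 1.]])  # graph 1 has 5 real nodes

# After any layer that adds a bias, zero out padded / dropped nodes:
X = X * mask.unsqueeze(-1)                   # (B, N, C) * (B, N, 1)

# "Dropping" a node during pooling: set its mask entry to 0 (and remove its
# edges from the adjacency); tensor shapes stay fixed.
mask[0, 2] = 0
X = X * mask.unsqueeze(-1)
```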

Has your paper been accepted?

This repo is partially an attempt to reproduce the paper by other authors that was submitted, but not accepted, to ICLR 2019 [1], as well as our paper accepted to a NeurIPS 2018 workshop [2]. Please see https://github.com/bknyaz/graph_nn/blob/master/README.md for more details.

[1] Graph U-Net, submitted to ICLR 2019, https://openreview.net/forum?id=HJePRoAct7
[2] Spectral Multigraph Networks for Discovering and Fusing Relationships in Molecules, NeurIPS 2018 Workshop on Machine Learning for Molecules and Materials, https://arxiv.org/abs/1811.09595