jianhao2016 / AllSet

This is the GitHub repository for our ICLR 2022 paper: "You are AllSet: A Multiset Function Framework for Hypergraph Neural Networks"


How to set batch size when training AllSet?

thu-wangz17 opened this issue · comments

Hi, this is nice work. I was reading the code:

AllSet/src/train.py

Lines 325 to 327 in 0d0e399

dataset = dataset_Hypergraph(name=dname,root = '../data/pyg_data/hypergraph_dataset_updated/',
p2raw = p2raw)
data = dataset.data

data = ExtractV2E(data)

out = model(data)

It seems that you train the model on all of the data in a single batch. So my question is: how do I set the batch size when training the AllSet model? Thank you.
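For reference, my understanding of the current full-batch loop is roughly the sketch below. `dataset`, `ExtractV2E`, and `model` are the objects from the snippets above; the optimizer, loss, and `data.train_mask` are my own assumptions for illustration, not exact repo code:

```python
import torch
import torch.nn.functional as F

data = ExtractV2E(dataset.data)          # the whole hypergraph, no mini-batching
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = model(data)                    # forward pass over all nodes at once
    loss = F.nll_loss(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
```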

Hi @thu-wangz17,

This is a great question! Unfortunately, mini-batch training in AllSet is currently an open problem, as we stated in our paper (Appendix A). We conjecture that one could develop a GraphSAINT- or ClusterGCN-style method to enable mini-batch training. There are also many other open directions listed in Appendix A, for which we conjecture one can leverage similar ideas from the GNN literature.
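To make that conjecture a bit more concrete, a very rough ClusterGCN-style sketch could look like the following. This is purely illustrative and not something we have implemented or validated; the attribute names (`edge_index` in V2E form, `x`, `y`, `train_mask`), the random hyperedge partitioning, and the index handling are assumptions:

```python
import torch

def hyperedge_clusters(num_hyperedges, num_parts):
    # Randomly assign each hyperedge to one of `num_parts` clusters
    # (a METIS-style partitioner would be the closer ClusterGCN analogue).
    return torch.randint(num_parts, (num_hyperedges,))

def sub_hypergraph(data, keep_hyperedge):
    # Keep only node-hyperedge incidences whose hyperedge is selected,
    # then relabel surviving nodes/hyperedges to a compact index range.
    node_idx, he_idx = data.edge_index      # V2E incidence: row 0 = node, row 1 = hyperedge
    he_local = he_idx - he_idx.min()        # hyperedge ids may be offset (assumption)
    mask = keep_hyperedge[he_local]
    node_idx, he_idx = node_idx[mask], he_idx[mask]

    nodes, node_idx = node_idx.unique(return_inverse=True)
    _, he_idx = he_idx.unique(return_inverse=True)

    sub = data.clone()
    sub.edge_index = torch.stack([node_idx, he_idx])
    sub.x, sub.y = data.x[nodes], data.y[nodes]
    sub.train_mask = data.train_mask[nodes]
    return sub

# One epoch of cluster-wise "mini-batch" training (sketch only):
# parts = hyperedge_clusters(num_hyperedges, num_parts=10)
# for p in range(10):
#     batch = sub_hypergraph(data, parts == p)
#     loss = F.nll_loss(model(batch)[batch.train_mask], batch.y[batch.train_mask])
#     loss.backward(); optimizer.step(); optimizer.zero_grad()
```

Whether such induced sub-hypergraphs preserve enough context for AllSet's multiset-function aggregation is exactly the open question.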

Please let me know if you manage to develop mini-batch training! It would be a great improvement to AllSet and to hypergraph neural networks in general!

Best,
Eli

Thank you very much. If I manage to solve this, I will follow up with feedback :)