How to set batch size when training AllSet?
thu-wangz17 opened this issue
Hi @sakuraiiiii,
This is a great question! Unfortunately, mini-batch training in AllSet is currently an open problem, as we stated in our paper (Appendix A). We conjecture that one could develop a GraphSAINT- or ClusterGCN-type method to enable mini-batch training. Appendix A also lists many other open directions, where we conjecture that similar ideas from the GNN literature could be leveraged.
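For anyone wanting to experiment, here is a minimal sketch of what a ClusterGCN-style partitioning could look like for a hypergraph. Everything here is an assumption for illustration: the function name `make_minibatches`, representing the hypergraph as a list of node-index lists, and using a random partition where real ClusterGCN uses METIS. It is not part of AllSet's codebase.

```python
import random

def make_minibatches(num_nodes, hyperedges, num_clusters, seed=0):
    """ClusterGCN-style mini-batching sketch for a hypergraph.

    Nodes are randomly partitioned into clusters (ClusterGCN proper uses a
    graph partitioner such as METIS to minimize cut edges). Each batch keeps
    the hyperedges restricted to its cluster's member nodes.

    `hyperedges` is a list of lists of node indices.
    Returns a list of (nodes, edges) batches.
    """
    rng = random.Random(seed)
    # Randomly assign every node to one of `num_clusters` clusters.
    assignment = [rng.randrange(num_clusters) for _ in range(num_nodes)]
    batches = []
    for c in range(num_clusters):
        nodes = [v for v in range(num_nodes) if assignment[v] == c]
        node_set = set(nodes)
        # Restrict each hyperedge to the cluster; drop hyperedges that are
        # left with fewer than two members, since a singleton hyperedge
        # carries no propagation signal.
        edges = []
        for e in hyperedges:
            cut = [v for v in e if v in node_set]
            if len(cut) >= 2:
                edges.append(cut)
        batches.append((nodes, edges))
    return batches
```

One would then run AllSet's forward/backward pass per batch on the induced sub-hypergraph. The trade-off, as in ClusterGCN, is that hyperedges cut across clusters are truncated, which biases the propagation; how much this hurts AllSet in practice is exactly the open question.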
Please let me know if you manage to develop mini-batch training! It would be a great improvement to AllSet and to general hypergraph neural networks!
Best,
Eli
Thank you very much. If I manage to solve this, I will follow up with feedback :)