lsj2408 / Transformer-M

[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)

Home Page: https://arxiv.org/abs/2210.01765

Training on QM9

RobDHess opened this issue · comments

Hi,

Would it be possible to provide the commands for training a model on QM9 from scratch? This setting is mentioned in Appendix B5, where the effectiveness of pre-training is investigated.

Kind regards,

Rob

Hi, I have released the fine-tuning code for QM9. To train the Transformer-M model from scratch, you can simply add `export no_pretrain="true"` before running `finetune_qm9.sh`.
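A minimal sketch of what that looks like in practice, assuming the repository's `finetune_qm9.sh` script reads the `no_pretrain` environment variable as described above:

```shell
# Sketch: train Transformer-M on QM9 from scratch instead of fine-tuning
# a pre-trained checkpoint. Run from the repository root.

# Setting this variable tells finetune_qm9.sh to skip loading pre-trained weights
# (per the maintainer's reply above).
export no_pretrain="true"

# Launch the same script used for fine-tuning; with no_pretrain set,
# the model weights are randomly initialized.
bash finetune_qm9.sh
```

Any other hyperparameters (learning rate, number of epochs, target property) would still be whatever `finetune_qm9.sh` configures internally; this snippet only toggles the from-scratch behavior.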