jerryji1993 / DNABERT

DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome

Home Page: https://doi.org/10.1093/bioinformatics/btab083


Steps required for pre-training completion

smruti241 opened this issue · comments

Hi @jerryji1993

I have been running the pre-training Python script since the night of 25th Oct, and it is still running. Can you tell me how many steps or iterations it will take to complete pre-training? Since the human genome is quite big, would the number of steps be the same or different for every genome? Please let me know ASAP.
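For reference, in a standard epoch-based training loop the total step count depends on the corpus size, batch size, and number of epochs, so two genomes of different sizes would generally need different numbers of steps (unless the script caps training with a fixed `max_steps`). The sketch below is a hypothetical back-of-envelope estimate, not values taken from the DNABERT paper or its scripts:

```python
# Hypothetical back-of-envelope estimate of total pre-training steps.
# All numbers below are assumptions for illustration, not DNABERT defaults.

def estimate_total_steps(num_sequences: int, batch_size: int, epochs: int) -> int:
    """Steps per epoch = ceil(num_sequences / batch_size), times epochs."""
    steps_per_epoch = -(-num_sequences // batch_size)  # ceiling division
    return steps_per_epoch * epochs

# Example: 10 million tokenized k-mer sequences, batch size 256, 5 epochs.
print(estimate_total_steps(10_000_000, 256, 5))  # 195315
```

If the training script instead takes a `--max_steps`-style argument, that value overrides the epoch count and the run length is the same regardless of genome size.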