grammarly / gector

Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)

What should I update if I want to do distributed training?

xiuzhilu opened this issue · comments

Hi, thank you for sharing this work. With the code as given, multi-GPU training is equivalent to `torch.nn.DataParallel`. If I want distributed training with the effect of `torch.distributed`, what changes do I need to make? @skurzhanskyi @komelianchuk

As the repository uses AllenNLP 0.8.4, we are limited by the functionality that version of the library provides.
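For reference, the general PyTorch pattern for moving from `DataParallel` to `torch.distributed` is to initialize a process group, wrap the model in `DistributedDataParallel`, and shard the data with a `DistributedSampler`. The sketch below is illustrative only (model, data, and hyperparameters are made up, not from this repo) and runs as a single-process, CPU-only demo with the `gloo` backend; in real use you would launch one process per GPU (e.g. with `torchrun`) and take rank/world size from the environment.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

# Single-process demo: rank 0 of world size 1, gloo backend on CPU.
# With torchrun, rank/world_size would come from the environment instead.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group(backend="gloo", rank=0, world_size=1)

# A toy model stands in for the GECToR tagger here.
model = DDP(torch.nn.Linear(8, 2))  # DDP all-reduces gradients across ranks
dataset = TensorDataset(torch.randn(32, 8), torch.randint(0, 2, (32,)))
sampler = DistributedSampler(dataset)  # each rank sees its own data shard
loader = DataLoader(dataset, batch_size=4, sampler=sampler)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.CrossEntropyLoss()

sampler.set_epoch(0)  # reshuffles shards consistently across ranks per epoch
for x, y in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()  # gradient synchronization happens during backward
    optimizer.step()

dist.destroy_process_group()
print("finished:", torch.isfinite(loss).item())
```

Note that wiring this into AllenNLP 0.8.4's `Trainer` is the hard part, since that version predates AllenNLP's own distributed-training support; newer AllenNLP releases handle this natively.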