BaguaSys / bagua

Bagua Speeds up PyTorch


How to do gradient accumulation in Bagua

CaRRotOne opened this issue

I can use no_sync() with DDP in PyTorch to do gradient accumulation, but I haven't found a related interface in Bagua.
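For reference, this is the standard PyTorch DDP pattern the question refers to: no_sync() suppresses the gradient all-reduce so gradients accumulate locally, and synchronization only happens on the last micro-batch of each window. A minimal sketch follows; `model`, `optimizer`, `data_loader`, and `loss_fn` are hypothetical placeholders, and Bagua's own API (if any) may differ.

```python
# Gradient accumulation with DDP's no_sync() context manager.
# Placeholders: model (wrapped in DDP), optimizer, data_loader, loss_fn.
import contextlib

from torch.nn.parallel import DistributedDataParallel as DDP

ACCUMULATION_STEPS = 4

def train_epoch(model: DDP, optimizer, data_loader, loss_fn):
    optimizer.zero_grad()
    for step, (inputs, targets) in enumerate(data_loader):
        # Only synchronize gradients on the last step of each window.
        is_sync_step = (step + 1) % ACCUMULATION_STEPS == 0
        ctx = contextlib.nullcontext() if is_sync_step else model.no_sync()
        with ctx:
            # Scale the loss so the accumulated gradient matches one
            # full-batch gradient.
            loss = loss_fn(model(inputs), targets) / ACCUMULATION_STEPS
            loss.backward()  # gradients accumulate in param.grad
        if is_sync_step:
            optimizer.step()       # apply the synchronized gradients
            optimizer.zero_grad()  # start the next accumulation window
```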

@shjwudp, there is an issue here related to gradient accumulation.