bigscience-workshop / Megatron-DeepSpeed

Ongoing research training transformer language models at scale, including: BERT & GPT-2

How can we access the gradients while the model is training?

BilgehanSel opened this issue · comments

Does DeepSpeed offer an API to access the gradients during training for any of the ZeRO stages (1, 2, or 3)? When I try to access the gradients directly, I only get `None`. I'm mostly interested in stage 3. If there is no such API, is there any way to delve into the DeepSpeed code to access them somehow? I would also be glad if someone could point me in the right direction in the source code. I'm not concerned about slowdowns as long as I have access to the gradients.
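One possible approach, sketched below under the assumption of a recent DeepSpeed version: `param.grad` is `None` under ZeRO because gradients are partitioned across ranks, but DeepSpeed ships debugging helpers such as `deepspeed.utils.safe_get_full_grad` that gather the full gradient for a parameter. The `log_full_grads` helper name here is my own; the import is deferred so the sketch can be defined without DeepSpeed installed.

```python
def log_full_grads(model_engine):
    """Collect full (unpartitioned) gradients from a DeepSpeed engine.

    A minimal sketch, assuming a DeepSpeed version that provides
    deepspeed.utils.safe_get_full_grad (part of its ZeRO debugging
    utilities). Under ZeRO stages 1-3, param.grad is None because the
    gradient is partitioned; safe_get_full_grad gathers the full tensor.
    """
    from deepspeed.utils import safe_get_full_grad  # deferred import

    grads = {}
    for name, param in model_engine.module.named_parameters():
        # Returns the gathered gradient tensor, or None if the parameter
        # has no gradient at this point in training.
        grads[name] = safe_get_full_grad(param)
    return grads
```

For this to see non-`None` values, it would need to be called after `engine.backward(loss)` but before `engine.step()`, since the optimizer step clears the partitioned gradients.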