Polisetty V R K Jyothendra Varma's repositories
lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
Megatron-DeepSpeed
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Language: Python · License: NOASSERTION
optimum-habana
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Language: Python · License: Apache-2.0