TimDettmers / bitsandbytes

Accessible large language models via k-bit quantization for PyTorch.

Home Page: https://huggingface.co/docs/bitsandbytes/main/en/index

PyTorch XLA/PJRT TPU support

opooladz opened this issue

Feature request

PyTorch XLA/PJRT TPU support for bitsandbytes

Motivation

This would allow faster and more memory-efficient training of models on TPUs.
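To make the request concrete, here is a minimal sketch of what training with bitsandbytes on an XLA device might look like if this support existed. The `torch_xla` device lookup and the use of `bnb.optim.Adam8bit` on a TPU are assumptions (that combination is exactly what this issue asks for); the sketch falls back to plain PyTorch on CPU when either package is missing, so it also illustrates the current state of affairs.

```python
import torch

# Hypothetical usage sketch for bitsandbytes-on-TPU training.
# Assumption: torch_xla exposes the TPU via the PJRT runtime; if it is not
# installed, fall back to CPU so the sketch still runs anywhere.
try:
    import torch_xla.core.xla_model as xm
    device = xm.xla_device()            # TPU device under PJRT
except ImportError:
    device = torch.device("cpu")        # no torch_xla available

model = torch.nn.Linear(16, 4).to(device)

# Assumption: bitsandbytes' 8-bit optimizer would work on the XLA device.
# Today its CUDA kernels do not, which is the gap this feature request targets.
try:
    import bitsandbytes as bnb
    optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-3)
except ImportError:
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # plain fallback

x = torch.randn(8, 16, device=device)
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
```

The memory win would come from the optimizer keeping its state (e.g. Adam's moment estimates) in 8-bit rather than 32-bit, which matters on TPUs just as it does on GPUs.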

Your contribution

I'm happy to provide TPUs.