sambanova / bloomchat

This repo contains the data preparation, tokenization, training and inference code for BLOOMChat. BLOOMChat is a 176 billion parameter multilingual chat model based on BLOOM.


How many resources do I need for fine-tuning?

Betai18n opened this issue · comments

commented

May I use LoRA to fine-tune this model? And how many NVIDIA A100 cards do I need to prepare?
I am looking forward to your reply, thanks!

At least 4x A100 cards just to load the weights, plus as many as possible to hold the data (depending on various options).
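As a rough sanity check on that number, you can estimate the GPU count needed just to hold the raw weights. This sketch assumes 80 GB A100s and half-precision (bf16/fp16) weights; neither assumption comes from the thread, and the real count varies with precision, per-GPU memory, and sharding overhead:

```python
import math

def gpus_to_load_weights(n_params, bytes_per_param=2, gpu_mem_gb=80):
    """Minimum number of GPUs whose combined memory can hold the raw weights.

    Ignores activations, optimizer state, and framework overhead, so it is
    a lower bound, not a training requirement.
    """
    weight_gb = n_params * bytes_per_param / 1e9
    return math.ceil(weight_gb / gpu_mem_gb)

n = 176e9  # BLOOMChat: 176 billion parameters

# bf16/fp16: 352 GB of weights -> 5 x 80 GB A100s
print(gpus_to_load_weights(n))
# int8 quantized: 176 GB of weights -> 3 x 80 GB A100s
print(gpus_to_load_weights(n, bytes_per_param=1))
```

Training needs substantially more than this lower bound (gradients, optimizer state, activations), which is why the practical answer is "as many as possible".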

You may want to join their Discord to talk to them directly.
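For reference, LoRA fine-tuning of a BLOOM-family model with Hugging Face PEFT typically looks like the sketch below. This is not this repo's training code; the model id, hyperparameters, and `device_map` setting are assumptions for illustration (`query_key_value` is the attention projection module name in BLOOM's architecture):

```python
# A minimal LoRA setup sketch using Hugging Face PEFT (assumed stack:
# transformers + peft installed; enough GPUs for `device_map="auto"` sharding).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "sambanovasystems/BLOOMChat-176B-v1",  # Hugging Face model id
    device_map="auto",                      # shard weights across available GPUs
)

lora_config = LoraConfig(
    r=8,                                  # rank of the LoRA update matrices
    lora_alpha=16,                        # scaling factor
    target_modules=["query_key_value"],   # BLOOM's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small LoRA adapters are trainable
```

LoRA freezes the base weights and trains only small adapter matrices, which removes the optimizer-state cost for the 176B base parameters, but the frozen weights still have to fit in GPU memory.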