Hardware requirement for inference
dibbla opened this issue
Hi Pandora Team,
Great, exciting work! Thank you for releasing it.
I have noticed that Pandora integrates a 7B LLM and a video model. I wonder what the minimum requirements for inference are? In addition, some more tutorials in the README on using Pandora would be a great help. 🚀
Thanks for your interest in our work! Pandora currently requires a GPU with at least 24 GB of memory and flash-attn support. One round of generation takes about 20s on an A100.
We are still polishing the Gradio demo in this repo and will add more tutorials later.
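For anyone wanting to check these requirements before running inference, here is a small pre-flight sketch. The 24 GB and flash-attn figures come from the reply above; the function names are illustrative and not part of the Pandora codebase, and with PyTorch installed you could feed in `torch.cuda.get_device_properties(0).total_memory`:

```python
# Hypothetical pre-flight check for Pandora inference (sketch only;
# the 24 GB VRAM and flash-attn requirements are from the maintainers'
# reply, the helper names here are not from the Pandora repo).
import importlib.util

MIN_VRAM_GIB = 24  # minimum GPU memory stated by the maintainers


def meets_vram_requirement(total_bytes: int, minimum_gib: int = MIN_VRAM_GIB) -> bool:
    """Return True if the GPU's total memory is at least `minimum_gib` GiB."""
    return total_bytes >= minimum_gib * 1024**3


def flash_attn_available() -> bool:
    """Return True if the flash-attn package is importable."""
    return importlib.util.find_spec("flash_attn") is not None


if __name__ == "__main__":
    # Example: an A100 40GB comfortably clears the threshold.
    print(meets_vram_requirement(40 * 1024**3))
    print(flash_attn_available())
```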
Thanks for the information! Looking forward to seeing the weights made available again 🤤