Mozilla-Ocho / llamafile

Distribute and run LLMs with a single file.

Home Page: https://llamafile.ai

Running llamafile on 2 GPUs instead of 1

TheAmpPlayer opened this issue · comments

commented

I am trying to run llamafile on Windows. It uses both of my GPUs and limits the VRAM to that of the weaker one. Is there a way to manually select which GPU it runs on? I get incompatibility errors when it tries to use both GPUs.
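
One common workaround (a sketch, not an official llamafile answer): if both cards are NVIDIA, the `CUDA_VISIBLE_DEVICES` environment variable hides all but the chosen device from the CUDA runtime, so llamafile only sees the GPU you pick. The device index `0` below is an assumption; check which index maps to your stronger card with `nvidia-smi`.

```shell
# Hide every NVIDIA GPU except device 0 from the CUDA runtime.
# On Windows PowerShell the equivalent is:  $env:CUDA_VISIBLE_DEVICES = "0"
export CUDA_VISIBLE_DEVICES=0

# llamafile also accepts llama.cpp-style GPU flags; the exact set depends
# on your llamafile version, so verify with --help. For example, forcing
# the NVIDIA backend and offloading layers to the GPU (model filename here
# is a placeholder):
#   ./model.llamafile --gpu nvidia -ngl 999

echo "CUDA_VISIBLE_DEVICES=$CUDA_VISIBLE_DEVICES"
```

With only one device visible, llamafile should no longer try to split the model across both cards, which also avoids the VRAM cap imposed by the weaker GPU.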