ollama / ollama

Get up and running with Llama 3, Mistral, Gemma, and other large language models.

Home Page: https://ollama.com

Add GPU number to ps command.

saul-jb opened this issue · comments

The `ollama ps` command is great, but it would be nice to have flags for additional information, such as which GPU(s) a model is running on and how much of each GPU's memory it is using.

I've already been thinking about this. It'll probably be in some kind of verbose output.

IMHO, this should be part of the default output, not hidden behind a verbose flag.