mistralai / mistral-inference

Official inference library for Mistral models

Home Page: https://mistral.ai/

Parameter for returning `logprobs`

StatsGary opened this issue

Which parameter do I need to pass to the model to return the generated tokens and their associated logprobs? I am comparing these against OpenAI's models.

Apologies if I have missed something obvious here, but I am using a vLLM deployment of `Mistral-7B-v0.1` in GCP's Model Garden.

I have solved this - see issue: vllm-project/vllm#2649.
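For anyone landing here with the same question: the resolution in the linked vLLM issue comes down to vLLM's `logprobs` sampling parameter. Below is a minimal sketch of how that looks with vLLM's offline Python API; the prompt, the model name, and the `logprobs=5` value are illustrative assumptions, not taken from this thread.

```python
# Minimal sketch, assuming vLLM's offline Python API.
from vllm import LLM, SamplingParams

# logprobs=N asks vLLM to return the logprob of each sampled token,
# along with the N most likely alternatives at every generation step.
params = SamplingParams(temperature=0.0, max_tokens=32, logprobs=5)

llm = LLM(model="mistralai/Mistral-7B-v0.1")  # model name is an assumption
outputs = llm.generate(["The capital of France is"], params)

completion = outputs[0].outputs[0]
print(completion.text)
# completion.logprobs holds one entry per generated token: a dict
# mapping candidate token ids to their log-probabilities.
for step in completion.logprobs:
    print(step)
```

If the GCP deployment exposes vLLM's OpenAI-compatible server instead, its completions endpoint should accept a `logprobs` field in the request body, mirroring OpenAI's completions API, so the same request shape can be used on both sides of the comparison.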