EricLBuehler / candle-vllm

Efficient platform for inference and serving local LLMs, including an OpenAI-compatible API server.

LongRope support for Phi 3

EricLBuehler opened this issue

Reference implementation: https://github.com/EricLBuehler/mistral.rs/blob/master/mistralrs-core/src/layers.rs#L69
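
For context, LongRoPE-style scaling (as used for Phi-3 long-context support) rescales the standard RoPE inverse frequencies with separate per-dimension `short_factor` / `long_factor` vectors, chosen by whether the sequence exceeds the original context window, and applies a global attention scale to cos/sin. Below is a minimal sketch of that frequency computation in plain Rust; the `RopeConfig` struct and function names are illustrative assumptions, not candle-vllm's actual API, so defer to the reference implementation linked above for the exact behavior.

```rust
// Illustrative LongRoPE-style frequency scaling (assumed Phi-3 convention).
// Not candle-vllm's API; names here are hypothetical.

struct RopeConfig {
    head_dim: usize,
    rope_theta: f64,
    max_position_embeddings: usize,          // extended context (e.g. 128k)
    original_max_position_embeddings: usize, // pre-extension context (e.g. 4k)
    short_factor: Vec<f64>,                  // per-dim rescale, len = head_dim / 2
    long_factor: Vec<f64>,                   // per-dim rescale, len = head_dim / 2
}

/// Returns (inv_freq, attention_scale) for a given sequence length.
fn longrope_inv_freq(cfg: &RopeConfig, seq_len: usize) -> (Vec<f64>, f64) {
    // Pick the rescale factors based on whether we are past the original context.
    let factors = if seq_len > cfg.original_max_position_embeddings {
        &cfg.long_factor
    } else {
        &cfg.short_factor
    };

    // Standard RoPE inverse frequencies, divided by the per-dimension factor.
    let inv_freq: Vec<f64> = (0..cfg.head_dim / 2)
        .map(|i| {
            let freq = 1.0 / cfg.rope_theta.powf(2.0 * i as f64 / cfg.head_dim as f64);
            freq / factors[i]
        })
        .collect();

    // Global scale applied to cos/sin when the context is extended
    // (assumed sqrt(1 + ln(scale) / ln(original_max)) form).
    let scale =
        cfg.max_position_embeddings as f64 / cfg.original_max_position_embeddings as f64;
    let attn_scale = if scale <= 1.0 {
        1.0
    } else {
        (1.0 + scale.ln() / (cfg.original_max_position_embeddings as f64).ln()).sqrt()
    };

    (inv_freq, attn_scale)
}
```

The resulting `inv_freq` and `attn_scale` would then feed the usual cos/sin cache construction; only the frequency rescaling and the extra scale differ from plain RoPE.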

Supported; please refer to the latest updates.

Thank you, that looks great. Implemented in #46.