mustafaaljadery / mlxserver

Start a server from the MLX library.

Home Page: https://mlxserver.com

Plans to make the HTTP server compatible with the OpenAI chat completions API?

Vaibhavs10 opened this issue · comments

Hey hey!

Love the project! Are there plans to make the API OpenAI-compatible? IMO this could unlock quite a lot of usage for the project, and for the MLX-LM package in general.

The majority of on-device libraries like llama.cpp, vLLM, etc. offer it, so it'd be cool to have parity with them.
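For illustration, here's a rough sketch of what such a shim could look like. This is not the actual mlxserver API: `generate_text` is a hypothetical placeholder for whatever generation call mlxserver / mlx-lm exposes, and Flask is just used to show the request/response shape that OpenAI clients expect.

```python
# Sketch of an OpenAI-style /v1/chat/completions endpoint.
# NOTE: generate_text() is a hypothetical placeholder, not the real
# mlxserver API; only the wire format shown here is OpenAI-compatible.
import time
import uuid

from flask import Flask, request, jsonify

app = Flask(__name__)


def generate_text(prompt: str, max_tokens: int) -> str:
    # Placeholder: call into the MLX model here.
    raise NotImplementedError


@app.route("/v1/chat/completions", methods=["POST"])
def chat_completions():
    body = request.get_json()
    messages = body.get("messages", [])
    # Naive prompt construction from the chat messages.
    prompt = "\n".join(f"{m['role']}: {m['content']}" for m in messages)
    completion = generate_text(prompt, body.get("max_tokens", 256))
    return jsonify({
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": body.get("model", "mlx-model"),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": completion},
            "finish_reason": "stop",
        }],
    })


if __name__ == "__main__":
    app.run(port=8080)
```

With something like this in place, existing OpenAI client libraries could point their base URL at the local server and work unchanged.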

Cheers!
VB