LM Studio support
ricardo-eth opened this issue
ricardo-eth commented
Congratulations on this great extension!
Do you plan to support LM Studio? When we enter "ip:1234/v1", we get the message "Ollama is running", but it does not load the models.
Thanks!
Muhammed Nazeem commented
Currently, only the Ollama API is supported. I already have plans to support other local providers soon :)
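For context, this likely fails because the two servers expose different REST shapes: LM Studio serves an OpenAI-compatible API (e.g. `/v1/models`), while Ollama uses its own endpoints (e.g. `/api/tags` for listing models). A minimal sketch of the distinction, assuming the extension builds Ollama-style URLs (the `model_list_url` helper below is hypothetical, for illustration only):

```python
def model_list_url(base: str, provider: str) -> str:
    """Return the model-listing endpoint for a given local provider.

    Hypothetical helper: shows why pointing an Ollama-only client at
    LM Studio's "ip:1234/v1" fails to list models.
    """
    base = base.rstrip("/")
    if provider == "ollama":
        # Ollama lists installed models at /api/tags
        return f"{base}/api/tags"
    if provider == "openai-compatible":
        # LM Studio's local server speaks the OpenAI-style API
        return f"{base}/v1/models"
    raise ValueError(f"unknown provider: {provider}")

print(model_list_url("http://localhost:11434", "ollama"))
print(model_list_url("http://localhost:1234", "openai-compatible"))
```

An Ollama-only client appended the wrong path, so LM Studio returns nothing it can parse even though the server itself is reachable.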