v0.0.1
Proxies a remote LLM API as a local model. Especially useful for running a custom LLM with the JetBrains AI Assistant.
Currently supports:
Proxy from: OpenAI, Dashscope (Alibaba Qwen), Gemini, DeepSeek, Mistral, SiliconFlow.
Proxy as: LM Studio, Ollama.
Only the streaming chat completion API is supported.
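Since the proxy presents itself as Ollama, clients can consume it with an Ollama-style streaming chat request, where each response line is a standalone JSON object (NDJSON). A minimal sketch of parsing such a stream, assuming the proxy listens on Ollama's default address (`http://localhost:11434`) and follows the standard `/api/chat` response shape; the model name `qwen-max` is just an illustrative placeholder:

```python
import json

# Assumed endpoint: Ollama's default, adjust to where the proxy actually listens.
CHAT_URL = "http://localhost:11434/api/chat"

def parse_stream_line(line: str) -> str:
    """Extract the assistant content chunk from one NDJSON line
    of an Ollama-style streaming chat response."""
    chunk = json.loads(line)
    return chunk.get("message", {}).get("content", "")

# Example NDJSON line in the shape an Ollama-compatible stream emits:
sample = ('{"model":"qwen-max",'
          '"message":{"role":"assistant","content":"Hello"},'
          '"done":false}')
print(parse_stream_line(sample))  # -> Hello
```

In a real client you would POST `{"model": ..., "messages": [...], "stream": true}` to the endpoint and feed each response line through `parse_stream_line`, stopping once a chunk reports `"done": true`.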