
v0.0.1


@Stream29 released this on 02 May 01:07 · 38 commits to master since this release · commit 5737f14

Proxies a remote LLM API as a local model. Especially useful for using a custom LLM in JetBrains AI Assistant.

Currently supports:

- Proxy from: OpenAI, DashScope (Alibaba Qwen), Gemini, DeepSeek, Mistral, SiliconFlow.
- Proxy as: LM Studio, Ollama.
- Streaming chat completion API only.
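
Conceptually, the proxy exposes a local endpoint in the shape an LM Studio / Ollama client expects and relays streaming chat completions to the remote provider. Below is a minimal, hypothetical sketch of that idea in Python; the FastAPI + httpx stack, the endpoint path, and the environment variable name are assumptions for illustration only, not this project's actual implementation.

```python
# Minimal sketch: accept an OpenAI/LM Studio-style streaming chat completion
# request on a local port and relay it to a remote provider, passing the SSE
# chunks back unchanged. Stack and names are illustrative assumptions.
import os

import httpx
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse

app = FastAPI()

UPSTREAM_URL = "https://api.openai.com/v1/chat/completions"  # remote provider (assumed)
API_KEY = os.environ["OPENAI_API_KEY"]  # assumed env var name


@app.post("/v1/chat/completions")  # local, LM Studio-compatible path (assumed)
async def proxy_chat(request: Request):
    body = await request.json()
    body["stream"] = True  # only streaming chat completion is supported

    async def relay():
        # Open a streaming request to the upstream provider and forward
        # each server-sent-event chunk to the local client as it arrives.
        async with httpx.AsyncClient(timeout=None) as client:
            async with client.stream(
                "POST",
                UPSTREAM_URL,
                headers={"Authorization": f"Bearer {API_KEY}"},
                json=body,
            ) as upstream:
                async for chunk in upstream.aiter_bytes():
                    yield chunk

    return StreamingResponse(relay(), media_type="text/event-stream")


# run with, for example: uvicorn proxy_sketch:app --port 1234
```

A client that speaks the LM Studio / Ollama API (for example, JetBrains AI Assistant configured with a local-model provider pointed at the proxy's port) would then see the remote model as if it were running locally.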