An OpenAI-compatible HTTP API that wraps the Cursor Agent CLI. It uses your Cursor subscription (Pro/Business), so you can point OpenAI-compatible clients at this proxy instead of paying per-token API prices.
Reference: claude-max-api-proxy (same idea for Claude Code CLI).
Detailed server setup, startup, and usage guide: docs/SERVER-SETUP.md
- Cursor Pro or Business subscription.
- Cursor Agent CLI installed and authenticated:
  - Install: `curl https://cursor.com/install -fsS | bash`
  - Auth: run `agent login` (browser session, recommended) or set `CURSOR_API_KEY` (Integrations > User API Keys).
- Node.js v24.14.0 (Active LTS recommended, npm 10.x).

Install:
```bash
cd cursor-cli-api-proxy
npm install
npm run build
```

Authentication (either one):
- Browser session (recommended, like Claude Code CLI): on the machine that runs the proxy, run `agent login` once. Credentials are stored locally; no API key needed.
- API key (for headless/CI): `export CURSOR_API_KEY=your_key_from_cursor_dashboard` (Cursor dashboard > Integrations > User API Keys).
```bash
# If using session: run once on this machine
agent login

npm start
# Or: node dist/server/standalone.js [port]
# Default port: 3457. Listens on 0.0.0.0 so LNVPS/remote access works.
```

| Endpoint | Method | Description |
|---|---|---|
| /health | GET | Health check |
| /v1/models | GET | List models |
| /v1/chat/completions | POST | Chat (streaming and non-streaming) |
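Since the chat endpoint speaks the standard OpenAI chat completions schema, any HTTP client can call it. A minimal TypeScript sketch of a non-streaming call; the request/response field names follow the OpenAI schema, while the helper names (`buildChatRequest`, `chat`) are illustrative, not part of this project:

```typescript
// Shape of an OpenAI-style chat message accepted by /v1/chat/completions.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the request body; field names follow the OpenAI chat completions schema.
function buildChatRequest(model: string, messages: ChatMessage[], stream = false) {
  return { model, messages, stream };
}

// Send a non-streaming chat request to the proxy and return the reply text.
// Assumes the proxy is reachable at baseUrl (default port 3457).
async function chat(baseUrl: string, model: string, prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, [{ role: "user", content: prompt }])),
  });
  const data: any = await res.json();
  // OpenAI-compatible responses put the text in choices[0].message.content.
  return data.choices[0].message.content;
}
```

Usage: `await chat("http://localhost:3457", "cursor-default", "Say hello in one word.")`.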
Use these in the `model` field of requests: `cursor-default`, `cursor-opus`, `cursor-sonnet`, `cursor-haiku`.
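When a request sets `"stream": true`, the endpoint replies with OpenAI-style server-sent events. A sketch of collecting the text deltas from such a stream; the `data:` / `[DONE]` framing and the `choices[0].delta.content` path follow the OpenAI streaming format, which the proxy mirrors, so treat this as an assumption rather than this project's documented wire format:

```typescript
// Collect the text deltas from an OpenAI-style SSE stream body.
// Each event line is `data: <json>`; the stream ends with `data: [DONE]`.
function collectStreamText(sseBody: string): string {
  let text = "";
  for (const line of sseBody.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;
    const chunk: any = JSON.parse(payload);
    // Streaming chunks carry incremental text in choices[0].delta.content.
    text += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return text;
}
```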
In Cursor Settings > Models, add a custom model:

- baseUrl: `http://<LNVPS_IP>:3457/v1`
- apiKey: any value (the proxy ignores it; auth is via `CURSOR_API_KEY` on the server)
- model: `cursor-default` (or `cursor-opus` / `cursor-sonnet` / `cursor-haiku`)
```bash
curl http://localhost:3457/health
curl http://localhost:3457/v1/models

curl -X POST http://localhost:3457/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"cursor-default","messages":[{"role":"user","content":"Say hello in one word."}]}'
```

MIT