https://api.deepseek.com), paste your
API key, and pick the request format. OpenAI-compatible is
right for almost every alternative provider; the local proxy will
translate Claude Code's Anthropic-format calls. Pick Anthropic
(direct) only when the upstream itself speaks
/v1/messages.
The provider's API host, with or without a trailing /v1; both work. Examples: https://api.deepseek.com,
https://integrate.api.nvidia.com,
https://openrouter.ai/api/v1,
http://localhost:11434.
Saving propagates to every preset in the same provider group (one API key = one provider account). If a sibling preset isn't picking it up, click Save key again with the same key; that re-runs propagation across siblings.
Pick OpenAI-compatible if the provider's docs show
/v1/chat/completions. Pick Anthropic only when the
upstream natively serves /v1/messages in Anthropic's
request shape; most third-party providers don't.
Models
GET /v1/models, or Ollama's /api/tags).
For NVIDIA NIMs that's everything they host right now: Llama,
Nemotron, DeepSeek V4 Pro/Flash, GLM 4 / 4.5, Phi, Mistral, etc.
The dropdown is also a free-text input.
No model list loaded yet? Click Refresh.
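The same list can be fetched by hand as a sanity check. A minimal sketch, assuming a DeepSeek key in the DEEPSEEK_API_KEY environment variable (swap in your provider's host and key):

```shell
# List the models your key can access (OpenAI-compatible endpoint)
curl -s https://api.deepseek.com/v1/models \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY"
```

If this returns an auth error, the key itself is the problem, not the proxy.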
Aliases
Map names users type (e.g. claude-sonnet-4) to upstream model IDs.
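A hypothetical alias map might look like this (the exact file format is the app's own; the names and IDs below are placeholders):

```json
{
  "aliases": {
    "claude-sonnet-4": "deepseek-chat",
    "claude-opus-4": "deepseek-reasoner"
  }
}
```

With this in place, a request for claude-sonnet-4 is rewritten to deepseek-chat before it leaves the proxy.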
top_k
is silently dropped for OpenAI-style upstreams (they don't accept
it).
- HTTP-Referer + X-Title (required by OpenRouter for proper attribution)
- accept: application/json (required by some NVIDIA NIM endpoints)
- custom X-… headers your enterprise gateway requires
cache_control only matters on providers
that support Anthropic's prompt-cache markers.
JSON object merged into every outbound request. The literal string {{effort}} in any value is replaced by the slider's current level when reasoning is ON.
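For example, a hypothetical extra-params object that forwards the slider level as an OpenRouter-style reasoning hint (the key names are illustrative; check your provider's docs for the real ones):

```json
{
  "reasoning": {
    "effort": "{{effort}}"
  }
}
```

With reasoning ON and the slider at high, every outbound request would carry "effort": "high".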
Roo Code (VSCode)
Add to VSCode settings.json.
Cline (VSCode)
Add to VSCode settings.json.
Continue.dev
Drop into ~/.continue/config.json under models.
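A sketch of such an entry, assuming the proxy listens on port 8787 (a placeholder; use the port the app shows) and that the title and model name are yours to choose:

```json
{
  "models": [
    {
      "title": "Claude via proxy",
      "provider": "openai",
      "model": "claude-sonnet-4",
      "apiBase": "http://localhost:8787/v1",
      "apiKey": "dummy"
    }
  ]
}
```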
Cursor
Cursor → Settings → Models → "Add custom OpenAI-compatible API".
Codex / OpenAI CLI
Shell env vars (works for the OpenAI Python SDK and most other OpenAI-format tools).
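A minimal sketch, assuming the proxy listens on port 8787 (a placeholder; use the port the app shows):

```shell
# Point OpenAI-format tools at the local proxy
export OPENAI_BASE_URL="http://localhost:8787/v1"
# Many local proxies inject the real upstream key themselves;
# if yours doesn't, put the real key here instead of a dummy.
export OPENAI_API_KEY="dummy"
```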
Aider
Shell env vars + a --model flag.
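For example, assuming the proxy is on port 8787 and an alias claude-sonnet-4 is configured (both placeholders):

```shell
# Aider against the proxy's OpenAI-compatible endpoint
export OPENAI_API_BASE="http://localhost:8787/v1"
export OPENAI_API_KEY="dummy"
aider --model openai/claude-sonnet-4
```

Aider's openai/ model prefix tells it to use the OpenAI-format client against whatever base URL is set.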
Generic shell env
For any tool that reads OPENAI_* or ANTHROPIC_* env.
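A sketch covering both formats, again assuming port 8787 as a placeholder; whether the proxy also exposes an Anthropic-format /v1/messages endpoint is an assumption to verify in the app:

```shell
# OpenAI-format tools
export OPENAI_BASE_URL="http://localhost:8787/v1"   # some tools read OPENAI_API_BASE instead
export OPENAI_API_KEY="dummy"

# Anthropic-format tools (only if the proxy serves /v1/messages)
export ANTHROPIC_BASE_URL="http://localhost:8787"
export ANTHROPIC_API_KEY="dummy"
```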
~/.config/claude-oneclick/proxy.log. If
Claude Code is misbehaving, click Refresh to see the most
recent upstream calls and any error responses.
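Outside the UI, the same file can be followed directly:

```shell
# Stream new proxy log entries as they arrive; Ctrl-C to stop
tail -f ~/.config/claude-oneclick/proxy.log
```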