# Providers
ccode supports a variety of model providers out of the box. Each provider gets its own profile in your config file. This page covers the setup for each supported provider type.
For the full list of available YAML fields, see the Config Reference.
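Under the hood, profile fields are just environment variables for the Claude Code process: each YAML key is the lowercase form of the variable it sets. A rough illustration of that mapping (a sketch of the idea, not ccode’s actual implementation; `profile_to_env` is a hypothetical helper):

```python
def profile_to_env(profile: dict) -> dict:
    """Sketch: translate a ccode profile into environment variables."""
    env = {}
    for key, value in profile.items():
        if key == "env":
            env.update(value)  # raw env: entries pass through unchanged
        elif isinstance(value, bool):
            env[key.upper()] = "1" if value else "0"  # YAML true -> "1"
        else:
            env[key.upper()] = str(value)
    return env

# The Bedrock profile from this page becomes:
print(profile_to_env({"claude_code_use_bedrock": True}))
# {'CLAUDE_CODE_USE_BEDROCK': '1'}
```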
## Anthropic direct
The simplest setup. ccode launches Claude Code without setting any environment variables.
### Subscription passthrough
If you have a Claude Pro, Max, Team, or Enterprise subscription, the `default: {}` profile launches Claude Code as-is:

```yaml
profiles:
  default: {}
```
### API key billing
For pay-per-token billing through Anthropic’s API, set `anthropic_api_key`:

```yaml
profiles:
  anthropic:
    anthropic_api_key: sk-ant-your-key-here
```
Get your API key at console.anthropic.com.
## First-party cloud providers
These run Claude models on AWS, GCP, or Azure infrastructure. Authentication uses your cloud provider’s credentials, not an Anthropic auth token.
### Amazon Bedrock

```yaml
profiles:
  bedrock:
    claude_code_use_bedrock: true
```
AWS credentials are handled by IAM (roles, SSO, environment variables, credential files). See Anthropic’s Bedrock docs.
Optional settings: `anthropic_bedrock_base_url`, `claude_code_skip_bedrock_auth`, `aws_bearer_token_bedrock`, `anthropic_bedrock_service_tier`.
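For example, routing through a private gateway might look like this (the URL is a placeholder, not a real endpoint, and the comment reflects the usual reason to skip auth, not a ccode requirement):

```yaml
profiles:
  bedrock-gateway:
    claude_code_use_bedrock: true
    anthropic_bedrock_base_url: https://your-bedrock-gateway.example.com  # placeholder
    claude_code_skip_bedrock_auth: true  # if the gateway injects AWS auth itself
```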
### Amazon Bedrock Mantle

```yaml
profiles:
  mantle:
    claude_code_use_mantle: true
```
A variant of Bedrock with a dedicated endpoint. See Anthropic’s Mantle docs.
### Google Vertex AI

```yaml
profiles:
  vertex:
    claude_code_use_vertex: true
    anthropic_vertex_project_id: your-gcp-project-id
```
Uses Google Cloud application default credentials (gcloud auth). See Anthropic’s Vertex AI docs.
### Microsoft Foundry (Azure)

```yaml
profiles:
  foundry:
    claude_code_use_foundry: true
    anthropic_foundry_base_url: https://your-resource.services.ai.azure.com/anthropic
    anthropic_foundry_api_key: YOUR_FOUNDRY_API_KEY_HERE
```
## Third-party providers (native Anthropic format)
These providers expose an Anthropic-compatible API endpoint. Each needs `anthropic_base_url` and `anthropic_auth_token`.

> **Tip:** Most third-party providers need `CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS` set to `"1"` in their `env:` block. This strips Anthropic-specific headers that non-Anthropic providers reject.
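To make the two fields concrete, here is a sketch of the request shape sent to such an endpoint, following the public Anthropic Messages API (built but not sent; the token and model values below are placeholders):

```python
import json
import urllib.request

def build_messages_request(base_url: str, auth_token: str, model: str, prompt: str):
    """Build (without sending) a Messages API request for an Anthropic-compatible endpoint."""
    body = json.dumps({
        "model": model,                                    # anthropic_model
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/v1/messages",         # anthropic_base_url + path
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {auth_token}",       # anthropic_auth_token
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )

req = build_messages_request(
    "https://api.deepseek.com/anthropic", "sk-placeholder", "deepseek-v4-flash", "hi"
)
print(req.full_url)  # https://api.deepseek.com/anthropic/v1/messages
```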
### DeepSeek
Sign up at platform.deepseek.com.

```yaml
profiles:
  deepseek:
    anthropic_base_url: https://api.deepseek.com/anthropic
    anthropic_auth_token: YOUR_DEEPSEEK_API_KEY_HERE
    anthropic_model: "deepseek-v4-pro[1m]"
    anthropic_default_opus_model: "deepseek-v4-pro[1m]"
    anthropic_default_sonnet_model: "deepseek-v4-pro[1m]"
    anthropic_default_haiku_model: deepseek-v4-flash
    claude_code_subagent_model: deepseek-v4-flash
    claude_code_effort_level: max
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```
### OpenRouter
Sign up at openrouter.ai. OpenRouter gives access to 400+ models behind a single endpoint.

```yaml
profiles:
  openrouter:
    anthropic_base_url: https://openrouter.ai/api
    anthropic_auth_token: YOUR_OPENROUTER_API_KEY_HERE
    anthropic_model: anthropic/claude-sonnet-4.6
    anthropic_default_opus_model: anthropic/claude-opus-4.7
    anthropic_default_sonnet_model: anthropic/claude-sonnet-4.6
    anthropic_default_haiku_model: anthropic/claude-haiku-4.5
    claude_code_subagent_model: anthropic/claude-haiku-4.5
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```
OpenRouter presets: You can create presets at openrouter.ai/workspaces/default/presets and use them as model names with the `@preset/slug` syntax.
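For instance, a profile pinned to a hypothetical preset slug might look like this (quote the value, since `@` cannot start a plain YAML scalar):

```yaml
profiles:
  openrouter-preset:
    anthropic_base_url: https://openrouter.ai/api
    anthropic_auth_token: YOUR_OPENROUTER_API_KEY_HERE
    anthropic_model: "@preset/my-coding-preset"  # hypothetical preset slug
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```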
### Z.AI (Zhipu GLM)
Sign up at z.ai (international) or open.bigmodel.cn (China).

```yaml
profiles:
  zai:
    anthropic_base_url: https://api.z.ai/api/anthropic
    anthropic_auth_token: YOUR_ZAI_API_KEY_HERE
    anthropic_model: glm-5.1
    anthropic_default_opus_model: glm-5.1
    anthropic_default_sonnet_model: glm-5.1
    anthropic_default_haiku_model: glm-4.5-air
    claude_code_subagent_model: glm-4.5-air
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```
### Moonshot Kimi
Sign up at kimi.ai.

```yaml
profiles:
  kimi:
    anthropic_base_url: https://api.moonshot.ai/anthropic
    anthropic_auth_token: YOUR_KIMI_API_KEY_HERE
    anthropic_model: kimi-k2.6
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```
### Alibaba Qwen (DashScope)
Sign up at alibabacloud.com.

```yaml
profiles:
  qwen:
    anthropic_base_url: https://dashscope-intl.aliyuncs.com/apps/anthropic
    anthropic_auth_token: YOUR_DASHSCOPE_API_KEY_HERE
    anthropic_model: qwen3.6-plus
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```
### MiniMax
Sign up at minimax.io.

```yaml
profiles:
  minimax:
    anthropic_base_url: https://api.minimax.io/anthropic
    anthropic_auth_token: YOUR_MINIMAX_API_KEY_HERE
    anthropic_model: MiniMax-M2.7
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
```
## Local and self-hosted models
Point `anthropic_base_url` at your local server. Several tools support the Anthropic Messages API format natively:
- llama.cpp - native `/v1/messages` support (github.com/ggml-org/llama.cpp)
- vLLM - native `/v1/messages` since v0.4 (github.com/vllm-project/vllm)
- Ollama - Anthropic-compatible endpoint (ollama.com)
- LM Studio - Anthropic-compatible endpoint (lmstudio.ai)
```yaml
profiles:
  local:
    anthropic_base_url: http://localhost:8080/v1
    anthropic_auth_token: not-needed
    anthropic_model: your-local-model
    env:
      CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"
      DISABLE_INTERLEAVED_THINKING: "1"
      CLAUDE_CODE_DISABLE_NONSTREAMING_FALLBACK: "1"
```
For tools that only speak OpenAI format, use a translation proxy:
- LiteLLM - github.com/BerriAI/litellm
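A minimal LiteLLM sketch (the model name, port, and backend URL are placeholders; check LiteLLM’s docs for current syntax):

```yaml
# litellm-config.yaml - expose an OpenAI-only backend behind LiteLLM's proxy
model_list:
  - model_name: local-model
    litellm_params:
      model: openai/your-local-model        # placeholder backend model
      api_base: http://localhost:8000/v1    # your OpenAI-format server
```

Start it with `litellm --config litellm-config.yaml`, then point `anthropic_base_url` at the proxy (port 4000 by default).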
## Common gotchas

| Issue | Fix |
|---|---|
| “Unexpected value(s) for the `anthropic-beta` header” | Set `CLAUDE_CODE_DISABLE_EXPERIMENTAL_BETAS: "1"` in `env:` |
| Provider doesn’t support interleaved thinking | Set `DISABLE_INTERLEAVED_THINKING: "1"` in `env:` |
| Streaming fallback fails | Set `CLAUDE_CODE_DISABLE_NONSTREAMING_FALLBACK: "1"` in `env:` |
| Edit patches don’t land cleanly | Non-Claude models may struggle with Claude Code’s diff format. Try smaller scopes. |
## Community resources
For an actively maintained reference of model IDs, pricing, and per-provider configs, see the Alorse/cc-compatible-models repo on GitHub.