OpenCode + XAI Router: Use Codex in opencode
Posted January 10, 2026 by The XAI Tech Team · 3 min read
OpenCode (opencode) is a developer-friendly coding assistant for the terminal and beyond. This guide shows you how to route opencode through XAI Router (xairouter) and use Codex models like gpt-5.2-codex reliably.
Prerequisites
- An XAI Router account: Sign up at m.xairouter.com and create an API Key.
- opencode installed locally.
- The model ID you want to use (e.g. gpt-5.2-codex).
Step 1: Create an API Key in XAI Router
- Log in to m.xairouter.com.
- Go to API Keys and create a new key (use a label like opencode).
- Copy the key. You'll use it as OPENAI_API_KEY.
Step 2: Set the environment variable
We recommend OPENAI_API_KEY to match opencode's OpenAI-compatible setup.
macOS / Linux:
export OPENAI_API_KEY="sk-xxx"
Windows PowerShell:
$env:OPENAI_API_KEY="sk-xxx"
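If you want the key to survive new shell sessions, and to confirm the router accepts it before touching opencode, something like the following works. This is a minimal sketch: it assumes a bash/zsh shell and that XAI Router exposes an OpenAI-compatible model listing at /v1/models (the exact path may differ on your deployment).
# Persist the key for future sessions (use ~/.bashrc if you use bash)
echo 'export OPENAI_API_KEY="sk-xxx"' >> ~/.zshrc

# Quick sanity check against the router (assumes an OpenAI-compatible /v1/models endpoint)
curl -s https://api.xairouter.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | head -n 20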
Step 3: Configure opencode (Codex models)
Create or overwrite ~/.config/opencode/opencode.json:
cat > ~/.config/opencode/opencode.json << 'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.2-codex",
  "small_model": "openai/gpt-5.2-codex",
  "provider": {
    "openai": {
      "name": "XAI Router",
      "env": ["OPENAI_API_KEY"],
      "whitelist": ["gpt-5.2", "gpt-5.2-codex"],
      "options": {
        "baseURL": "https://api.xairouter.com"
      },
      "models": {
        "gpt-5.2-codex": {
          "id": "gpt-5.2-codex",
          "name": "gpt-5.2-codex",
          "tool_call": true,
          "reasoning": true
        }
      }
    }
  },
  "share": "disabled"
}
EOF
Note: We use the openai provider with baseURL pointing to https://api.xairouter.com so opencode calls the Responses API, which Codex requires. small_model and whitelist prevent fallback to other small models (e.g. gpt-5-nano).
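To confirm the file was written as valid JSON, you can run it through jq (assuming jq is installed; python3 -m json.tool is an alternative):
# Validate the generated config (assumes jq is available)
jq . ~/.config/opencode/opencode.json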
Step 4: Enable Codex compatibility mode (required)
The Codex Responses API does not allow system messages and requires instructions plus store=false. opencode's Codex mode handles both automatically.
Two simple steps:
- Write a dummy OAuth entry to trigger Codex mode:
cat > ~/.local/share/opencode/auth.json << 'EOF'
{
  "openai": {
    "type": "oauth",
    "refresh": "dummy",
    "access": "dummy",
    "expires": 0
  }
}
EOF
chmod 600 ~/.local/share/opencode/auth.json
- Start opencode with default plugins disabled (prevents rewrites to the official ChatGPT endpoint):
OPENCODE_DISABLE_DEFAULT_PLUGINS=1 opencode
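To avoid typing the variable on every launch, a small shell alias does the job. This is just a convenience sketch for bash/zsh; the alias name opencode-xai is made up:
# Always launch opencode with default plugins disabled (bash/zsh)
alias opencode-xai='OPENCODE_DISABLE_DEFAULT_PLUGINS=1 opencode'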
Step 5: Validate
opencode debug config
opencode models openai
You should see:
model = openai/gpt-5.2-codex
baseURL = https://api.xairouter.com
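You can also test the route end to end with a direct Responses call. This is a rough sketch: it assumes the router forwards the standard /v1/responses path and that your key is in OPENAI_API_KEY. Note the instructions field and store=false, which mirror what opencode's Codex mode sends.
# Direct smoke test of the Codex model through the router (path assumed to be /v1/responses)
curl -s https://api.xairouter.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.2-codex",
    "instructions": "You are a coding assistant.",
    "input": "Reply with OK.",
    "store": false
  }'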
Common errors and fixes
- Instructions are required
- Store must be set to false
- System messages are not allowed
- Unsupported parameter: max_output_tokens
These usually mean Codex compatibility mode is not enabled. Re-check Step 4 and ensure you start with OPENCODE_DISABLE_DEFAULT_PLUGINS=1.
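A quick way to spot which piece is missing is to check the three moving parts from this guide. The paths below assume the default locations used above:
# 1. Dummy OAuth entry present?
test -f ~/.local/share/opencode/auth.json && echo "auth.json present" || echo "auth.json missing"
# 2. Config pointing at the router?
grep -q 'api.xairouter.com' ~/.config/opencode/opencode.json && echo "baseURL OK" || echo "baseURL missing"
# 3. Key exported in this shell?
[ -n "$OPENAI_API_KEY" ] && echo "OPENAI_API_KEY set" || echo "OPENAI_API_KEY missing"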
Simplified config for non-Codex models (optional)
If you only use Chat Completions models (e.g. gpt-4o / gpt-4.1), switch to the OpenAI-compatible provider:
{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai/gpt-4o-mini",
  "provider": {
    "xai": {
      "name": "XAI Router",
      "npm": "@ai-sdk/openai-compatible",
      "env": ["XAI_API_KEY"],
      "options": {
        "baseURL": "https://api.xairouter.com/v1"
      }
    }
  }
}
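Note that this provider reads its key from XAI_API_KEY rather than OPENAI_API_KEY, so export it under that name (the value is the same XAI Router key):
# macOS / Linux; same key as before, different variable name
export XAI_API_KEY="sk-xxx"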
With the setup above, opencode runs Codex through XAI Router with centralized key management, observability, and cost control.